This document discusses best practices for note-taking during usability testing. It emphasizes the importance of effective note-taking to improve data analysis. Note-takers should understand study goals and recognize what is most important to capture. Different note-taking approaches are explored, such as verbatim notes or focusing on key points. Defining metrics like task success ratings helps note-takers systematically record observations. With preparation and awareness of challenges, note-takers can effectively support analysis of usability findings.
UXPA DC Redux 2013 Notetaker Perspective 10-25-2013.ppt, by UserWorks
Kristen Davis and Dick Horst from UserWorks presentation slides on the "Notetaker's Perspective During Usability Testing: Recognizing What's Important, What's Not" from UXPA-DC Conference Redux 2013
Presented by: Brian Utesch, Annette Tassone, Jon Temple and Stephen Woodburn. Businesses strive to monetize the relationship between user sentiment and success outcomes including user adoption, user retention, and revenue. Customer satisfaction is embraced as a top predictor of success. There are of course many ways that satisfaction can be measured. We will review several methods of measuring user satisfaction, including simple Likert scale measures of overall satisfaction, the System Usability Scale (SUS), UMUX-Lite and the popular Net Promoter Score (NPS). Not all of these measures are created equal, and they do not all measure the same sentiment. We’ll further compare the advantages and disadvantages of each measure, best practices around the use of each, and original research we’ve conducted that informs our recommended best practices.
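Two of the measures named in this abstract, SUS and NPS, have simple published scoring rules. A minimal sketch of those rules, for orientation only (the sample data below is invented, not from the presentation):

```python
def sus_score(responses):
    """Score one SUS questionnaire: ten 1-5 Likert items.

    Per the standard scoring rule, odd-numbered items contribute
    (score - 1), even-numbered items (5 - score); the sum is
    multiplied by 2.5 to yield a 0-100 score.
    """
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

def nps(ratings):
    """Net Promoter Score from 0-10 likelihood-to-recommend ratings:
    percent promoters (9-10) minus percent detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Best-possible SUS answers (5 on odd items, 1 on even items) score 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
print(nps([10, 9, 8, 7, 6, 10]))
```

Note the point the abstract makes: a SUS of 85 and an NPS of 33 summarize different sentiments and are not interchangeable.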
The Secret Sauce for Effective Usability Testing, by Diane Bowen
The document discusses Diane Bowen's experience conducting her first usability tests as a new UX researcher. It describes how her initial method of just interviewing participants and writing observations was not very effective. She later learned better practices like asking questions to define test goals and participant criteria, writing non-leading questions, having product teams observe interviews and debrief immediately after, and synthesizing findings into a task analysis to identify trouble spots. The document advocates for testing with a small number of users, inviting product teams to interviews, debriefing sessions directly after, and sending synthesized findings to inform the team.
The document provides an overview of user experience (UX) research methods. It explains that research is done to answer questions, remove ambiguity, understand human behaviors and needs, and build empathy. Research methods include interviews, observations, surveys, usability testing and more. Both qualitative and quantitative methods are used depending on the questions being asked and stage of the project. Numbers from research don't tell the whole story and can sometimes be misleading.
UXPA DC UX 101 Workshop - Usability Testing, by UXPA DC
This document provides an overview of usability testing from Stephanie Pratt of UXPA DC. It defines usability and its importance, how to set up and conduct a usability test, and tips for moderating a test. Key points include:
- Usability is how easy a product is to use, defined by criteria like effectiveness, efficiency and ease of learning.
- Usability testing evaluates a product's usability by observing users complete tasks. It identifies problems and improves the product.
- To set up a test, identify tasks, recruit 5-10 participants, develop a script, and practice moderation skills. When moderating, keep questions neutral and let the participant think aloud.
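With only 5-10 participants, as the setup advice above recommends, raw task completion rates are noisy. One common way to keep that honest at analysis time is the adjusted-Wald confidence interval, often recommended for small usability samples. A minimal sketch under that assumption (the task data is invented):

```python
import math

def completion_interval(successes, n, z=1.96):
    """Adjusted-Wald confidence interval for a task completion rate.

    Adds z^2/2 pseudo-successes and z^2 pseudo-trials before computing
    the usual Wald interval, which behaves far better at small n.
    z = 1.96 gives an approximate 95% interval.
    """
    p_adj = (successes + z * z / 2) / (n + z * z)
    margin = z * math.sqrt(p_adj * (1 - p_adj) / (n + z * z))
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# Hypothetical result: 4 of 5 participants completed the task.
lo, hi = completion_interval(4, 5)
print(f"observed 80% completion; 95% CI roughly {lo:.0%}-{hi:.0%}")
```

The wide interval is the useful output here: it reminds stakeholders that an 80% completion rate from five participants is an estimate, not a precise measurement.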
Research Ready to Build: Compelling Artefacts that Speak Your Agile Team's La..., by Joshua Ledwell
This document summarizes two case studies of ensuring user research findings and early design guidance stay relevant for agile teams over time. Case study 1 involved creating a long-term customer data experience strategy to guide four agile teams. Case study 2 aimed to improve a complex software feature with dependencies on other parts. Key lessons included creating artifacts in the team's language, showing how design builds on research, hijacking agile ceremonies, sustaining buy-in from stakeholders, and committing to sustainability over burnout. The document concludes by discussing making artifacts easy to maintain and evolve the practice across projects.
Presented by Ari Weissman. How do you start from scratch? How do you build and grow a UX team within your organization where none existed?
Many organizations “do UX” in name only. There are people who might have the UX Designer title, but aren’t talking to users, leaving the product or engineering teams to drive the experience. It’s not that these organizations don’t want to be user-driven. It’s just that they don’t know how. That is what I walked into when I started as Director of UX for [my company].
This is the story of my ongoing successes and failures at building a UX practice. It’s not about one decision, but the many strategies you can employ to build, grow, and thrive.
M Hawley Desirability Studies Boston Upa Presentation V4, by hawleymichael
This document discusses desirability testing, which assesses people's emotional responses to designs. It outlines various methods considered, including triading, questionnaires, quick exposure tests, and physiological measurements. The selected method, Product Reaction Cards, uses 60 descriptive words to elicit feedback on designs for a hospital website redesign. Three designs were tested quantitatively with 50 people each and qualitatively through interviews. Designs scored well on conveying caring, warmth and trustworthiness. Lessons included the value of mixed methods and positioning results as input, not declaring winners.
This presentation was provided by Serena Rosenhan of ProQuest, during Session Four of the NISO event "Agile Product and Project Management for Information Products and Services," held on June 4, 2020.
Beyond Usability Testing: Assessing the Usefulness of Your Design, by Dan Berlin
This document discusses how usability testing can be adapted to assess the usefulness of a design when the goals differ from just finding usability problems. It proposes conducting usability tests with three components: 1) Pre-task questions that set the context of usefulness instead of just demographics, 2) Participant-directed tasks instead of predefined tasks, and 3) Post-task questions that compare expectations and value instead of just satisfaction. This adapted approach leverages the strengths of usability testing while allowing different objectives of understanding usefulness rather than just usability problems.
Preference and Desirability Testing: Measuring Emotional Response to Guide De..., by Paul Doncaster
(From UPA 2011-Atlanta) Usability practitioners have a variety of methods and techniques to inform interaction design and identify usability problems. However, these tools are not as effective at evaluating the visceral and emotional response generated by visual design and aesthetics. This presentation will discuss why studying visual design is important, review considerations for preference and desirability testing and present two alternative approaches to user studies of visual designs in the form of case studies.
Introduction to usability and usability testing as a discipline, followed by how to do guerilla usability testing. Presented at Duke Tech Expo April 13, 2018 with co-author Lauren Hirsh, with content from a prior collaborative presentation of hers.
The document provides an agenda and introduction for a design presentation. It includes sections on the presenter's background, education, work experience, hobbies and interests. The presentation agenda covers sharing a past project the presenter is proud of, walking through a design exercise solution for SessionM, and time for questions. Contact information is provided at the end.
Beyond Usability Testing: Assessing the Usefulness of Your Design, by hawleymichael
Usability tests are meant to find usability problems. If your question is, “where are the usability problems in this design?”, usability testing is right for you. With usability testing, you can study how well someone gets from point A to point B and where the problems are along the way. Finding usability problems is the focus, and the method works great.
But, we are finding that many of the questions business sponsors and stakeholders have are not about finding usability problems. The questions they have are more about the overall usefulness of a design, its potential for success, and how well it meets expectations.
This presentation will define usefulness research, show how it is different from usability tests, and offer different approaches for asking the right questions of users. Whether you think this is slap-your-forehead obvious or a method that needs to be expanded and refined, we seek to have a lively conversation.
Kirk Doggett and Kate Lawrence presented on using the usertesting.com tool for unmoderated user testing. They demonstrated a test of Vistaprint's business card design page. They discussed that usertesting.com allows recruiting specific users, testing websites and prototypes, and getting results quickly. However, tasks must be very clear as tests are unmoderated, and it may not be suitable for complex or time-consuming tasks. They highlighted when usertesting.com is appropriate to use, such as for targeted tasks, and its benefits like testing from anywhere.
This document provides an agenda and overview for a workshop on getting started with usability testing. The workshop will include introductions, presentations on measurements and testing, and activities for planning test logistics and interacting with participants. It will cover why usability testing is important, common myths, and tips for getting started. Participants will learn about planning and preparing for tests, recruiting profiles, scripts, test materials, facilitating sessions, and reporting results. The goal is to help attendees understand the basics of usability testing and conduct initial tests.
How to effectively implement different online research methods - UXPA 2015 - ..., by Steve Fadden
Are you the sole User Experience Researcher in your organization? Do you struggle to get timely research insights and feedback for your stakeholders? Online research tools offer practitioners the ability to gather feedback quickly and asynchronously, without the need for direct facilitation or moderation.
In this presentation, we provide an overview of some of the many online research tools that are available for gathering quick, asynchronous feedback on requirements, designs, and stakeholder sentiment. We offer general guidelines for recruiting, planning, implementing, and analyzing feedback, and then present how to use specific methods that have proven particularly useful for design and requirements research.
The document discusses various methods for testing the usability of websites, including scenario-based inspection, heuristic evaluation, and user observation. Scenario-based inspection involves evaluators examining a website to complete tasks and note any problems. Heuristic evaluation has evaluators check if a website follows design principles. User observation involves observing real users complete tasks and recording their experiences. Setting up these tests properly is important and involves choosing participants, creating task descriptions, and deciding how to record the sessions. The results can then be analyzed to identify usability issues and prioritize improvements.
This presentation was provided by Jonathan Clark of Jonathan Clark & Partners, during Session One of the NISO event "Agile Product and Project Management for Information Products and Services," held on May 14, 2020.
Moderated vs Unmoderated Research: It’s time to say ELMO (Enough, let’s move ..., by UserZoom
Does this sound familiar? Researchers sitting around a meeting table arguing about which methods to use, especially when it comes to unmoderated remote testing vs moderated? Usually without any empirical data?
In this webinar we'll give you the power of data to say "ELMO!" (Enough, let’s move on!) and end the argument once and for all.
We collected this data by conducting 10 moderated and 10 unmoderated remote sessions across six tasks on Patagonia.com, in order to show how moderated and unmoderated remote studies compare in terms of the number and severity of usability issues surfaced.
Register for this upcoming webinar and discover the theoretical and actual strengths and weaknesses of various user research methods to stop the argument before it even begins.
Product discovery involves an iterative process of understanding customer needs, setting priorities, and testing solutions. Agile product discovery emphasizes shared learning among the team through activities like stakeholder interviews, prototyping, and usability testing. It focuses on understanding the problem rather than jumping to solutions. Prioritization considers factors like value, cost, risk, and learning potential.
Tips for involving users in your website design - commercial property markete..., by estatesgazette.com, RBI
The document discusses the importance of user-centered design (UCD) and involving users in the website design process. It defines UCD as placing users and their needs at the center of the design process. The key benefits of UCD include a better user experience, increased traffic and loyalty, and reduced costs. Common methods for user research include online surveys, telephone interviews, and usability testing in labs. Personas can also be created to represent different types of users. The document emphasizes that UCD is an ongoing process and should continue even after a site is launched.
This presentation was provided by Eric Swenson of Swensonia Consulting, during Session Two of the NISO event "Agile Product and Project Management for Information Products and Services," held on May 21, 2020.
Usability testing (or user testing) involves measuring the ease with which users can complete common tasks on your website. The results of the analysis are a huge eye-opener and their implementation often leads to:
Increased sales and task completion and a high rate of return site visitors
A greatly improved understanding of your customers’ needs
A significant reduction in call centre enquiries
A much more user-focused in-house development team
Source: http://www.wbcsoftwarelab.com/wbcblog/read-basics-of-usability-testing
Remote Fieldwork: How observational studies elevated usability at AutoTrader.com, by Emily Schroeder
While traditional task-based usability research provides invaluable insights, sometimes expanding your practice to include additional methodologies allows usability to have greater influence in an organization. In this session, you will learn how adding remote observational studies enabled the team at AutoTrader.com to become more involved in projects from the beginning.
Great products address the real needs of real people. Many companies risk bringing products to life without hearing customers' needs because their design teams don't have a way to bring the customer "inside" where product development happens. UX designers use personas to represent real customers so the design process focuses on addressing real user needs.
When design teams take advantage of personas, they see faster development times and better quality products. The entire team is on the same page and the designs satisfy users’ goals.
In this presentation, you’ll learn methods for performing user research in the field, synthesizing the results and communicating user needs to your internal product team. Specifically, we’ll cover techniques for interviewing customers, defining problems in the form of clear, concise problem statements and drafting user personas.
The product under examination is the IKEA website www.ikea.com, accessed from a desktop computer. Participants complete tasks and answer questions about the ease of use, confidence, likelihood of future use, etc. They do so in their own environment, using their own devices. Participants follow a carefully crafted survey. Tasks and survey questions were created and assessed using Qualtrics to gather:
1. First impressions of the homepage.
2. User experience when browsing for a dining room table.
The corresponding survey can be found here: https://www.slideshare.net/secret/GN6dE3iDXM3NtQ
Tackle the Problem with Design Thinking - GDSC UAD, by gallangsadewa
The document discusses UX design processes and concepts. It covers empathizing with users to understand their needs, defining problems through research and personas, and ideating potential solutions through brainstorming and wireframing. Key aspects of UX design include ensuring solutions are usable, useful, and enjoyable for users. The document also discusses visual design foundations such as typography, color, and principles of contrast, repetition, alignment and proximity. UX designers work to create intuitive user interfaces that provide clear guidance and feedback to users.
Note-Taker's Perspective During Usability Testing (Kristen Davis & Dana Douglas), by uxpa-dc
The document discusses the importance of effective note-taking during usability testing. It notes that note-taking requires preparation and an understanding of the goals and metrics of the test. The document outlines exercises where participants watch video clips of usability tests and record notes based on predefined goals and scoring metrics. Lessons learned emphasize understanding the goals of the study, defining appropriate metrics, and planning for analysis before collecting data to improve note-taking.
The Note-Taker's Perspective (User Focus 2012), by Dana Douglas
The document discusses the importance of effective note-taking during usability testing. It notes that note-taking requires preparation and an understanding of the goals and metrics of the test. The document outlines exercises where participants watch video clips of usability tests and record notes based on predefined goals and scoring metrics. Lessons learned emphasize understanding the goals of the study, defining appropriate metrics, and planning for analysis before collecting data to improve note-taking.
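The note-taker abstracts above stress defining metrics, such as task success ratings, before the sessions so that notes can be analyzed systematically. A minimal sketch of that idea; the task names, the 0-2 success coding, and all data below are invented for illustration, not taken from the slides:

```python
from collections import defaultdict

# Each note row pairs a task with a predefined success code agreed
# before the sessions: 0 = failed, 1 = completed with difficulty,
# 2 = completed easily. Coding rows this way during the session makes
# the later analysis a simple tally instead of a re-reading exercise.
notes = [
    ("find store hours", 2), ("find store hours", 1),
    ("locate return policy", 0), ("locate return policy", 1),
]

by_task = defaultdict(list)
for task, score in notes:
    by_task[task].append(score)

for task, scores in by_task.items():
    print(f"{task}: mean success {sum(scores) / len(scores):.1f} (n={len(scores)})")
```

The structure matters more than the specific codes: because the metric was fixed before data collection, every note-taker records comparable observations and the analysis plan is already in place.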
This presentation was provided by Serena Rosenhan of ProQuest, during Session Four of the NISO event "Agile Product and Project Management for Information Products and Services," held on June 4, 2020.
Beyond Usability Testing: Assessing the Usefulness of Your DesignDan Berlin
This document discusses how usability testing can be adapted to assess the usefulness of a design when the goals differ from just finding usability problems. It proposes conducting usability tests with three components: 1) Pre-task questions that set the context of usefulness instead of just demographics, 2) Participant-directed tasks instead of predefined tasks, and 3) Post-task questions that compare expectations and value instead of just satisfaction. This adapted approach leverages the strengths of usability testing while allowing different objectives of understanding usefulness rather than just usability problems.
Preference and Desirability Testing: Measuring Emotional Response to Guide De...Paul Doncaster
(From UPA 2011-Atlanta) Usability practitioners have a variety of methods and techniques to inform interaction design and identify usability problems. However, these tools are not as effective at evaluating the visceral and emotional response generated by visual design and aesthetics. This presentation will discuss why studying visual design is important, review considerations for preference and desirability testing and present two alternative approaches to user studies of visual designs in the form of case studies.
Introduction to usability and usability testing as a discipline, followed by how to do guerilla usability testing. Presented at Duke Tech Expo April 13, 2018 with co-author Lauren Hirsh, with content from a prior collaborative presentation of hers.
The document provides an agenda and introduction for a design presentation. It includes sections on the presenter's background, education, work experience, hobbies and interests. The presentation agenda outlines sharing a past proud project, walking through a design exercise solution for SessionM, and time for questions. Contact information is provided at the end.
Beyond Usability Testing: Assessing the Usefulness of Your Designhawleymichael
Usability tests are meant to find usability problems. If your question is, “where are the usability problems in this design”, usability testing is right for you. With usability testing, can study how well someone can get from point A to point B and where are the problems along the way. Finding usability problems is the focus, and the method works great.
But, we are finding that many of the questions business sponsors and stakeholders have are not about finding usability problems. The questions they have are more about the overall usefulness of a design, its potential for success, and how well it meets expectations.
This presentation will define usefulness research, show how it is different from usability tests, and offer different approaches for asking the right questions of users. Whether you think this is slap-your-forehead obvious or a method that needs to be expanded and refined, we seek to have a lively conversation.
Kirk Doggett and Kate Lawrence presented on using the usertesting.com tool for unmoderated user testing. They demonstrated a test of Vistaprint's business card design page. They discussed that usertesting.com allows recruiting specific users, testing websites and prototypes, and getting results quickly. However, tasks must be very clear as tests are unmoderated, and it may not be suitable for complex or time-consuming tasks. They highlighted when usertesting.com is appropriate to use, such as for targeted tasks, and its benefits like testing from anywhere.
This document provides an agenda and overview for a workshop on getting started with usability testing. The workshop will include introductions, presentations on measurements and testing, and activities for planning test logistics and interacting with participants. It will cover why usability testing is important, common myths, and tips for getting started. Participants will learn about planning and preparing for tests, recruiting profiles, scripts, test materials, facilitating sessions, and reporting results. The goal is to help attendees understand the basics of usability testing and conduct initial tests.
How to effectively implement different online research methods - UXPA 2015 - ...Steve Fadden
Are you the sole User Experience Researcher in your organization? Do you struggle to get timely research insights and feedback for your stakeholders? Online research tools offer practitioners the ability to gather feedback quickly and asynchronously, without the need for direct facilitation or moderation.
In this presentation, we provide an overview of some of the many online research tools that are available for gathering quick, asynchronous feedback on requirements, designs, and stakeholder sentiment. We offer general guidelines for recruiting, planning, implementing, and analyzing feedback, and then present how to use specific methods that have proven particularly useful for design and requirements research.
The document discusses various methods for testing the usability of websites, including scenario-based inspection, heuristic evaluation, and user observation. Scenario-based inspection involves evaluators examining a website to complete tasks and note any problems. Heuristic evaluation has evaluators check if a website follows design principles. User observation involves observing real users complete tasks and recording their experiences. Setting up these tests properly is important and involves choosing participants, creating task descriptions, and deciding how to record the sessions. The results can then be analyzed to identify usability issues and prioritize improvements.
This presentation was provided by Jonathan Clark of Jonathan Clark & Partners, during Session One of the NISO event "Agile Product and Project Management for Information Products and Services," held on May 14, 2020.
Moderated vs Unmoderated Research: It’s time to say ELMO (Enough, let’s move ...UserZoom
Does this sound familiar? Researchers sitting around a meeting table arguing about which methods to use, especially when it comes to unmoderated remote testing vs moderated? Usually without any empirical data?
In this webinar we'll give you the power of data to say "ELMO!" (Enough, let’s move on!) and end the argument once and for all.
We collected this data by conducting 10 moderated and 10 unmoderated remote sessions across six tasks on Patagonia.com, in order to show how moderated and unmoderated remote studies compare in terms of the number and severity of usability issues surfaced.
Register for this upcoming webinar and discover the theoretical and actual strengths and weaknesses of various user research methods to stop the argument before it even begins.
Product discovery involves an iterative process of understanding customer needs, setting priorities, and testing solutions. Agile product discovery emphasizes shared learning among the team through activities like stakeholder interviews, prototyping, and usability testing. It focuses on understanding the problem rather than jumping to solutions. Prioritization considers factors like value, cost, risk, and learning potential.
Tips for involving users in your website design - commercial property markete...estatesgazette.com, RBI
The document discusses the importance of user-centered design (UCD) and involving users in the website design process. It defines UCD as placing users and their needs at the center of the design process. The key benefits of UCD include a better user experience, increased traffic and loyalty, and reduced costs. Common methods for user research include online surveys, telephone interviews, and usability testing in labs. Personas can also be created to represent different types of users. The document emphasizes that UCD is an ongoing process and should continue even after a site is launched.
This presentation was provided by Eric Swenson of Swensonia Consulting, during Session Two of the NISO event "Agile Product and Project Management for Information Products and Services," held on May 21, 2020.
Usability testing (or user testing) involves measuring the ease with which users can complete common tasks on your website. The results of the analysis are a huge eye-opener and their implementation often leads to:
Increased sales and task completion and a high rate of return site visitors
A greatly improved understanding of your customers’ needs
A significant reduction in call centre enquiries
A much more user-focused in-house development team
Source: http://www.wbcsoftwarelab.com/wbcblog/read-basics-of-usability-testing
Remote Fieldwork: How Observational Studies Elevated Usability at AutoTrader.com (Emily Schroeder)
While traditional task-based usability research provides invaluable insights, sometimes expanding your practice to include additional methodologies allows usability to have greater influence in an organization. In this session, you will learn how adding remote observational studies enabled the team at AutoTrader.com to become more involved in projects from the beginning.
Great products address the real needs of real people. Many companies risk bringing products to life without hearing customers' needs because their design teams don't have a way to bring the customer "inside" where product development happens. UX designers use personas to represent real customers so the design process focuses on addressing real user needs.
When design teams take advantage of personas, they see faster development times and better quality products. The entire team is on the same page and the designs satisfy users’ goals.
In this presentation, you’ll learn methods for performing user research in the field, synthesizing the results and communicating user needs to your internal product team. Specifically, we’ll cover techniques for interviewing customers, defining problems in the form of clear, concise problem statements and drafting user personas.
The product under examination is the IKEA website www.ikea.com, accessed from a desktop computer. Participants complete tasks and answer questions about the ease of use, confidence, likelihood of future use, etc. They do so in their own environment, using their own devices. Participants follow a carefully crafted survey. Tasks and survey questions were created and assessed using Qualtrics to gather:
1. First impressions of the homepage.
2. User experience when browsing for a dining room table.
The corresponding survey can be found here: https://www.slideshare.net/secret/GN6dE3iDXM3NtQ
Tackle the Problem with Design Thinking - GDSC UAD (gallangsadewa)
The document discusses UX design processes and concepts. It covers empathizing with users to understand their needs, defining problems through research and personas, and ideating potential solutions through brainstorming and wireframing. Key aspects of UX design include ensuring solutions are usable, useful, and enjoyable for users. The document also discusses visual design foundations such as typography, color, and principles of contrast, repetition, alignment and proximity. UX designers work to create intuitive user interfaces that provide clear guidance and feedback to users.
Note-Taker's Perspective During Usability Testing (Kristen Davis & Dana Douglas) (uxpa-dc)
The document discusses the importance of effective note-taking during usability testing. It notes that note-taking requires preparation and an understanding of the goals and metrics of the test. The document outlines exercises where participants watch video clips of usability tests and record notes based on predefined goals and scoring metrics. Lessons learned emphasize understanding the goals of the study, defining appropriate metrics, and planning for analysis before collecting data to improve note-taking.
The Note-Taker's Perspective (User Focus 2012) (Dana Douglas)
This document discusses the principles of user-centered design. It emphasizes the importance of understanding users, conducting research to learn about their needs and tasks, and involving users throughout the design process. Some key user research methods mentioned include wants and needs analysis, card sorting, group task analysis, and contextual interviews. The document stresses that good design starts with the user, and that consulting with and keeping users as the central focus leads to designs that best solve the problems users face.
This document discusses the principles of user-centered design. It emphasizes the importance of understanding users, conducting research to learn about their needs and tasks, and involving users throughout the design process. Some key user research methods mentioned include wants and needs analysis, card sorting, group task analysis, and contextual interviews. The document stresses that good design starts with the user, and that innovation comes from addressing the right problems for the target users.
Does the field of user-centered design mystify you? Does user research seem like the last thing you have time to think about?
Any team can look at analytics to understand what users are doing and how often they’re doing it. What analytics won’t tell you is *why* users are doing certain things — sometimes you need more context. That’s where user research comes in. This session will map out a framework for incorporating user research into your development cycle.
This document outlines 15 strategies for improving user experience across 5 areas: planning, discovery, design, content, and testing. It discusses keeping the focus on users, keeping things simple, and keeping things real by using actual user data and tasks rather than assumed needs. For planning, it emphasizes establishing strategic goals, priorities, and project briefs. Discovery involves understanding user goals, needs, and how the website currently meets or fails to meet those needs. The document provides examples and activities to help apply these strategies with limited resources.
This document provides guidance on effective note-taking practices for usability testing and user research. It discusses that note-taking is an important but often overlooked part of the research process. The type of notes taken and how they are structured determines what data is available for analysis and how quickly results can be reported. Different note-taking methods are presented, including narrative notes, structured notes organized by task or participant, and using tools like spreadsheets or customized forms. The key is to select a method that works for the goals of the research and allows important insights to be easily identified later. Planning the note-taking approach in advance helps ensure the right data is captured to answer the research questions.
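The structured note-taking method described above (notes organized by task or participant, kept in a spreadsheet or form) can be sketched in a few lines of code. This is only an illustrative sketch: the field names and severity codes below are assumptions for the example, not taken from the document.

```python
import csv
import io

# One row per observation, tagged with participant, task, and a severity
# code so individual notes can be filtered and grouped during analysis.
# These field names are illustrative, not prescribed by the source.
FIELDS = ["participant", "task", "severity", "note"]

def write_notes(rows):
    """Serialize structured observation notes as CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

notes = [
    {"participant": "P1", "task": "search", "severity": "high",
     "note": "Did not notice the filter panel"},
    {"participant": "P2", "task": "search", "severity": "low",
     "note": "Hesitated over the sort control"},
]

print(write_notes(notes))
```

Because each note carries its own tags, the spreadsheet can later be sorted by task or by severity, which is exactly the "easily identified later" property the passage emphasizes.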
This document discusses various usability methods that can be used at different stages of electronic medical record (EMR) development to improve usability. It describes contextual inquiry, personas, use case scenarios, requirements gathering, user stories, prototyping, card sorting, concept exploration, usability testing, incorporating user feedback, and maintaining design intent. Implementing usability methods early in the development process is most cost-effective, as it allows findings to be incorporated before significant code is written. While usability adds costs, it can provide measurable benefits like improved productivity, satisfaction, and safety.
How do you know you're ready for a Design Sprint? (Highland)
For leaders who want their teams to embrace human-centered approaches and collaborate in new ways, Sprints are a fantastic way to start.
Join Highland’s CX Practice Director David Whited and Lead Experience Designer Amrita Kulkarni as they share how Research Sprints and Design Sprints make Design Thinking—a reliable methodology to address complex, ambiguous problems—accessible in a way they have never been before. David and Amrita will introduce the purpose and philosophy of Sprints, talk through the differences between Research and Design Sprints, and what kind of issues, problems, or opportunities are the right fit for each.
We’ll be joined by Jennifer Severns, CXO, and Jennifer O’Brien, Innovation and Insights Manager, from the American Marketing Association, who will share how their organization has used Sprints to catalyze a culture of Design Thinking at the AMA. They will reflect on the realities of introducing Sprints and Design Thinking into an established organization, sharing advice for helping others think and work in new ways.
Attendees will learn:
- How are Research Sprints different from Design Sprints
- When is the right time to conduct a Sprint
- What it takes for Sprints to be successful
- How to amplify Sprint outcomes for change in your organization
The document discusses various topics related to usability testing, including:
1. An agenda for a usability technical workshop that covers topics like UX testing, usability vs UX, usability metrics, test design, recruitment, running tests, and data analysis.
2. Guidelines for test design that include defining metrics, success rates, tasks, and subject profiles.
3. Methods for measuring usability like success rates, time on task, error rates, and satisfaction.
4. Best practices for running usability tests like making participants comfortable, remaining neutral, taking detailed notes, and measuring both performance and subjective feedback.
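The measures listed in point 3 (success rate, time on task, error rate) reduce to simple arithmetic once sessions are logged in a structured form. The sketch below is hypothetical: the record fields and the data are invented for illustration, not drawn from the workshop.

```python
# Hypothetical session log: one record per participant-task observation.
sessions = [
    {"participant": "P1", "task": "checkout", "success": True,  "seconds": 74,  "errors": 1},
    {"participant": "P2", "task": "checkout", "success": False, "seconds": 182, "errors": 4},
    {"participant": "P3", "task": "checkout", "success": True,  "seconds": 95,  "errors": 0},
]

def summarize(records):
    """Compute the basic usability metrics named above:
    success rate, mean time on task, and mean error count."""
    n = len(records)
    return {
        "success_rate": sum(r["success"] for r in records) / n,
        "mean_seconds": sum(r["seconds"] for r in records) / n,
        "mean_errors":  sum(r["errors"] for r in records) / n,
    }

print(summarize(sessions))
# With the sample data: 2 of 3 participants succeeded, mean time 117 s.
```

Defining these metrics before the sessions, as the guidelines recommend, is what makes the note-taker's log computable at all.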
This presentation covers how to combine traditional qualitative methods and user research approaches to satisfy your clients and add value to findings.
Requirements Engineering for the Humanities (Shawn Day)
This workshop explores how requirements engineering can be employed by digital and non-digital humanities scholars (and others) to conceptualise and communicate a research project.
As the field of digital humanities has evolved, one of the biggest challenges has been marrying technical expertise with humanities scholarly practice to successfully deliver sustainable and sound digital projects. At its core this is a communications exercise. However, communicating effectively demands an ability to translate, define, and find clarity in your own mind.
The document outlines a design thinking process that includes four key phases:
1) Research and understand through empathy, research, and user data
2) Explore and converge by sketching designs, exploring options, and converging on a solution
3) Test and refine through rapid iterative testing, collecting user feedback, and making refinements
4) Analyze test results and user behavior through analytics to ensure improvements are effective
AccessU 2018 - Introduction to User Research (Kate Walser)
AccessU 2018: “It’s too expensive to do testing.” “We don’t have time.” “We can’t find any users.” If these sound like something your team might say, this session’s for you! Come learn how - with a little creativity and planning - you can find and learn from users in time for your next release.
Learn about the different user research methods (e.g., interviews, personas, testing, and more) that exist for your project
Learn how and when to apply them to your project
Learn how to find and engage users of all abilities
Learn how logistics change for these different situations
Everyone wants their site to look nice, but how much do they know about their users' characteristics? This presentation will guide you through the key success factors for designing a website, how to reach your target audience, how to create a win-win situation, and how to test your site and analyze the results.
On June 25, TryMyUI hosted a webinar with speaker Ritvij Gautam on collaborative UX analysis. This is the slide deck from that webinar.
Full recording of the webinar:
https://www.youtube.com/watch?v=9g05rGMnmYs
This document proposes evaluating the usability of VALET, a visual analytics software developed by Purdue University's VACCINE lab for law enforcement officials. The project aims to collect user data through surveys, interviews, and recording user actions during goal-directed tasks. This will provide insights into difficulties with the current interface to propose design improvements around information scent, cognitive task analysis, and GOMS modeling. Future work would compare a new interface design to the current one through additional user studies and expert reviews.
This document discusses usability testing and related methodologies. It provides information on what usability testing is, how it is conducted, and factors to consider when deciding which methodology to use. Specifically, it notes that usability testing involves systematically observing users under controlled conditions to determine how well they can use a product. When conducting a test, key steps include recruiting representative participants, creating tasks, observing users without guidance, and analyzing the results to identify issues. The goal is to identify problems and improve the user experience.
Similar to UXPA International 2013 The Note-Taker's Perspective (20)
User Research Delivers for the U.S. Postal Service (UXDC 2017) (UserWorks)
Mark Becker
Prepared for UXDC 2017
User Research Delivers for the U.S. Postal Service: The Impact of Customer Inputs on the Enhancement of USPS.com
UXDC 2017 Listing:
http://uxdcconference.org/sessions/user-research-delivers-u-s-postal-service-impact-customer-inputs-enhancement-usps-com/
Description:
In this session, attendees will learn about a program of user research we have conducted over the past three years for the U.S. Postal Service (USPS). This research has been part of a broad user-centered design approach to website enhancement implemented by USPS, with the overall goal of improving the usability of its website, USPS.com. (...)
Remote Mobile User Testing Workshop (UXDC 2017) (UserWorks)
Tristan Wilson, Dana Douglas
UXDC 2017 Workshop:
Remote Mobile User Testing: New Tools for Moderated Mobile Testing at a Distance
UXDC 2017 Listing:
http://uxdcconference.org/sessions/remote-mobile-user-testing-new-tools-moderated-mobile-testing-distance/
Description:
Remote user testing on personal computers is now a staple of UX research, but until recently, methods for remote testing on mobile platforms were still underdeveloped. In the past, technological limitations made mobile screen mirroring and streaming impractical. Thus, researchers were forced to use low-fidelity methods like the “laptop hug” to test mobile interactions remotely.
Fortunately, mobile hardware has improved in recent years and with this advancement, new screen mirroring and streaming tools have emerged. Over the last year, we looked into a range of these new tools. We compared their functionality and capabilities, and adapted traditional remote testing methods to work with this new technology. (.....)
User Research to Enhance the US Postal Service Website (UserWorks)
User research was conducted from 2012-2016 to enhance the US Postal Service website. Iterative user testing using methods like usability testing, tree testing, card sorting and first click testing helped optimize sections, navigation, task flows and the responsive design. Both moderated and unmoderated methods with hundreds of participants provided insights to modernize the site and improve the user experience on desktop, mobile and tablet. The evolution of the site from 2013 to 2016 showed design and functionality improvements based on the user research findings.
The document discusses various methods for measuring and testing website usability. It provides an overview of moderated usability testing, including conducting tests both in-person and remotely. Other evaluation methods mentioned include heuristic and accessibility evaluations, eye tracking, and unmoderated usability testing. Books and resources on usability testing are also listed.
Promoting Accessibility on Projects With No Accessibility Aspirations (UserWorks)
The document announces a panel discussion at UXPA 2015 on promoting accessibility on projects without accessibility requirements. The panelists are Elle Waters, Dana Douglas, Cory Lebson, and Jennifer Sutton, moderated by Dick Horst. They will discuss making projects more accessible even without official accessibility goals. The document also provides several references on related topics such as testing accessibility, motivating accessibility changes, and integrating accessibility into content planning.
UXPA International 2013 The Note-Taker's Perspective
1. www.userworks.com
(301) 431-0500
Kristen Davis & Dick Horst
UXPA International Conference 2013
July 10, 2013
The Note-Taker's Perspective During Usability Testing:
Recognizing What's Important, What's Not
2. The Note-Taker's Perspective During Usability Testing
UXPA International Conference 2013
Kristen Davis and Dick Horst - UserWorks, Inc.
Overview
Presentation Objectives:
To collectively explore ways to become a more efficient and effective note-taker by:
• Developing the skills and planning necessary
• Recognizing what’s important to note and what’s not
• Understanding the pros and cons of various note-taking styles
• Determining what metrics to define and utilize in note-taking
• Creating a list of note-taking best practices and tips
3. Why is Effective Note-Taking Important?
To improve the data analysis process by:
• Capturing the appropriate data the first time; less need for reviewing recordings
• Making analyses more efficient by categorizing observations on the fly
• Increasing accuracy and completeness of notes (so you don’t miss anything important)
• Identifying trends across participants
4. Typical Context
One-on-one, moderated sessions (with the participant in-person or remote)
Moderator interacting with the participant, note-taker in the background taking notes
Participant attempting typical task scenarios at the direction of the moderator
Participant thinking aloud
Many of the same principles and best practices would also apply to taking notes in focus groups, user interviews, or ethnographic settings, but not all
5.
6. Exercise
Think about and/or record your observations as you watch this video clip
7. Discussion
What did you note?
Did you capture what you thought were the key findings?
Did you miss anything?
8. Note-Taking Misconceptions
No preparation needed
Take the notes, then figure out how to analyze and summarize the data
Anyone can do it
Attention to detail and fast typing are the key skills
A one-size-fits-all approach will work for all practitioners and for all products under evaluation
9.
10. The Context
Product: NIHSeniorHealth.gov website
Participant: Seniors who look for health information online
Task: What types of food or drink are related to balance problems?
Study goals:
• What navigation path do participants use when looking for specific pieces of information?
• Do participants notice and use the pagination?
• Does the visual treatment used in the left-hand navigation clearly indicate location within a specific health topic?
Task goals:
Are participants able to locate information on the 2nd page?
Are the left-hand navigation labels clear?
11. Exercise
Think about and/or record your observations as you watch this video clip
12. Discussion
What did you note?
Did you capture the key findings?
Did you miss anything?
Did you note any of these:
• Study goals:
What navigation path do participants use when looking for specific pieces of information?
Do participants notice and use the pagination?
Does the visual design treatment used in the left-hand navigation clearly indicate location within the topic?
• Task goals:
Are participants able to locate information on the 2nd page?
Are the left-hand navigation labels clear?
13. Lessons Learned: Understand the Goals
Important to be familiar with the product and understand the overall goals of the study and the specific goals of the tasks
Goals should be set prior to data collection in order to:
• Record appropriate data
• Target the type of data that will be useful when summarizing and reporting the findings
14. Different Note-Taking Approaches
Would you try to note:
• Everything the participant says?
• Everything the participant does?
Pages visited
Links visited
Scrolling, hovering
Participant’s demeanor
Non-verbal signs
Or would you be more selective?
• Task scoring (success vs failure)
• Errors, wrong paths
• Pre-identified issues of interest
• Occasional verbatims
15. Different Note-Taking Styles
Moderator also takes notes
The court stenographer, verbatim
High level interpretation only, not behavior or comment
Coding behaviors in addition to free form notes
Working from a checklist
Multiple note-takers looking for different things
Multiple note-takers working redundantly
Taking notes from playback of a video/audio recording of the session
Using a tool like Morae, Ovo, Silverback, free form typing into a word processing or spreadsheet program, or paper-based (checklist)
16.
17. Practice Exercise
Product: HealthIT.gov website
Participant: Private physician
Task: Medical professionals and hospitals are encouraged to participate in the Government EHR reimbursement programs early, to receive the maximum payment. Find out when the Government will start or has started these reimbursement payments.
Study goals:
• Are the labels used for navigation clear?
• Does the information architecture make sense?
• What suggestions do participants have to improve the content?
Task goal:
• What navigation path do participants use when looking for specific pieces of information?
18. Exercise
Think about and/or record your observations as you watch this video clip
19. Discussion
What did you note?
Did you capture the key findings?
Did you miss anything?
20. Lessons Learned: Record Only Key Points
The method of note-taking impacts the type of data collected during the evaluation
Depending on the evaluation objective, one style or a combination of styles may be more appropriate
21. Challenges for the Note-Taker (and Moderator)
Participant says one thing and does another
Participant thinks they have been successful when they really haven’t
Participant doesn’t want to look bad, so bluffs
Participant initially struggles, then succeeds, and speaks highly of the product
Participant blames self and rates product highly despite disastrous task performance
Participant is overly chatty and goes off on tangents
22. More Challenges for the Note-Taker
Being selective in the information you note, without introducing your biases (seeing what you want/expect to see)
Noting participant behavior versus comments
Whether to try to capture participant’s clickpath, menu choices, data entry
Whether to capture timing
Whether to use a shorthand “code” for observations
Whether to capture participant’s non-verbal behavior
Whether to note good clips for a highlights video
23.
24. Practice Exercise
Product: Blood glucose meter
Participant: Person with diabetes who uses a meter
Task: Change default settings in the meter
Rating scale for scoring task completion:
• Completed with ease: Participant easily completed the task (Score: 3)
• Completed with minor difficulty: Participant somewhat struggled to complete the task (e.g., attempted more than two paths) (Score: 2)
• Completed with major difficulty: Participant significantly struggled to complete the task (e.g., attempted numerous paths and/or used customer service) (Score: 1)
• Failed to complete: Participant was unable to complete the task (Score: 0)
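A scale like this can also be encoded up front so every note-taker applies identical labels and scores while logging. A minimal sketch (the four categories come from the slide; the dictionary and function names are our own):

```python
# Task-completion rating scale from the slide, encoded once so that
# scores and labels stay consistent across note-takers and sessions.
COMPLETION_SCALE = {
    3: "Completed with ease",
    2: "Completed with minor difficulty",   # e.g., attempted more than two paths
    1: "Completed with major difficulty",   # e.g., numerous paths and/or customer service
    0: "Failed to complete",
}

def label_for(score: int) -> str:
    """Return the agreed-upon label for a recorded task score."""
    return COMPLETION_SCALE[score]

print(label_for(2))  # Completed with minor difficulty
```

Agreeing on this mapping before data collection is exactly what lets different note-takers assign the same score to the same observed behavior.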
25. Exercise
Think about and/or record your observations as you watch this video clip
26. Discussion
What did you note?
Did you capture the key findings?
Did you miss anything?
What score did you give the participant?
• Completed with ease: Participant easily completed the task (Score: 3)
• Completed with minor difficulty: Participant somewhat struggled to complete the task (e.g., attempted more than two paths) (Score: 2)
• Completed with major difficulty: Participant significantly struggled to complete the task (e.g., attempted numerous paths and/or used customer service) (Score: 1)
• Failed to complete: Participant was unable to complete the task (Score: 0)
27. Lessons Learned: Define Metrics
Using a task completion rating scale allows:
• Participant performance to be categorized
• Note-taker to record the scores in a systematic and consistent way
Carefully defining the performance categories before the fact helps the note-taker:
• Know what to look for
• Efficiently categorize on the fly
Some additional metrics to consider (facilitated by commercial data logging software):
• Clickpaths
• Time on task
• Shorthand codes for behaviors or incidents
Example: Number of times a participant clicked the pagination
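Shorthand codes and counts like these can also be kept in a small tally structure if commercial logging software isn't available; a hypothetical sketch (the code "PAG" for a pagination click and the class name are illustrative, not from the slides):

```python
from collections import Counter
from datetime import datetime

class ObservationLog:
    """Per-participant tally of shorthand behavior codes plus free-form notes."""

    def __init__(self, participant: str):
        self.participant = participant
        self.codes = Counter()   # e.g., "PAG" -> number of pagination clicks
        self.notes = []          # (timestamp, code, free-form note)

    def log(self, code: str, note: str = "") -> None:
        """Record one occurrence of a coded behavior, with an optional note."""
        self.codes[code] += 1
        self.notes.append((datetime.now().isoformat(timespec="seconds"), code, note))

log = ObservationLog("P1")
log.log("PAG", "clicked pagination to reach page 2")
log.log("PAG")
print(log.codes["PAG"])  # 2
```

Counting coded behaviors this way makes per-participant tallies (e.g., pagination clicks) trivial to roll up across sessions when identifying trends.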
28.
29. "Take-Aways" for Effective Note-Taking
Create a mindset that values the role of the note-taker
Choose someone with appropriate education and experience to appreciate the design issues of interest
Include the note-taker in test planning discussions and debriefs
Establish project-specific note-taking procedures and coding schemes; anticipate analysis needs when planning note-taking
Be sure the note-taker is thoroughly familiar with the product interface, task scenarios, and project goals
30. And a Few Tips for the Effective Note-Taker
Focus on what the participant does, not just what they say
Strive to interpret what you are seeing on the fly and take notes about that, not just “raw” observations
Review your notes and fill in any gaps as soon as possible after the session while observations are fresh in your mind
Consult with the team and client mid-course to be sure your notes reflect project goals and team intent
Check available storage space on recording devices before each session
Spot check recordings throughout the study to ensure that everything is working as intended
31. Questions/Comments?
Kristen Davis
UserWorks, Inc.
User Experience Specialist/
Note-Taker Extraordinaire
kdavis@userworks.com
Dick Horst
UserWorks, Inc.
President/
Principal User Experience
Specialist
dhorst@userworks.com
www.userworks.com
(301) 431-0500
Editor's Notes
Kristen: Welcome to The Note-Taker's Perspective During Usability Testing: Recognizing What's Important, What’s Not. User experience methodology often focuses on the role and technique of the moderator and may neglect the role of the note-taker, who should be a user experience professional familiar with the project goals and context of use. The note-taker must also be able to identify and efficiently obtain the data you need for actionable reporting. Today, we will be using video clips from actual usability test sessions to facilitate our discussion regarding an effective note-taker’s skillset and collectively we will create a list of note-taking best practices and tips. Copyright UserWorks, Inc. 2013
Kristen: Today we will explore ways to become more efficient and effective note-takers by… [read bulleted list] We will share some of our lessons learned, but we are also here to learn from your experiences, approaches and methods for note-taking. This presentation will use video clips to facilitate a discussion to explore note-taking best practices and tips. We will conduct a series of short exercises wherein we will show a series of video clips from usability tests. You will take your own notes while watching these short video clips. After each clip, we will lead a discussion of what was important and what wasn’t important to note about the test participant’s performance and/or comments during that clip. We will show four video clips. Before each clip there will be a brief setup commentary, providing the context of the clip and the test that it illustrates. After each clip, there will be a brief discussion to share ideas and thoughts about what you noted, given the contextual information that was provided. We hope to illustrate the skills and planning necessary for effective note-taking, and discuss the pros and cons of various styles. We hope to decide if it is possible and worthwhile to even strive for developing principles, which could then be codified and taught. We want to make this an interactive discussion because note-taking is a skill better learned by doing rather than being lectured to, similar to moderating. By a show of hands How many have conducted usability tests? How many have been the note-taker? Copyright UserWorks, Inc. 2008
Kristen: So why is it important to be a good note-taker? [After reading bulleted list] All of these help the report basically write itself.
Kristen:
Kristen: Let’s start with the first exercise.
Kristen: Let’s say you are being pulled into a study at the last minute. I’m going to show you a video clip, and I want you to think about what you would make note of; you can write these down if it helps you remember what you would note for our discussion. I’m not going to give you any background to the study shown in the clip or to the situation, other than that this is a clip from a usability test session for NIHSeniorHealth.gov, a health information site designed for seniors. The task she was asked to complete was, “Why do people with diabetes need to check their feet?” Please write down or think about what is important to note. [Video Clip: NIHSeniorHealth.gov phase 1B P4 13:57-16:59]
Kristen: Without having the appropriate context, it can be difficult to take meaningful notes. What did you note? Did you capture what you thought were the key findings? Did you feel that you missed anything? It’s probably safe to say that with a little more context you could improve your notes, and therefore, the report.
Dick
Kristen: So let’s try this activity again, this time with some background information on the study.
Kristen: This is the same study, but a different information retrieval task… [After presenting the bullets] Now, understanding these goals, can anyone think of anything you would have done differently when taking notes during the last clip?
Kristen: Point out the left menu on the page and mention that the pagination appears at the bottom of the page. [Video Clip: SeniorHealth.gov 1B P1 30:07-32:29]
Kristen: Now that you had some context, your note-taking should improve. As you might recall, the woman from the first video clip quickly went to the correct section of the site, “Self-monitoring,” but she never interacted with the pagination to locate the information located on the 2nd page. The man from the second video clip did not immediately click into the correct section of the site, “Causes and Preventions”. He did use the pagination, but was unable to locate information on the 2nd page of “Causes and Preventions.” He also clicked on the “Symptoms and Diagnosis” link when he was already on that page.
Kristen: [Incorporate what audience members came up with as well.]
Dick
Dick: The stenographer style records verbatim what the participants says in a stream-of-consciousness format. This style works best for studies focused on preference data and is structured more like a cognitive interview. When the evaluation focuses more on user action and behavior, this style may not collect the right type of data by missing what the participant is actually doing. This may include body language and verbal tone in addition to interactions with the product being evaluated. Many professionals say this style should not be used. Recording high level interpretation works best when the team is evaluating prototypes. This approach allows the note-taker to record their interpretation of what the user is doing and saying and why. It combines analysis into data collection. If the project timeline is short, this style saves time. However, such “on the fly” interpretations may prove to be incorrect or inaccurate causing the team to review the recordings to reassess the interpretation. This style works best when the team has established clearly defined objectives for each task. Coding behaviors in addition to free form notes allows the team to collect some of the stream-of-consciousness data along with verbal and nonverbal data. Verbal data includes what the participant is saying, whereas nonverbal data includes their interactions with the product being evaluated, and perhaps facial expressions and body language. Copyright UserWorks, Inc. 2008
Kristen: Now, let’s move on to the third exercise.
Kristen: The purpose of the site is to provide information to the general public and practitioners about the government’s initiative to increase the use of electronic health records, or EHRs.
Kristen: [Video clip: HealthIT.gov]
Kristen: What did you note? [Ask one person to say what they noted.] By a show of hands, who else noted that?
Kristen: Which do you think we should have used for this exercise? Why? [A: High-level interpretation.] [Incorporate what audience members came up with as well.]
Dick
Dick
Kristen: Now, let’s move on to the fourth exercise. This time, we’ll be focusing on metrics. It’s important for all members of the research team to come to a consensus and define all metrics prior to data collection.
Kristen
Kristen: [Glucose meter study: P1 29:50-32:18]
Kristen: By a show of hands, how many people rated it a … We scored this task with a 2 (minor difficulty) because the participant tried two different menu items before reaching the correct item, and as defined in the rating scale, that would be a 2. By a show of hands, how many also gave this participant a 2… But, as you can see, that task would have received various ratings had this scale not been established before data collection began.
Kristen: [Incorporate what audience members came up with as well.] Identifying what constitutes an assist during the planning phase also allows the team to distinguish “completing a task with assistance or help” from “completing the task with difficulty.” An “assist” implies the participant would not have completed the task on their own without intervention from the moderator, whereas “completed with difficulty” implies the participant completed the task on his/her own but struggled to do so, needing several attempts, or did not fully understand how the system was working.
Dick
Dick: Do you have any others?
Dick: Do you have any others?