3. What is a Heuristic Evaluation?
Heuristic Evaluation
Level Setting
A heuristic evaluation is a method used in user interface (UI) and user experience (UX) design to identify potential usability problems in a digital product or system. It is an examination of the interface and a judgement of its compliance against universally recognized usability principles, commonly known as heuristics.
Evaluators typically inspect the product’s interface and compare it against a set of predefined heuristics and guidelines. These heuristics are often based on principles established by usability experts such as Jakob Nielsen or Bruce Tognazzini.
During the evaluation process, evaluators identify areas where the interface deviates from the established heuristics, indicating potential usability issues. These issues can include problems with navigation, layout, terminology, feedback, error prevention, and other aspects that impact the overall user experience.
These evaluations are quick and inexpensive compared to other usability testing methods, making them a valuable tool for identifying usability problems both early in the design process and after product release.
Heuristic evaluations are not a substitute for user testing; they cannot fully replicate the experience of real users interacting with the product. They are also not a one-time activity: they are most effective when conducted iteratively throughout the design process and after release. Because they are subjective assessments, they should be combined with other methods and considerations in the design process.
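To make the checklist mechanics concrete, the sketch below (illustrative only; the record fields, example findings, and prioritization step are assumptions, not taken from this evaluation) shows one way an evaluator’s notes and 0-4 severity ratings could be captured and sorted so the most severe problems are addressed first:

from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str      # e.g. "Consistency and Standards"
    description: str    # what the evaluator observed
    severity: int       # 0 (not a problem) to 4 (most severe), matching the 0-4 scale on the checklist

# Hypothetical example findings; not actual results from this evaluation.
findings = [
    Finding("Consistency and Standards", "Tab hover underline does not match design standards", 1),
    Finding("Visibility of System Status", "No confirmation shown when a fine amount is updated", 2),
]

# Sort so the most severe problems surface first in the report.
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.description}")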
4. Vendor Compliance Application | Heuristic Review Checklist
Heuristic Evaluation
Complete Checklist
Severity Ratings: 0 - 4
Evaluator Details
Evaluator Name: Jaime Brown
Device / Browser / OS: MacBook Pro, Chrome, MacOS Monterey
Site URL: https://lxappvgatdev015.lowes.com:8443/vendorcompliance/
Date: 04.02.2024
[Checklist table: each heuristic is marked Yes / No / N/A, with Comments and a Severity Rating. Heuristics on this slide and their severity ratings: Visibility of System Status (2), Match Between System and the Real World (1), User Control and Freedom (1), Consistency and Standards (1), Help Users Recognize, Diagnose, and Recover From Errors (1).]
5. Vendor Compliance Application | Heuristic Review Checklist
Heuristic Evaluation
Complete Checklist
Date: 04.02.2024
[Checklist table continued, columns: Checklist, Yes, No, N/A, Comments, Severity Rating. Heuristics on this slide: Error Prevention, Recognition Rather Than Recall, Flexibility and Minimalist Design, Aesthetic and Minimalist Design, Help and Documentation.]
7. Vendor Feedback
Heuristic Evaluation
Vendor Feedback
UX scheduled 30-minute interview sessions with various vendor-users of Vendor Compliance. In addition to gaining insights about the process of managing fines,
rebuttals and exceptions, I also asked vendors to assign a number value for certain categories. Those ratings are on the subsequent slide, and below are the Top 3
vendor comments related to Vendor Compliance.
“Whenever we have come to an agreement on fill rate fines, it happens too late and past the deadline. I have to follow up constantly to find out the status of my inquiry.” - Jeff O’Bannon, Amazon
“I normally have to send emails as responses do not always make sense; then we need to work with our US counterparts at Lowe’s for some specific details and correspondence.” - Shaunta McCracken, Samsung
“If Lowe's requests further information for a dispute, you can't reply to more than one at a time, so you have to go into each one and respond to it.” - Sarah Florczyk, Energizer
8. Vendor Ratings
Heuristic Evaluation
Vendor Ratings
Along with interviews, we asked vendors to provide ratings on several key features, including Vendor Compliance Application-specific ratings and the baseline rating for content across Lowe’s Vendor Gateway. Each rating was from 1 to 7, with 1 being the most difficult or worst, and 7 being the easiest or best. The scores for each category were then averaged:
Baseline Content and Ease of Task Completion: 5.8 out of 7
Vendor Compliance - Ease of Completing Tasks: 4.8 out of 7
Vendor Compliance - System Messaging: 4.7 out of 7
Level of Customer Support: 3.9 out of 7
Total Vendor Compliance Score - 68.5%
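The deck does not state how the total score is derived; one plausible reading (an assumption, not confirmed by the deck) is that it is the mean of the four 7-point ratings expressed as a percentage, which lands within rounding of the reported 68.5%:

ratings = [5.8, 4.8, 4.7, 3.9]             # the four vendor ratings above, each out of 7
mean_rating = sum(ratings) / len(ratings)  # 4.8
total_score = mean_rating / 7 * 100        # ~68.6%, in line with the reported 68.5%
print(f"Total Vendor Compliance Score: {total_score:.1f}%")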
11. UX Recommendations Continued
Heuristic Evaluation
UX Recommendations
Insights
Data Analytics
People Process Technology
Internal Associate Support
User Interface - Continued
13. Lowe’s Internal Users’ Feedback
Heuristic Evaluation
Stakeholder Feedback
In connecting with the Vendor Compliance application team, UX was able to gather insights into the internal process. This was not a major part of the heuristic effort, so
further conversations may be required between the internal compliance users and UX. The conversation focused on the most pressing pain points, including needing the
ability to add more detail and comments in the Vendor Compliance System and the ability to edit rebuttal information at the PO level, instead of the SFVBU level.
“I have the capability to do it [change the fine amount], but not at the PO level as in the rebuttal level. We could do it at the ECF preview level, so one SFP
view might...require that I need to make changes for one rebuttal...but I have to do it at the SFVBU level, but not at the PO level.” - Tulsi Dewangan
“If there could be a capability in the tool to say we have closed the rebuttals, but based on discussions or based on additional information that we got for
this particular rebuttal, we are going to waive off the fine for the vendor and it should ideally maybe have a comments column with ‘Approval Details’...” -
Neha Chaudhary
“There’s so much human manipulation going on with these files to get them in a point where we can produce the number and that just leaves room for
errors, right?” - Amy Monroe
14. Lowe’s Internal Users’ Feedback - Continued
Heuristic Evaluation
Stakeholder Feedback
In connecting with the Vendor Compliance application team, UX was able to gather insights into the internal process. This was not a major part of the heuristic effort, so
further conversations may be required between the internal compliance users and UX. The conversation focused on the most pressing pain points, including needing the
ability to add more detail and comments in the Vendor Compliance System and the ability to edit rebuttal information at the PO level, instead of the SFVBU level.
“...lot of changes, like every quarter there is a process change...before we were approving merchants, we were agreeing for merchants but not now, now
we are not agreeing for merchants approval and we come across scenarios where when they're [vendors] asking ‘wait, why is there a lot of changes and
before you used to approve for the BOL for collect vendors no we are not allowed to’”. - S. Danush
“I don't have any issues with the training, but as we have seen in the past one year, like so many process changes as we being the tenure in a team, it
becomes, for us, very difficult because one month we are having this, these are the instructions, now we have to work and then the other month it
changes.” - Ramanpreet Matharu
16. Wins
Heuristic Evaluation
Short Term/Long Term Wins
[Prioritization chart with quadrants: Low Hanging - Do First; Fast Fixes; Highly Important; Nice to Have; Suggestions]
Heuristic Evaluation - Identified Areas for Improvement:
Visibility of System Status
User Control and Freedom
Match Between System and Real World
Consistency and Standards
Error Prevention
Flexibility and Minimalist Design
Help and Documentation
18. Heuristic Evaluation
Complete Checklist
1 - Consistency and Standards: Tab hover underline should be blue to match BDS and Lowe’s Vendor Gateway design standards.
2 - Consistency and Standards: Calls to action should be limited when used in tight spaces to prevent wrapping and displacement of the icon (X).
19.
4 - Consistency and Standards: Best practice is to use standardized date presentations as <MM/DD/YYYY>.
3 - Consistency and Standards: Button styles (sizing, version).
20.
6 - Consistency and Standards: Font weight should match Lowe’s Vendor Gateway design standards. Inconsistent font design.
7 - Consistency and Standards: Capitalized text should be used sparingly. We’re not yelling at the users; consider making statuses and SFVBU names case-sensitive.
5 - Visibility of System Status: Removing the required Company Banner and VBU switch has created a lack of consistency across the entire Lowe’s Vendor Gateway ecosystem. All applications are required to have those components, so it’s jarring when users switch and see nothing but the primary nav and the application pages.
22.
9 - Consistency and Standards: Modal components should be consistent across the application. Match Lowe’s Vendor Gateway and BDS design standards.
23.
24.
11 - Consistency and Standards: Dropdowns should adhere to consistent functionality and standards following Lowe’s Vendor Gateway and BDS V3.
12 - Consistency and Standards: Comments font should be consistent with all pages and match Felix. Component is missing character count limits.
10 - Consistency and Standards: Too much green makes it difficult to quickly scan and view. Limit success green to smaller cells or icons to indicate success.
13 - Consistency and Standards: Button should match V3 design standards and be consistent in placement and design.
26.
18 - Visibility of System Status: When all available table column headers are selected, the user is forced to scroll horizontally in the table. That’s not intuitive or easy to see.
15 - Help and Documentation: Column icons and headers are smaller, while the table font is large. Follow standards for Lowe’s Vendor Gateway.
16 - Consistency and Standards: Button styles (sizing, version).
17 - Consistency and Standards: For table columns where users may need to do math, the type should be right aligned.
27.
19 - Error Prevention: No color confirmation that Status or Total Adjusted Fine is updated, just a checkmark and X icon.
28.
20 - Consistency and Standards: Button styles (sizing, color, version) and table component styling should be consistent across the application. Follow Lowe’s Vendor Gateway standards.
32.
25 - Consistency and Standards: Buttons should be disabled when the form is not complete. Upload Documents should be left centered to match Lowe’s Vendor Gateway standards.
24 - Consistency and Standards: Button hierarchy should follow standards. One primary, secondary, and tertiary status for these options.
33.
26 - Consistency and Standards: Tabs should follow the Backyard Design System and Lowe’s Vendor Gateway to match V3 standards.
28 - Consistency and Standards: Button size and color need to be consistent across all pages and follow Lowe’s Vendor Gateway design standards.
27 - Consistency and Standards: Toggle needs to be blue to match Lowe’s Vendor Gateway standards.
34.
29 - Visibility of System Status: Need consistency with page names across the entire application. Confusing that some pages have page headers and some don’t. Page header names should also follow Lowe’s Vendor Gateway’s design standards. This header is too small.
30 - Consistency and Standards: Button size and color need to be consistent across all pages and follow Lowe’s Vendor Gateway design standards.
31 - Flexibility and Minimalist Design: Too much white space can disrupt the visual hierarchy. Consider using slimmer columns when content requires less space.
35.
32 - Consistency and Standards: Red font is for errors, not instructional messaging. Update to match Lowe’s Vendor Gateway design standards.
36.
33 - Consistency and Standards: Layout spacing is inconsistent. Font and styling are inconsistent with the rest of the application. Follow BDS and Lowe’s Vendor Gateway guidelines.
37.
34 - Consistency and Standards: Blue font signifies interactive states, so this font color could change to black or any other color that matches BDS and Lowe’s Vendor Gateway design standards.
35 - Consistency and Standards: Best practice is to use standardized date presentations as <MM/DD/YYYY>.
38.
36 - Consistency and Standards: If there is only one selection in a dropdown, it should default to that selection and not allow expansion.
40. UX Recommendations
Heuristic Evaluation
UX Recommendations
Align the user interface (UI) with both Lowe’s Vendor Gateway’s Design Style Guide and the Backyard Design System (BDS), version 3. Ensuring that icons and components are uniform in style throughout an interface can greatly enhance the overall user experience and make the product feel more polished and professional.
Correct the 37 identified heuristic concerns.
Along with enhancing the UI, add additional features such as search for the data table, or allow vendors to mass download PO Line Items instead of downloading line by line.
Enhance the communication process with vendors so that they are informed of any fine, rebuttal, or exemption communications. This improvement aims to ensure that vendors can stay up to date, which they highlighted as a major source of frustration in the current experience. Whether through email or the current Lowe’s Vendor Gateway global notification system, vendors want to be notified of updates.
We received consistent feedback regarding confusion around the fine and rebuttal processes. Provide vendors with clear guidance on what documentation is required when rebutting a fine. Provide clarity around the process from the Lowe’s side, so that vendors understand why they’re being rejected and what precisely is required of them from the start.
41. UX Recommendations Cont’d
Heuristic Evaluation
UX Recommendations
Vendors feel the current support process is inadequate, both in ticketing responses and in application-support documentation. Provide vendors with more intuitive responses from Lowe’s support, with stronger guidance on where to locate resources in the Knowledge Center and how to respond to Vendor Compliance in rebuttals.
Provide accurate and up-to-date documentation, as vendors feel the current resources are outdated.
Ensure that vendors receive the assistance and information they need to easily and quickly rebut fines, as well as respond to rebuttal requests or rejections.
Enhance the Vendor Compliance System so vendors are able to mass select line items in the system. Vendors can only select one page of Purchase Orders (POs) at a time, and paging through multiple pages of POs is heavy and cumbersome. Multiple vendors expressed this frustration.
Finally, there is the question of what success looks like for this effort. Initially, UX was told success centers around rebuttals, specifically how to lower the number of vendor-submitted rebuttals.
Ultimately, there is no single method for lowering rebuttal numbers. The rebuttal process itself is murky to most vendors, and they need clarity on what documentation or resources are required when rebutting fines.
42. UX Recommendations Cont’d
Heuristic Evaluation
UX Recommendations
Otherwise, the best updates to make are:
List the steps required for vendors to complete the rebuttal process. Provide updates so that they’re informed every step of the way, and set clear expectations on when vendors can expect a response from the Vendor Compliance team during rebuttals.
Engaging with Lowe’s Support (Remedy) so that they are better trained to respond to vendor questions would be enormously helpful for vendors.
Ensure the documentation in the Knowledge Center is up to date.
Enhance the rebuttal reason options. Add more dropdown options that account for scenarios where the vendor was “on time” and can provide verification.
This UX effort allowed for the integration of the Vendor Compliance System into FullStory. Now that a dashboard of metrics is established, it’s recommended that the engineering team and UX continue to monitor those data analytics for trends and future enhancement opportunities.
43. Insights
Heuristic Evaluation
Lowe’s Internal Users
Process Changes - Recent adjustments to Vendor Compliance System (VCS) processes are creating confusion among some of our associates, which is making it difficult to efficiently process rebuttals and effectively onboard new staff.
We can improve onboarding effectiveness and reduce associate confusion by establishing a period of process stability alongside clear communication channels for upcoming changes.
Adding a tab for internal associates to record documentation related to VCS process updates and employee reference documentation would be helpful.
Rebuttal Statuses - Implementing an ‘in-progress’ status for rebuttals being investigated would improve transparency for associates and vendors. This would allow associates to see when a rebuttal is being addressed, review those comments, and provide a clearer timeline to completion.
User Interface Updates - To improve process efficiency and address vendor frustrations around communications, the compliance team should explore increasing file size limits or implementing alternative methods for sharing larger evidence files.
Additionally, expanding character limits in comment sections or notation spaces could enhance information exchanges, cutting down on confusion for both vendors and associates.
Enhanced Search - By optimizing the search feature for Ship From Vendor Business Unit (SFVBU) Numbers, we can streamline the process of finding relevant information within the table. This eliminates the need for constant filtering and saves users valuable time.
System Comments - Currently, reopening rebuttals to add updated internal comments requires contacting IT, which can be time-consuming. Implementing functionality for senior associates to directly reopen and update rebuttal comments would improve efficiency.
Furthermore, vendors could track these comment updates within the system, providing a more transparent view of the investigation process, particularly for decisions involving waivers.
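As an illustration only (the deck does not prescribe an implementation; the role name, field names, and vendor-visible audit trail below are hypothetical), the suggested reopen-and-comment capability could look roughly like this:

from datetime import datetime, timezone

SENIOR_ROLES = {"senior_associate"}   # hypothetical role allowed to reopen rebuttals

def reopen_rebuttal(rebuttal, user, comment):
    """Reopen a closed rebuttal and append a comment that vendors can also see."""
    if user["role"] not in SENIOR_ROLES:
        raise PermissionError("Only senior associates may reopen a rebuttal")
    rebuttal["status"] = "in-progress"            # matches the suggested 'in-progress' status
    rebuttal["comments"].append({
        "author": user["name"],
        "text": comment,
        "visible_to_vendor": True,                # keeps the investigation transparent to vendors
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return rebuttal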
44. Project Documentation
Heuristic Evaluation
Project Documentation
EVET-3078 - UX Perform Heuristic Evaluation of Vendor Compliance System (VCS)
EVEP-325 Compliance Heuristic
Research Space - Figma
Internal Interview Recordings - Confluence
Documentation Space - Confluence