Usability is how successfully and satisfactorily a person uses a product, document, website, or app to achieve their goals. Good usability is measured by five factors: learnability, efficiency, memorability, errors, and satisfaction.
Organizing Your First Website Usability Test - WP Campus 2016, by Anthony D. Paul
You’ve built a shiny, new WordPress site. You asked your co-worker and your boss if they like it and they both do. However, you’re lying awake at night wondering if you’re missing something—because you know you’re not the end user. You yearn for actionable feedback.
In this talk, I’ll distill my background in usability research into a how-to framework for taking your site and conducting your first unmoderated usability test. I’ll cover why and when you should be running usability tests; how to set research goals and draft a script for them; setting up your lab environment and capturing feedback; and best practices for facilitation, minimizing bias, keeping users on task and gleaning the most from each brief test.
Attendees will walk away with enough information to discuss the value of usability testing with decision-makers, as well as a tactical foundation for organizing and running their own usability study.
Edinburgh Napier University offers a Work Based Learning Module which integrates professionalism into the various courses.
In this presentation I praise the merits of doing a placement.
Why You Need to Do a Pilot - Mitch Weisburgh, Founder, Games4Ed & Scott Brews..., by SeriousGamesAssoc
Pilots provide valuable feedback; they can springboard into paid engagements and support sales and marketing. Or they can be a waste of time, lead nowhere, and actually hinder growth.
We’re going to go through an exercise in how to screw up your pilots, so that it doesn’t happen to you in real life.
I attended the Pittsburgh Summer LearnLab at Carnegie Mellon in summer 2016. The work I did during the week of the LearnLab went into this presentation. I fit two linear regression models, two support vector classification models, a hierarchical clustering analysis, and a latent class analysis.
Visit BobBodily.com for more information about my research.
Introduction to Usability Testing for Survey Research, by Caroline Jarrett
The basics of how to incorporate usability testing into the development process of a survey. Workshop first presented at the SAPOR conference, Raleigh, North Carolina, USA, October 2011 by Emily Geisen of RTI and Caroline Jarrett of Effortmark.
Using real-time dashboards to improve student engagement in virtual learning ..., by Bob Bodily
In this presentation, I discuss the technical requirements for collecting learning analytics data in an open environment, the analytics system we have created to facilitate real-time data collection, screenshots of our student and instructor dashboards, and some statistical analyses conducted to improve our dashboards.
Visit BobBodily.com for more information about my research.
Organizing Your First Website Usability Test - WordCamp Boston 2016, by Anthony D. Paul
You’ve built a shiny, new WordPress site. You asked your grandma and your client if they like it and they both do. However, you’re lying awake at night wondering if you’re missing something—because you know you’re not the end user. You yearn for actionable feedback.
In this talk, I’ll distill my background in usability research into a how-to framework for taking your site and conducting your first moderated usability test. I’ll cover what to look for, best practices in facilitation, tools on the cheap, and how to glean the most from a brief window of time.
Designing, developing, and evaluating a real time student dashboard, by Bob Bodily
We discuss the technical infrastructure needed to capture student data in an open learning environment (beyond the LMS), our iterative design process along with dashboard prototypes, and our dashboard evaluation results from focus groups and a survey.
Visit BobBodily.com for more information about my research.
It is very important to be able to evaluate apps, because there are so many out there and we are confronted with choices. Some are good, some bad. Even the good ones might not serve our needs, or they might serve our needs but have an entry level too high for our students to benefit from them.
Examining the effect of a real time student dashboard on student behavior and..., by Bob Bodily
In this presentation we present a randomized controlled trial conducted to determine the effect of a real-time student dashboard on student behavior and student achievement. We also present some of our design changes to increase student use of our dashboards.
Visit BobBodily.com for more information about my research.
LAK '17 Trends and issues in student-facing learning analytics reporting sys..., by Bob Bodily
This presentation was given at the 7th Learning Analytics and Knowledge conference (2017) in Vancouver, BC. It presents the trends and issues in student-facing learning analytics reporting research as identified by a literature review including over 90 articles.
The RISE Framework: Using learning analytics for the continuous improvement o..., by Bob Bodily
We present the Resource Inspection, Selection, and Enhancement (RISE) framework, a learning analytics framework designed to enable teachers to engage in the continuous improvement process. This framework helps identify resources that should be evaluated by a teacher or an instructional designer.
Visit BobBodily.com for more information about my research.
Are you looking to gather insights from your potential customers? When it comes to your prospects, do you really know what they want? Many startup teams tell us they are missing the key information they need to get into their users' minds. Without this information, products often fall short of delighting users.
There are those who believe that user research and usability testing must be a complex, scientific process that takes lots of time, money, and resources. In the real world, however, most startups don't have the luxury of spending weeks or months on user research. That's where guerrilla research techniques come into play.
Usability Testing Basics: What's it All About? at Web SIG Cleveland, by Carol Smith
Presented to Web SIG Cleveland on May 21, 2011 at Notre Dame College in South Euclid (Cleveland), Ohio.
Learn all you need to get started:
- Where you can conduct studies (does it have to be in a lab?)
- Types of studies (RITE, think-aloud, etc.)
- Tips for recruiting participants
- Tips for interacting with participants without biasing the study
- Preparing for the study (materials needed, forms, etc.)
- Guidance for analyzing the study
Using Automated Testing Tools to Empower Your User Research, by UserZoom
In this webinar, you'll learn:
- Guidelines for when to use moderated vs. unmoderated testing
- How to structure studies and set up tasks to get valid research results that achieve business objectives for testing
- Tried-and-true tricks for avoiding the most common pitfalls of unmoderated testing
- Advice for recruitment, screening, and use of online panels
- How to use automated testing with agile design and development sprints to accommodate tight timelines and satisfy usability needs
Usability Primer - for Alberta Municipal Webmasters Working Group, by NormanMendoza
Presentation provided on December 1, 2006. References:
- "A Practical Guide to Usability Testing" by Joseph S. Dumas and Janice C. Redish
- The Elements of User Experience, diagram by Jesse James Garrett
Learn how to use prototyping and usability testing as a means to validate proposed functionality and designs before you invest in development. Sometimes there is a huge disconnect between the people who make a product and the people who use it. Usability testing is vital to uncovering the areas where these disconnects happen. In this symposium you will learn the steps to conduct a successful usability test, including tips and real-life examples on how to plan the tests, recruit users, facilitate the sessions, analyze the data, and communicate the results.
Usability engineering is a field that is concerned generally with human-computer interaction and specifically with devising human-computer interfaces that have high usability or user friendliness. It provides structured methods for achieving efficiency and elegance in interface design.
Yes U can! - User Checks; iterative usability testing with actionable resultsAnouschka Scholten
A 45-minute workshop at @UXCamp Amsterdam 2018 about User Checks, a method for agile usability testing that gets to actionable results fast. Learning by doing: a user check in 45 minutes.
As part of the EBI Interfaces forum, Francis Rowland and Dado Marcora give a talk to promote rapid, lightweight usability testing, followed by a simple demo of a typical test.
1. Why care about usability?
Have you ever gotten lost in a website? Left a site without finding the information you wanted? Struggled to build something using instructions? Used a manual that isn't effective?
“You never get a second chance to make a first impression.”
2. What is usability?
How successfully and satisfactorily a person uses a product, document, or website to achieve goals effectively & efficiently.
Easy to learn
Easy to remember
Efficient
Satisfying
Error free
3. Good Usability Means…
Easy to remember (Memorability): users should be able to return to the document or website after some time without having to learn things all over again.
Efficient to use (Efficiency): users who have learned how to accomplish a task should maintain a high level of productivity.
4. Good Usability Means…
Errors: users should accomplish tasks free of errors and recover easily from any errors they make.
Easy to learn (Learnability): users should be able to quickly start working on a task.
Satisfying (Satisfaction): users should like using the product, document, or website.
5. Role of Usability in the Workplace
Usability is included in the review phase of the problem-solving approach: test the usability of the document.
Usability testing can occur during development or after the document has been put into use.
6. Where does usability testing take place?
In a lab: allows for a controlled environment and makes observation easier.
Outside a lab: may only need a conference room.
In the field: tests real-life situations and environments.
7. Your Usability Test Activity
Southwest Airlines publishes a magazine that is distributed on its airplanes. An article provides instructions for how to make a dollar-bill origami squirrel.
8. Planning Usability Tests
Establish a team. The number of people depends on the size of the project, the location, & the number of users.
Your team for your usability test activity:
1 facilitator
1 observer
1 user
9. Planning Usability Tests
Define the user profile:
Identify the people who typically use the document or website.
Define subgroups of users based upon frequency of use or expertise (if necessary).
Consider factors such as age, education, gender, experience, stress, attitudes, motivations, and where they would most often use the document.
10. Planning Usability Tests
Recruit & screen participants for the test:
Recruit participants, offering food, money, or a gift certificate as compensation.
Screen participants to make sure they match the user profile.
11. Planning Usability Tests
Establish the issues & goals of the test:
By establishing clear goals for users that address the issue at stake
By understanding users' needs, desires, & preferences
By providing concrete and quantifiable means to measure test results
12. Your Usability Test Activity
Your issues:
Will users be able to quickly and easily make a squirrel from a dollar bill following the instructions?
Will users be able to easily and quickly recover if they make an error?
Your goals:
Can users complete the task in 30 minutes or less?
Can users make the dollar bill origami squirrel following the 16 steps without making an error?
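Quantifiable goals like these lend themselves to a simple pass/fail tally. A minimal sketch in Python; the participant IDs, times, and error counts below are invented for illustration:

```python
# Hypothetical session results for the dollar-bill origami squirrel test.
# Goals from the test plan: finish in 30 minutes or less, and complete
# the 16 folding steps without making an error.
sessions = [
    {"user": "P1", "minutes": 24, "errors": 0},
    {"user": "P2", "minutes": 31, "errors": 2},
    {"user": "P3", "minutes": 28, "errors": 1},
]

def met_goals(session, time_limit=30):
    """A session meets the goals if it finished on time and error-free."""
    return session["minutes"] <= time_limit and session["errors"] == 0

success_rate = sum(met_goals(s) for s in sessions) / len(sessions)
print(f"Success rate: {success_rate:.0%}")
```

Scoring each session against the stated time and error goals gives a task success rate to report alongside the qualitative observations.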
13. Planning Usability Tests
Conduct the usability tests:
Tell users what they will do, without suggesting how they should do it.
The facilitator uses a test facilitator script to introduce the test and the roles of facilitator, observer, & user.
14. Planning Usability Tests
Collect data from the usability tests:
Observer(s) collect data by video or face-to-face observation.
Record observations and take notes using a pre-made, printed data collection sheet.
15. Planning Usability Tests
The facilitator administers a post-test questionnaire to users to collect neutral feedback about their experiences after the task is done.
16. Analyze Findings
Efficiency: time how long it takes to complete a task.
Error rate: count the number of deviations from a path that leads to the completion of a task, plus any backtracking or restarting of a task.
Learnability: observe how quickly users come to understand the layout of a document & the navigational layout of a website, and perform similar actions throughout testing.
Satisfaction: ask users open-ended questions about their experience, or have them take a System Usability Scale survey.
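The System Usability Scale mentioned here has a standard scoring rule: each odd-numbered (positively worded) item contributes its rating minus 1, each even-numbered (negatively worded) item contributes 5 minus its rating, and the sum is multiplied by 2.5 to yield a 0-100 score. A small sketch; the example ratings are invented:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 ratings.

    Odd-numbered items (1st, 3rd, ...) are positively worded and contribute
    rating - 1; even-numbered items are negatively worded and contribute
    5 - rating. The summed contributions are scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings on a 1-5 scale")
    total = 0
    for i, rating in enumerate(responses):
        total += (rating - 1) if i % 2 == 0 else (5 - rating)
    return total * 2.5

# One participant's (hypothetical) ratings for items 1..10:
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # → 90.0
```

Averaging the per-participant scores gives a single satisfaction number; scores around 68 are commonly treated as average.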
17. Report Results & Make Recommendations
By identifying the audience for the report
By choosing a format that fits the audience: an oral report, a written report, or a PowerPoint presentation
By summarizing methodology, results, and recommendations (when appropriate)
18. Follow Usability Report Guidelines
Write in a clear style.
Include charts or diagrams.
Include the actual voices and words of the participants to support findings & recommendations.
19. Ethical Considerations in Usability Testing
Brief participants about the test process.
Create unbiased questionnaires.
Use consent and anonymous-disclosure forms.
Get permission to test in the workplace & to video record/take pictures.
Editor's Notes
Usability is not just about how useful a product or document is, or even about how easy it is to use. It is about how the user perceives the product or document. The best test of usability is whether or not users find that a document or product (including documentation that helps users use a product) helps them reach their goals.
MEELS focuses on:
• Memorability: If a user has used the system before, can he or she remember enough to use it effectively the next time, or does the user have to start over and learn everything again?
• Efficiency: Once an experienced user has learned to use the system, how fast can he or she accomplish tasks?
• Errors considers three factors: how many errors users make while using the product, how serious those errors are, and how easily users recover from them.
• Learnability: How fast can a user who has never seen the user interface before learn it sufficiently well to accomplish basic tasks?
• Satisfaction: How pleasant is the system to use? How much do users like it?
Workplace documents are often reviewed to make sure that they are usable.
“Usability testing” can occur during product development or document design, or after a document has been put into use.
More companies are recognizing the value of User-Centered Design, a process that asks “What does the user want?” at every stage of product development. Some companies take a team approach to user-centered design, with members of the team conducting usability testing. Other companies adopt a user-centered design process but do not use a team approach to usability testing, preferring to establish usability centers where experts in human factors and usability perform testing services for internal clients on their products. Typically, companies using this latter approach invite interested developers to sit as observers during testing. When this isn't possible, a video-highlights tape and a report become the vehicle to communicate the results of testing. In other cases, a company may hire a usability consulting firm to conduct the test and deliver the results. Companies with executive viewing rooms provide a way for visitors from the client company to observe the test administrator at work during testing. Using external evaluation in this way has the advantage of assuring that the tests are well run and expertly performed.
Jakob Nielsen, referred to as the guru of web page usability, has demonstrated that you need to test with at least 15 users to discover all the usability problems in the design. The ultimate user experience is improved much more by three tests with 5 users than by a single test with 15 users.
One reason for this strategy is to better distribute your budget instead of blowing everything on a single, elaborate study. You also want to run multiple tests because the real goal of usability engineering is to improve the design, not just to document its weaknesses. After the first study with 5 users has found 85% of the usability problems, you will want to fix these problems in a redesign.
After creating the new design, you need to test again. Since nobody can design the perfect user interface, there is no guarantee that the new design does in fact fix the problems. A second test will discover whether the fixes worked or whether they didn't. Also, in introducing a new design, there is always the risk of introducing a new usability problem.
Also, the second test with 5 users will discover most of the remaining 15% of the original usability problems that were not found in the first test.
Finally, the second test will be able to probe deeper into the usability of the fundamental structure of the site, assessing issues like information architecture, task flow, and match with user needs. These important issues are often obscured in initial studies where the users are stumped by stupid surface-level usability problems that prevent them from really digging into the site. So the second test will both serve as quality assurance of the outcome of the first study and help provide deeper insights as well. The second test will always lead to a new (but smaller) list of usability problems to fix in a redesign. And the same insight applies to this redesign: not all the fixes will work; some deeper issues will be uncovered after cleaning up the interface. Thus, a third test is needed as well.
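The 85% figure above comes from Nielsen and Landauer's problem-discovery model, which estimates the share of usability problems found by n users as 1 - (1 - L)^n, where L ≈ 0.31 is their published estimate of the average proportion of problems a single user reveals. A quick sketch of the curve:

```python
def share_of_problems_found(n_users, discovery_rate=0.31):
    """Expected share of usability problems found by n_users, per the
    Nielsen-Landauer model 1 - (1 - L)^n with L ~= 0.31."""
    return 1 - (1 - discovery_rate) ** n_users

for n in (1, 3, 5, 15):
    print(f"{n:>2} users -> {share_of_problems_found(n):.0%} of problems")
```

Five users land at roughly 84-85%, which is why the notes recommend three rounds of five users rather than one round of fifteen: each redesign gets its own pass over the remaining problems.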