Usability testing reveals website weaknesses and gives designers an opportunity to correct them before a site goes live. This is an example of an after-the-fact test of an old site that highlights the basics of how a usability test is done.
Google Chromecast Usability Report by Team User Friendly (Reed Snider)
This usability report for the Google Chromecast® was carried out by team User Friendly of the Bentley University Testing & Assessments course in the Human Factors in Information Design graduate program.
The Google Chromecast was given to a set of participants aged 45 to 80 who were instructed simply to "set up the device". This project was not organized by Google; all trademarks used are attributed to Google® under Alphabet®.
Intro talk on lean unmoderated user testing given at General Assembly, Los Angeles in spring 2013. Covers basics, benefits & limitations, when to test, what to test, and a case study.
This set of slides explains the importance of maintaining the user-friendly features of any website. It contains statistics gathered from previously conducted research and from sources across the internet to quantitatively demonstrate that importance.
Presentation for Taxonomy Bootcamp 2015 by Naomi Oorbeck & Jessica DuVerneay. Covers how taxonomy improved digital products, when to use a lightweight approach, planning & scoping lightweight work, and an overview of key skills and approaches to taxonomy development.
SXSW 2016 - Everything you think about A/B testing is wrong (Dan Chuparkoff)
Everything you've learned about A/B Testing is based on the fundamentally flawed belief that there's one right answer. But the era of mass-market, one-right-answers is over. A/B Testing is our most valuable tool in the battle to create a more engaging web. But our strategy is broken. Don't worry, we can gain a better understanding of our users with a little data science. And we can reinvent A/B Testing... I will show you how.
At Civis Analytics, we specialize in Data Science. From here, we can clearly see that all people are not the same. So why are A/B Tests designed to search for a single solution? In this session I'll show you where A/B Testing is headed next. See you in Austin!
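The talk's premise that different user groups can prefer different variants can be sketched as a per-segment breakdown of A/B results. A minimal illustration follows; all data, segment names, and variant labels are invented for the example:

```python
from collections import defaultdict

# Hypothetical A/B observations: (segment, variant, converted).
observations = [
    ("mobile", "A", True), ("mobile", "A", False),
    ("mobile", "B", True), ("mobile", "B", True),
    ("desktop", "A", True), ("desktop", "A", True),
    ("desktop", "B", True), ("desktop", "B", False),
]

def conversion_by_segment(rows):
    """Aggregate conversion rate per (segment, variant) pair,
    so each segment can reveal its own winning variant."""
    counts = defaultdict(lambda: [0, 0])  # [conversions, trials]
    for segment, variant, converted in rows:
        counts[(segment, variant)][0] += int(converted)
        counts[(segment, variant)][1] += 1
    return {key: conv / n for key, (conv, n) in counts.items()}

rates = conversion_by_segment(observations)
```

With real traffic, a breakdown like this is what surfaces cases where variant A wins for one segment while variant B wins for another, which a single pooled test would hide.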
This presentation will look at ways for using the Amazon Mechanical Turk system for conducting UX Research, with an emphasis on Specialized Techniques, and how to work around some of Mechanical Turk's inherent limitations. The intended tone is to provide an "Insider's Guide" to using Mechanical Turk ethically and effectively.
The speaker will share his experiences working with Amazon's Mechanical Turk, including both challenges and successes, along with insights gleaned.
Amazon Mechanical Turk is an online tool for recruiting and paying human subjects for completing specific work tasks. User Experience Professionals have been using Mechanical Turk for data gathering activities. It has been designed to link to supplemental tools and resources, such as the Qualtrics Survey Management system.
Talks@Coursera - A/B Testing @ Internet Scale (courseratalks)
This tech talk will describe how to build an experiment platform that can handle large-scale experiments. The talk will also discuss several best practices in designing and analyzing online experiments learned from companies like Coursera, Microsoft and LinkedIn.
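Analyzing an online experiment typically comes down to comparing conversion rates between variants. A minimal stdlib-only sketch of the standard two-proportion z-test (the counts here are made up for illustration):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic comparing conversion rates of control (A) and
    treatment (B); |z| > 1.96 is significant at the 5% level."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented counts: 200/1000 conversions for A, 260/1000 for B.
z = two_proportion_z(200, 1000, 260, 1000)
```

A platform of the kind the talk describes automates this sort of computation (and guards against pitfalls such as peeking and multiple comparisons) across thousands of concurrent experiments.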
About the Speakers
Ya Xu has been working in the domain of online A/B testing for over 4 years. She currently leads a team of engineers and data scientists building a world-class online A/B testing platform at LinkedIn. She also spearheads taking LinkedIn's A/B testing culture to the next level by evangelizing best practices and pushing for broad-based platform adoption. She holds a Ph.D. in Statistics from Stanford University.
Chuong (Tom) Do currently leads a team of data engineers and analysts on the Analytics team at Coursera, which is responsible for data infrastructure and quantitative analysis in support of the product and business. He completed his Ph.D. in Computer Science at Stanford University in 2009 and worked as a scientist at the personal genetics company 23andMe until 2012; his research has spanned the fields of machine learning, computational biology, and statistical genetics.
Chapter stage - Building and evaluating prototypes (Renner Modafares)
This presentation describes how to build and evaluate prototypes using the Enterprise Design Thinking framework by IBM.
This content was presented to Hortolândia DT Chapter members in February/2020, during the weekly Thursday DT session, created to share knowledge and open space to practice the Enterprise Design Thinking (EDT) framework.
Introduction to Usability Testing: The DIY Approach - GA, London January 13th... by Evgenia (Jenny) Grinblo
The slides from my General Assembly workshop on January 13th, 2013 (https://generalassemb.ly/education/introduction-to-usability-testing-the-diy-approach)
ABOUT THIS WORKSHOP
Usability testing can quickly uncover areas of an interface that frustrate users and hurt business goals but many teams put it off due to budget, time, or training concerns.
This workshop will take you through a do-it-yourself approach to usability testing. We'll cover the basics (benefits, recruiting, and how to plan a test), learn how to facilitate a test to get reliable results, and see how to use the results to move usability improvements forward. You'll walk away with the tools to run a complete usability test right away.
TAKEAWAYS
Learn why and when to hold usability testing
Learn practical tools and methods to overcome time, budget or training concerns that block user testing from happening
Shift the conversation from opinions and hunches to proven usability problems that your team can solve together
Writing Key Findings, General Trends and Recommendations (Virginia Bautista)
After listening comes writing. . . Here are more tips in delivering useful reports out of disorganized and sometimes chaotic social media conversations.
What does it take to get your findings read, heard and acted upon while they are still relevant? How do you get the people who need to act on your research interested in your findings? How do you make your findings and reports more usable?
UXPA 2013: Effectively Communicating User Research Findings (Jim Ross)
Communicating user research findings effectively so that people can understand them, believe them, and know how to act on the recommendations can be challenging. You may feel that you’ve delivered a successful presentation, but later you find that the recommendations aren’t acted upon. Ideally, our clients are as interested in our user research findings and recommendations as we are and find them valuable, but without the proper understanding, clients can express a variety of negative reactions. This presentation will discuss best practices in communicating user research findings to avoid these problems and to lead to better outcomes.
How to recruit, screen, and interview high-value customers, synthesize findings, and produce actionable recommendations with just a little bit of money and a whole lot of enthusiasm.
Delivering Results: How Do You Report User Research Findings? (Bob Thomas)
The long, textual written report is dead, isn’t it? So how do you deliver your findings to your clients? Is it PowerPoint? An e-mail? A spreadsheet? Post-it notes? And what do you include? Positive findings? Screenshots with callouts? Just issues? Or recommendations as well? Are they prioritized?
If you ask our panelists, some of us have developed templates that we use and modify for each research activity, and others change the deliverable based on the activity and client.
Jen McGinn, Principal Usability Engineer, Oracle
Eva Kaniasty, Founding Principal, RedPill UX
Dharmesh Mistry, Usability Specialist, Acquia
Kyle Soucy, Founding Principal, Usable Interface
Carolyn Snyder, Founding Principal, Snyder Consulting
Early Signal Testing: Designing Atlassian's New Look (Atlassian)
You probably have noticed the new look of Atlassian's Cloud products. Our new Design Guidelines took many months to create, and our team had many tough decisions to make. Luckily, we incorporated customer research along the way to guide us.
One of our most valuable research tools is called “early signal testing”, and we think it can help you too. Early signal testing can help you gain confidence in a direction, rather than being paralyzed by a choice. It can help assess your design's usability, clarity, comprehension, and more. This talk explains how your team can gather measurable user feedback in as little as a week, for even the very biggest of problems.
Slides to my talk at NDC Oslo 2016: How to do a really good company wide product demo.
See how I tried to improve the format for informing everyone in the company about the latest product releases, the underlying user value, and the KPIs each product can drive. It was an iterative process applying PDCA, "start from where you are", and continuous-learning principles.
I taught a class titled "You Don't Know C.R.A.P. about UX/UI" for SkillShare Philadelphia on 8/23/11. For more information on the class visit: http://www.skillshare.com/You-Dont-Know-CRAP-about-UX-UI/1632896614/
From Use to User Interface- This 3-4 hour tutorial describes a practical approach to translating the goals users would like to achieve and the tasks they wish to accomplish into user interface designs that effectively support those goals and tasks.
We currently have a project in our Human Computer Interaction (HCI) course in which we are developing a mobile app named "Announcer".
This is the slide presentation for our "Announcer" mobile app project.
Visit our blog to learn more:
yujinnohikari.blogspot.com
prototyping software credit to: justinmind.com
A four-day workshop using the LEGO Serious Play (LSP) method for companies designing a new service or redesigning an existing one. The workshop should be conducted with a certified LSP facilitator to help the organization.
This design was presented at the LEGO Serious Play Annual Conference 2015 in Billund, Denmark.
Sum of the Parts Speaker Series - Experience Engineering and UX (vincebohner)
Should designers code? Is that even the right question? And what is an Experience Engineer? Find out how our UX team is experimenting with processes, team skills and organization to be more innovative, agile and rigorous about hypothesis driven design.
Newbie UX: Something I learned about UX (Business vs Design), by Soon-Aik Chiew
Sharing some tips for those who are new to UX and wish to learn more about it. The findings are based on my past mistakes, experience, and observations.
http://blog.netizentesting.com/newbie-ux-something-learned-user-experience/
I'm currently drafting a material on Startup (Digital) Marketing: Growth Hacking Thru UX. Stay Tuned.
To read more articles, visit: blog.NetizenTesting.com
Orangescrum User Role Management Add-on User Manual (Orangescrum)
The User Role Management Add-on lets you define each user's access in your Orangescrum account with an appropriate role and set of responsibilities. Using this add-on you can:
Create Role Groups
Define Roles
Add users to defined roles
Define specific actions each role can perform in the tool
Learn More: https://www.orangescrum.org/add-on
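As a rough illustration of the role-to-action mapping the add-on describes, here is a hypothetical permission check. This is not Orangescrum's actual data model or API; the role and action names are invented for the example:

```python
# Illustrative role-based access control: each role maps to the set
# of actions it may perform (names invented, not Orangescrum's).
ROLE_PERMISSIONS = {
    "admin":  {"create_task", "assign_task", "delete_task", "manage_users"},
    "member": {"create_task", "assign_task"},
    "viewer": set(),
}

def can_perform(role, action):
    """Return True if the given role is allowed to perform the action;
    unknown roles get no permissions."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The point of centralizing permissions in one mapping, as the add-on does, is that changing what a role may do requires editing a single definition rather than every place the check occurs.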
6. Usability Testing Regime
- In-depth laboratory usability testing: assessment and acceptance tests
- Usability and visual expert reviews of the iDivvy prototype
- Paper mock-ups and iDivvy digital prototyping
- Preliminary survey, informal interviews, focus groups, and follow-up surveys
7. Our main competitor is face-to-face communication. Task completion time needs to reflect average human "talking" speed.
Pros: Quick; Context-aware
Cons: Not enduring; Not (necessarily) a recurring reminder
8. Pen & Paper
Pros: Convenient; Enduring
Cons: No receipt feedback; Unorganized
12. Assessment findings:
- Link Chore from Chore List Panel: even though listed chores were not shaped like buttons, most users attempted to touch chores for more info.
- Invisible Lower Navigation Bar: the assessment indicated users did not first look at the bottom navigation panel even though it was physically closer to them.
- User Icon Selection: gender icons were an indication of confusion, an unintended mismatch with users' mental models.
13. Acceptance Testing

Function | Experience Needed | Time to Complete
Add User | The user must have a basic understanding of a keyboard layout in order to use the onscreen keyboard to type in any optional fields. They can select an avatar or picture to represent the user being added; this is the only requirement of this functionality. | Less than 30 seconds. The user was able to select an avatar.
Select User | When adding a Task, the user is required to select a person to designate the task to. This required them to make a selection from the list of users created using the Add User function. | Less than 3 seconds, depending on their picture-association skills.
Remove User | The user must go into "Remove User" mode and make a selection for which user to remove. | Less than 5 seconds, based on consideration.
Add Chore | The user must be able to pick a picture to represent the chore. An optional chore name can be added. | Less than 1 minute.
Select Chore | The user must be able to select a chore to assign to a person. | Less than 20 seconds.
Remove Chore | The user must put the chore list into "Remove Chore" mode and then click the chore to remove. | Less than 10 seconds.

Performance Specification | Metric | Result
View Task | 100% < 30 seconds | PASSED
Create Task | 100% < 20 seconds | PASSED
Show User Tasks | 100% < 30 seconds | PASSED
Add New User | 100% < 20 seconds | PASSED
Select User | 100% < 10 seconds | PASSED
Remove User | 100% < 10 seconds | PASSED
Create a Chore | 100% < 20 seconds | PASSED
Schedule a Chore | 100% < 10 seconds | PASSED
Un-schedule a Chore | 100% < 10 seconds | PASSED
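The performance specifications above amount to a simple rule: a task passes only if 100% of measured completion times beat its threshold. A minimal sketch of that check (thresholds taken from the table; the measurement values are invented):

```python
# Spec thresholds in seconds, from the performance table above.
SPECS = {
    "View Task": 30, "Create Task": 20, "Show User Tasks": 30,
    "Add New User": 20, "Select User": 10, "Remove User": 10,
}

def evaluate(measurements):
    """Return PASSED per task only if every measured completion
    time is under the threshold (the 100% criterion), else FAILED."""
    return {
        task: "PASSED" if all(t < SPECS[task] for t in times) else "FAILED"
        for task, times in measurements.items()
    }

# Invented example measurements for two of the tasks.
results = evaluate({"View Task": [12, 25, 29], "Select User": [4, 11]})
```

A stricter regime might also report which individual sessions exceeded the threshold, so designers can review those recordings rather than just the aggregate verdict.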