Metrics for Usability (UT) and User Experience (UX)
Learning outcomes
• Students are able to apply subjective questionnaires in their usability projects.
What is a usability metric?
• The measurement of users’ relative performance on a given set of test tasks. The most basic measures are based on the definition of usability as a quality metric: success rate, error rate, and users’ subjective satisfaction.
Benefit of UT metrics
• Track progress between releases. You cannot fine-tune your
methodology unless you know how well you're doing.
• Assess your competitive position. Are you better or worse than
other companies? Where are you better or worse?
• Make a Stop/Go decision before launch. Is the design good
enough to release to an unsuspecting world?
• Create bonus plans for design managers and higher-level
executives. For example, you can determine bonus amounts for
development project leaders based on how many customer-
support calls or emails their products generated during the year.
Objective metrics
• The time a task requires
• The error rate
• The success rate
Metric classification
1. Task Load/Mental
2. Usability metric
3. User Experience
Task load/mental
• Subjective Mental Effort Questionnaire – SMEQ (Sauro, 2009), with 1 item measuring task difficulty.
• NASA’s Task Load Index – NASA-TLX (1980), with 6 items: mental demand, physical demand, temporal demand, performance, effort, and frustration.
1. SMEQ - Subjective Mental Effort
Questionnaire
• Post-task rating of difficulty in
usability test
• Measure user satisfaction
immediately after the event,
usually the completion of a
task, potentially increasing its
validity.
• The question is administered per task, e.g., 7 tasks yield 7 sets of ratings.
Scales
• The more scale steps in a questionnaire
item the better, but with rapidly diminishing
returns.
• From 2 to 20 scale steps there is an initial rapid increase in reliability, but it tends to level off at about 7 steps.
• After 11 steps there is little gain in reliability
from increasing the number of steps. The
number of steps is important for single-item
assessments, but is usually less important
when summing scores over a number of
items.
• Attitude scales tend to be highly reliable
because the items typically correlate rather
highly with one another.
2. NASA’s task load index (1980)
• TLX is a subjective workload assessment tool that allows subjective workload assessments of operators working with various human-machine interface systems.
• The overall workload score is a weighted average of the ratings on six subscales: mental demand, physical demand, temporal demand, performance, effort, and frustration (a computation sketch follows below).
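A minimal Python sketch of this weighted-average computation, assuming ratings on a 0-100 scale and weights taken from the standard 15 pairwise comparisons (the unweighted Raw TLX variant would simply average the six ratings):

```python
# Minimal sketch of the NASA-TLX weighted workload score.
# Assumes: ratings on a 0-100 scale; weights come from the 15 pairwise
# comparisons, i.e. each subscale's weight is the number of times it was
# chosen, so the six weights sum to 15.

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx_weighted_workload(ratings: dict, weights: dict) -> float:
    """Return the overall workload score (0-100) for one participant."""
    if sum(weights[s] for s in SUBSCALES) != 15:
        raise ValueError("Pairwise-comparison weights must sum to 15.")
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Hypothetical example for one participant.
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 30}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(tlx_weighted_workload(ratings, weights))  # -> 55.0
```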
Definition
1. Mental Demand (low/high)
• How much mental and perceptual activity was required (e.g., thinking, deciding, remembering, looking, searching)?
2. Physical Demand (low/high)
• How much physical activity was required (for example,
pushing, pulling, turning, controlling, activating)?
Definition
3. Temporal Demand (low/high)
• How much time pressure did you feel due
to the rate or pace at which the tasks or
task elements occurred? Was the pace
slow and leisurely or rapid and frantic?
4. Performance
• How successful do you think you were in
accomplishing the goals of the task set
by the experimenter?
5. Effort
• How hard did you have to work (mentally and physically) to accomplish your level of performance?
6. Frustration level
• How insecure, discouraged, irritated, stressed, and annoyed versus secure, gratified, content, relaxed, and complacent did you feel during the task?
Administration formats: paper-and-pencil version and app version
2. Usability Metric
• System usability scale (SUS) developed by John Brooke at Digital
Equipment Corporation in the UK in 1986 as a tool to be used in
usability engineering of electronic office systems.
• Usability is defined by ISO 9241 Part 11 in terms of the context of use of the system.
• The scale is 0-100. It can be used to compare even systems that are outwardly dissimilar (a scoring sketch follows the item list below).
SUS
Strongly disagree Strongly agree
1 2 3 4 5
1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person
to be able to use this system.
5. I found the various functions in this system were well
integrated.
6. I thought there was too much inconsistency in this
system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going
with this system.
[Slide annotation: each item was tagged with a subscale label — Learnability, Efficiency, or Satisfaction.]
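The standard SUS scoring formula (odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is multiplied by 2.5) can be sketched in a few lines of Python:

```python
# Minimal sketch of standard SUS scoring: odd (positively worded) items
# contribute (response - 1), even (negatively worded) items contribute
# (5 - response); the sum of the ten contributions times 2.5 gives 0-100.

def sus_score(responses):
    """responses: ten ratings, each 1-5, in questionnaire order."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("Expected ten responses on a 1-5 scale.")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical example: one participant's answers to items 1-10.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 2]))  # -> 82.5
```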
4. After-Scenario Questionnaire (ASQ)
• Lewis, 2002
• Psychometric evaluation
• Measuring user satisfaction with a Likert scale
ASQ
Strongly disagree Strongly agree
1 2 3 4 5
1. Overall, I am satisfied with the ease of completing the
tasks in this scenario.
2. Overall, I am satisfied with the amount of time it took to
complete the tasks in this scenario.
3. Overall, I am satisfied with the support information (online help, messages, documentation) when completing the tasks.
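The overall ASQ score for a scenario is commonly reported as the mean of the answered items; a minimal sketch, assuming that convention:

```python
# Minimal sketch: overall ASQ score as the mean of the answered items
# (a common convention; unanswered items are skipped).

def asq_score(item1, item2, item3):
    answered = [r for r in (item1, item2, item3) if r is not None]
    if not answered:
        raise ValueError("At least one item must be answered.")
    return sum(answered) / len(answered)

print(asq_score(4, 5, None))  # hypothetical responses -> 4.5
```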
5. Net Promoter Score (NPS)
• The percentage of customers rating their likelihood to recommend a company, a product, or a service to a friend or colleague as 9 or 10.
• Those who respond with a score of 9 or 10 are called Promoters. They are considered likely to exhibit value-creating behaviors such as buying more, remaining customers for longer, and making more positive referrals to other potential customers.
• Detractors, who respond with a score of 0-6, are believed to be less likely to exhibit these value-creating behaviors. (A computation sketch follows below.)
NPS
• How likely is it that you would recommend our company/product/ service to a friend or colleague?
0 1 2 3 4 5 6 7 8 9 10
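As described above, the score is the percentage of Promoters minus the percentage of Detractors; a minimal sketch, assuming raw ratings on the 0-10 scale:

```python
# Minimal sketch of the Net Promoter Score: % promoters (9-10) minus
# % detractors (0-6), reported as a number between -100 and +100.

def net_promoter_score(ratings):
    if not ratings or not all(0 <= r <= 10 for r in ratings):
        raise ValueError("Ratings must be on a 0-10 scale.")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical example: ten respondents.
print(net_promoter_score([10, 9, 9, 8, 7, 7, 6, 5, 9, 10]))  # -> 30.0
```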
6. Technology Acceptance Model - TAM (1986)
• Perceived usefulness (PU) - "the degree to which a person believes that using a particular system would enhance his or her job performance". It means whether or not someone perceives that technology to be useful for what they want to do.
• Perceived ease of use (PEOU) - "the degree to which a person believes that using a particular system would be free from effort" (Davis, 1989). If the technology is easy to use, the barriers are overcome. If it is not easy to use and the interface is complicated, no one has a positive attitude towards it.
TAM and TRA model
• TAM posits that our beliefs about ease of use and usefulness affect our attitude toward using, which in turn affects our behavioral intention and actual use.
[TAM diagram: External Variables → Perceived Usefulness (U) and Perceived Ease of Use (E) → Attitude Toward Using (A) → Behavioral Intention to Use (BI) → Actual System Use.]
[TRA diagram: Beliefs and Evaluation → Attitude Toward Behavior (A); Normative Beliefs and Motivation to Comply → Subjective Norm; both → Behavioral Intention (B) → Actual Behavior.]
7. User Experience Questionnaire - UEQ (Laugwitz et al., 2008): 6 scales, 26 items
• Efficiency: I should be able to perform my tasks with the product quickly, efficiently, and in a pragmatic way.
• Perspicuity: The product should be easy to understand, clear, simple, and easy to learn.
• Dependability: The interaction with the product should be predictable, secure, and meet my expectations.
(These pragmatic-quality scales relate to usability, UT.)
8. User Experience Questionnaire - UEQ (continued)
• Stimulation: Using the product should be interesting, exciting, and motivating.
• Attractiveness: The product should look attractive, enjoyable, friendly, and pleasant.
• Novelty: The product should be innovative, inventive, and creatively designed.
(These scales relate to user experience, UX.)
https://www.ueq-online.org
How to use the Excel tool?
• Enter the data in the corresponding worksheet of UEQ_Data_Analysis_Tool_Version<x>.xlsx; all relevant computations (with the exception of significance tests) are then done automatically.
• To compare two products, use UEQ_Compare_Products_Version<x>.xlsx.
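The per-scale computation the Excel tool performs can be approximated in Python. The sketch below assumes responses have already been transformed to the -3..+3 range with consistent polarity (as the official tool does), and the item-to-scale mapping shown is only a placeholder, not the official UEQ assignment:

```python
# Rough sketch of the per-scale computation: average a scale's items per
# participant, then average across participants. Assumes responses were
# already transformed to -3..+3 with consistent polarity. The item-to-scale
# mapping below is a placeholder only.

from statistics import mean

SCALE_ITEMS = {
    "Perspicuity": [2, 4, 13, 21],   # placeholder item numbers
    "Efficiency": [9, 20, 22, 23],   # placeholder item numbers
}

def ueq_scale_means(responses):
    """responses: list of dicts, one per participant, item number -> value."""
    return {scale: mean(mean(p[i] for i in items) for p in responses)
            for scale, items in SCALE_ITEMS.items()}

# Hypothetical data for two participants.
responses = [{2: 1, 4: 2, 13: 1, 21: 0, 9: 2, 20: 1, 22: 2, 23: 1},
             {2: 2, 4: 1, 13: 2, 21: 1, 9: 1, 20: 0, 22: 1, 23: 2}]
print(ueq_scale_means(responses))  # -> {'Perspicuity': 1.25, 'Efficiency': 1.25}
```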
How to interpret the data?
Error bar
• If the measurement were repeated, the error bar describes the interval in which 95% of the scale means of these repetitions would be located. Thus, it shows how accurate your measurement is.
• The size of the error bar depends on the sample size (the more participants you have, the smaller the error bar typically is).
• The error bar also shows how much the different participants agree (the higher the level of agreement, i.e. the more similar the answers, the smaller the error bar).
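A minimal sketch of the underlying computation, using the normal approximation mean ± 1.96 × standard error (a t-quantile would be slightly wider for small samples):

```python
# Minimal sketch of the 95% confidence interval behind the error bar:
# mean +/- 1.96 * standard error (normal approximation).

from math import sqrt
from statistics import mean, stdev

def error_bar_95(scale_values):
    """scale_values: one scale mean per participant."""
    m = mean(scale_values)
    se = stdev(scale_values) / sqrt(len(scale_values))  # standard error
    return (m - 1.96 * se, m + 1.96 * se)

# Hypothetical per-participant scale means on the -3..+3 UEQ range.
print(error_bar_95([1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0]))
```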
How to interpret the data?
Cronbach's alpha values
• A measure of the consistency of a scale, i.e. it indicates whether all items in a scale measure a similar construct.
• Rules of thumb consider values >0.6 or >0.7 a sufficient level.
• The alpha coefficient is quite sensitive to sampling effects. A low alpha value can be the result of a sampling effect and does not necessarily indicate a problem with scale consistency.
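Cronbach's alpha can be computed directly from a participant-by-item matrix; a minimal sketch using the standard formula α = k/(k−1) · (1 − Σ item variances / variance of total scores):

```python
# Minimal sketch of Cronbach's alpha:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).

from statistics import variance

def cronbach_alpha(data):
    """data: rows are participants, columns are items of one scale."""
    k = len(data[0])                                     # number of items
    item_vars = [variance([row[i] for row in data]) for i in range(k)]
    total_var = variance([sum(row) for row in data])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical data: 5 participants x 3 items.
print(cronbach_alpha([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]))
```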
9. Software Usability Measurement Inventory - SUMI (1990)
• Measures users’ satisfaction (user experience).
• SUMI defined user experience with work-based software products in 1995.
• SUMI uses a rigorous scientific method of analysis and is backed up by over 25 years of industrial application.
• SUMI is copyrighted and licensed - students need to apply by filling in the online form at http://sumi.uxp.ie/about/appform.php
Software Usability Measurement Inventory - SUMI
• Set verifiable goals for user experience
• Track achievement of targets during product development
• Highlight good and bad aspects of an interface
• http://sumi.uxp.ie
How many respondents are required?
• A minimum of 20 is recommended; as few as 12 respondents can still be used.
• Get as many respondents as you can within your timeframe and
budget.
What are the measurement variables?
• Efficiency - users do their tasks in a quick, effective, and economical manner.
• Affect - the user’s general emotional reaction to the software.
• Helpfulness - the software communicates in a helpful way and assists in the resolution of operational problems.
• Control - the software responds to inputs and commands in an expected and consistent way.
• Learnability - users can become familiar with the software; the tutorial and interface materials are readable and instructive.
Link to the questionnaire: http://sumi.uxp.ie/en/
SUMI Analysis
SUMI items by percentile
• Item 20: "I prefer to stick to the functions that I know best." Percentile: 88. Verdict: More Agreement.
• Item 12: "Working with this software is satisfying." Percentile: 58. Verdict: No difference.
• Item 8: "I find that the help information given by this software is not very useful." Percentile: 39. Verdict: More Disagreement.
Items above the 60th percentile indicate that your respondents gave a more positive response to that item than expected from the standardisation database. These items are given in a black colour.
Items between the 60th and 40th percentiles indicate that the responses your respondents gave are pretty much in line with the standardisation database: no surprises here. These items are given in a blue colour.
Items below the 40th percentile indicate that your respondents gave a more negative response to that item than expected from the standardisation database. These items are given in a red colour. To interpret them, say to yourself "Respondents agree it is NOT true that [item wording]."
User Records
Participant Global Efficiency Affect Helpfulness Control Learnability
1 70 67 59 65 74 69
2 67 67 58 65 68 69
3 66 62 54 64 69 55
4 65 58 57 73 51 49
5 64 62 54 51 61 57
6 60 64 58 55 64 51
7 59 65 64 53 69 68
8 56 65 63 53 63 68
9 53 67 64 43 62 69
10 52 61 58 41 69 70
11 47 61 58 41 56 63
12 47 60 54 45 44 63
13 46 62 63 43 52 68
14 43 43 45 43 59 49
15 38 55 59 34 49 60
Participants are arranged in order of their Global scores, with the highest Global scores at the top of the table.
10. PrEmo: Measure Consumer Emotion & Product Experience
A unique, scientifically validated tool to instantly get insight into consumer emotions. People can report their emotions with expressive cartoon animations instead of relying on the use of words.
https://www.premotool.com
PrEmo intro and app
• https://youtu.be/yT2iciPYI0U
• https://youtu.be/6pu09rTehjs
11. Trust in Automated Systems
• Trust can affect how much people accept and rely upon increasingly automated systems (Sheridan, 1988).
• General trust - trustworthiness, honesty, loyalty, reliability, honor.
• Trust between people - trustworthiness, honesty, loyalty, reliability, integrity.
• Trust between humans and automated systems - trustworthiness, loyalty, reliability, honor.
Source: Jiun-Yin Jian
Trust
1 2 3 4 5 6 7
1. The system is deceptive.
2. The system behaves in an underhanded manner.
3. I am suspicious of the system’s intent, action or output.
4. I am wary of the system.
5. The system’s actions will have a harmful or injurious outcome.
6. I am confident in the system.
7. The system provides security.
8. The system has integrity.
9. The system is dependable.
10. The system is reliable.
11. I can trust the system.
12. I am familiar with the system.
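One plausible way to summarise the 7-point responses is to reverse-score the negatively worded items (items 1-5 above, an assumption based on their wording) and average all twelve items; reporting trust and distrust items separately is an equally valid choice. A minimal sketch of that convention:

```python
# Minimal sketch of one scoring convention for the 12-item trust scale:
# reverse-score the negatively worded items (assumed to be items 1-5 above)
# on the 1-7 scale, then average all items.

NEGATIVE_ITEMS = {1, 2, 3, 4, 5}   # assumption based on the item wording

def trust_score(responses):
    """responses: dict mapping item number (1-12) to a rating on 1-7."""
    adjusted = [(8 - r) if item in NEGATIVE_ITEMS else r
                for item, r in responses.items()]
    return sum(adjusted) / len(adjusted)

# Hypothetical responses for one participant.
responses = {1: 2, 2: 2, 3: 2, 4: 2, 5: 2,
             6: 6, 7: 6, 8: 6, 9: 6, 10: 6, 11: 6, 12: 6}
print(trust_score(responses))  # -> 6.0
```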
Summary
• Eleven questionnaires are covered in this presentation.
• The metrics develop from task load (human factors) to performance (usability testing) and users’ emotion (user experience).
• There are many questionnaires on the market; as a result, the validity of a questionnaire is the crucial part of choosing one.