Chapter 5
Evaluation Techniques
Evaluation Techniques
• Evaluation
– tests usability and functionality of system
– occurs in laboratory, field and/or in collaboration with
users
– evaluates both design and implementation
– should be considered at all stages in the design life
cycle
Goals/need of Evaluation
• assess extent of system functionality
• assess effect of interface on user
• identify specific problems
• assess design issues
Goals/need of Evaluation
• Usability Testing: Observing users to identify navigation,
comprehension, and task completion issues.
• User Satisfaction: Gathering feedback to understand preferences
and inform design improvements.
• Effectiveness and Efficiency: Assessing task performance metrics
to improve system functionality.
• Identifying Design Flaws: Using methods like heuristic evaluation
to catch issues early.
• Accessibility: Ensuring interfaces are compatible with assistive
technologies.
• Iterative Design Improvement: Continuously refining designs
based on user feedback.
• Validation of Design Decisions: Comparing alternative solutions to
make informed choices.
Introduction to Qualitative
and Quantitative Research
• Reference: Jenny Preece, Helen Sharp,
Yvonne Rogers, Interaction
Design: Beyond Human-Computer
Interaction, Wiley (2015)
• Chapter 8: Data Analysis
• 8.1 Qualitative
• 8.2 Quantitative
• 8.3 Simple quantitative analysis
• 8.4 Simple qualitative analysis
Introduction: Quantitative
• In short, quantitative user research is research that yields
numerical results, while qualitative research results in data
that you can’t as easily slot into a calculation.
• The type of research you conduct is very much reliant on what your
research objectives are and what kind of data will best help you
understand your users’ needs.
• Quantitative data is data that is in the form of numbers, or that can
easily be translated into numbers. For example, the number of
years’ experience the interviewees have, the number of
projects a department handles at a time, or the number of
minutes it takes to perform a task.
Quantitative
• Quantitative user research is the process of collecting and analyzing
objective, measurable data from various types of user testing.
• Quantitative data is almost always numerical and focuses on the statistical,
mathematical, and computational analysis of data. As the name suggests,
quantitative user research aims to produce results that are quantifiable.
• Examples of quantitative data
• Quantitative data answers questions of:
• How many?
• How often?
• How much?
• quantitative data is also often simple to collect, quicker to analyze, and easier to
present in the form of pie charts, bar graphs, etc.
Quantitative
• For example, in describing a population, a quantitative analysis might
conclude that the average person is 5 feet 11
inches tall, weighs 180 pounds, and is 45 years old.
Qualitative analysis
focuses on the nature of something and can be represented by themes,
patterns, and stories. For example, in describing the same population, a
qualitative analysis might conclude that the average person is tall, thin, and
middle-aged.
Statistical analysis of
quantitative data
• Three common measures of central tendency are the mean, median, and mode.
• Mean refers to the commonly understood interpretation of average:
i.e. add together all the figures and divide by the number of figures you
started with.
• The median is the middle value of the data when the numbers are ranked.
• The mode is the most commonly occurring number. For example, in the set
of data (2, 3, 4, 6, 6, 7, 7, 7, 8), the median is 6, the mode is 7, and the
mean is 50/9 = 5.56. In this case, the difference between the different averages
is not that great. However, consider the set (2, 2, 2, 2, 450).
• Now the median is 2,
• the mode is 2, and
• the mean is 458/5 = 91.6!
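These calculations can be reproduced with Python's built-in statistics module, using the two data sets from this slide:

```python
import statistics

data = [2, 3, 4, 6, 6, 7, 7, 7, 8]
print(statistics.mean(data))    # 50/9 = 5.56 (to 2 d.p.)
print(statistics.median(data))  # 6 (middle value when ranked)
print(statistics.mode(data))    # 7 (most frequent value)

# an extreme outlier drags the mean but not the median or mode
skewed = [2, 2, 2, 2, 450]
print(statistics.mean(skewed))    # 458/5 = 91.6
print(statistics.median(skewed))  # 2
print(statistics.mode(skewed))    # 2
```

The second set shows why the median and mode can be more robust summaries than the mean when the data contain outliers.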
Qualitative user research
• Qualitative user research is the process of
collecting and analyzing non-numerical data in the
form of opinions, comments, behaviors, feelings,
or motivations. Qualitative data aims to give an in-
depth look at human behavioral patterns.
• Examples of qualitative data
• Qualitative data cannot be as easily counted and
funnelled into a calculation as its quantitative cousin.
Where quantitative research often gives an overarching
view, qualitative research takes a deeper dive into
the why.
• Qualitative research often takes the
form of user surveys, interviews,
observations, heuristic analysis, and
focus groups.
• Qualitative data is not expressed in
numerical terms. For example,
qualitative data includes descriptions,
quotes from interviewees, vignettes of
activity, and images.
Qualitative vs Quantitative
• For example, on a questionnaire,
• questions about the participant's age or number of software
packages they use a day will result in quantitative data, while
any comment fields will result in qualitative data.
• In an observation, quantitative data you may record
includes the number of people involved in a project, or how
many hours a participant spends trying to sort out a problem
they encounter,
• While notes about the feelings of frustration, or the nature of
interactions between team members, are qualitative data.
Qualitative Data Collection
Techniques
• In depth Interviewing
• Focus Groups
• Participant Observations
• Ethnographic Studies
• Projective Techniques
• - Interviews: One-on-one conversations that go deeper into the topic at hand.
- Case studies: Collections of client stories from in-depth interviews.
- Expert opinions: Highly-researched information from well-informed sources.
- Focus groups: In-person or online conversations with small groups of people to
hear their views.
- Open-ended survey questions: A survey response that lets respondents
express their thoughts.
- Observational research: Observing people using a product or service in daily
life.
Quantitative Data Collection
Techniques
• Closed-ended questions
• Multiple-choice questions
• e.g. closed-ended questions:
• Did you finish your homework?
• Have you met John before?
• Do you like pizza?
• Did you receive my email?
• Are you feeling better today?
• MCQ
• e.g. A researcher is conducting a study on consumer behavior in a specific industry. They have
gathered qualitative data through interviews with participants. Which approach to coding
would the researcher most likely use if they want to allow themes to emerge directly from the
data without predetermined categories?
• A) Deductive coding
• B) Comparative coding
• C) Inductive coding
• D) Grounded theory coding
Qualitative analysis Process
• Data collection, data analysis, and the development
and verification of relationships and conclusions form an
interrelated and interactive set of processes
• Allows researcher to recognise important themes,
patterns and relationships as you collect data
• Allows you to re-categorise existing data to see
whether themes and patterns and relationships
exist in the data already collected
• Identifying the null hypothesis and validating it
• Inductive and deductive coding
Identifying Recurring Patterns
or Themes
Themes:
Word repetitions
Compare and contrast
Searching for missing information
In qualitative research, researchers often analyze data to identify common patterns,
themes, or trends that emerge from the data. This process involves carefully
examining the data, such as interview transcripts, observational notes, or textual
data, to identify similarities, differences, and meaningful patterns within the
dataset.
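One simple way to surface word repetitions across transcripts is a frequency count. The sketch below uses made-up interview snippets and a tiny hand-picked stop-word list; a real analysis would use a proper stop-word list and lemmatization:

```python
from collections import Counter

# hypothetical interview excerpts (invented for illustration)
excerpts = [
    "The menu was confusing and the search was slow",
    "Search results were slow to load",
    "I found the menu confusing at first",
]

# minimal stop-word list for this sketch only
stopwords = {"the", "was", "and", "to", "i", "at", "were"}

word_counts = Counter(
    word
    for text in excerpts
    for word in text.lower().split()
    if word not in stopwords
)

# the most frequent content words hint at recurring themes
print(word_counts.most_common(4))
```

Here "menu", "confusing", "search", and "slow" each appear twice, suggesting navigation and performance as candidate themes worth closer reading.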
Categorizing Data
Transcripts of meetings, interviews, or think-aloud protocols can be
analyzed at a high level of detail, such as identifying stories or themes,
or at a fine level of detail in which each word, phrase, utterance, or
gesture is analyzed. Either way, elements identified in the data are
usually categorized first using a categorization scheme.
Categorizing data involves organizing the information collected during
research into meaningful groups or categories. Researchers may
develop a coding scheme or framework to systematically categorize data
based on themes, concepts, or topics relevant to the research
objectives. This helps to organize the data in a way that facilitates
analysis and interpretation.
Inductive coding
Ground-up coding
Inductive coding is a ground-up approach where you derive your codes from the data.
You don’t start with preconceived notions of what the codes should be, but allow the
narrative or theory to emerge from the raw data itself. This is great for exploratory
research or times when you want to come up with new theories, ideas, or concepts.
Inductive coding involves identifying patterns, themes, or categories in the
data that emerge directly from the data itself, without any predetermined
categories or theoretical framework.
Deductive Coding
• Definition: Deductive coding involves applying pre-existing categories,
concepts, or theories to the data to organize and analyze it
• Top-down coding
• Deductive coding is a top-down approach where you start by developing a
codebook with your initial set of codes.
• This set could be based on your research questions or an existing research
framework or theory.
• You then read through the data and assign excerpts to codes. At the end of
your analysis, your codes should still closely resemble the codebook that you
started off with.
• This is good when you have a pre-determined structure for how you need your
final findings to be.
Deductive Coding
Example
Research Topic: Job Satisfaction Among Employees in the
Hospitality Industry
1. Identifying themes and patterns
The researcher conducts interviews with employees and reads through the transcripts to
identify recurring topics, ideas, or phrases that emerge organically from the data.
• Example:
• Recurring themes such as "Work Environment," "Recognition and Rewards," "Work-
Life Balance," and "Job Autonomy" are identified through careful reading of the
interviews.
2. Categorization: The researcher categorizes the themes identified inductively into broader
categories such as
"Organizational Factors,"
"Individual Factors,"
"Social Factors," and
"Job Characteristics," aligning with existing literature on job satisfaction.
3. Inductive Coding:
• The researcher codes specific segments of the interview transcripts using descriptive
labels that emerge directly from the data, without pre-existing categories or
frameworks.
• Example:
• In inductively coding the transcripts, segments discussing positive interactions with
colleagues are labeled as "Positive Social Dynamics," while segments discussing
feelings of burnout are labeled as "Work Overload."
4. Deductive Coding:
• The researcher applies predefined codes or categories derived from existing theories or
research questions to the data.
• Example:
• Using deductive coding, segments discussing the impact of leadership style on job
satisfaction are labeled with predefined codes such as "Transformational Leadership" or
"Transactional Leadership," aligning with established theories in organizational
behavior.
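A deductive coding pass can be sketched as keyword matching against a predefined codebook. The codes and the transcript segment below are hypothetical, and keyword matching is only a rough aid; in practice coding relies on human judgment, often supported by tools such as NVivo or Atlas.ti:

```python
# predefined codebook: code -> indicator keywords (assumed for this sketch)
codebook = {
    "Work Overload": ["burnout", "overtime", "exhausted"],
    "Positive Social Dynamics": ["colleagues", "team", "supportive"],
    "Recognition and Rewards": ["bonus", "praise", "recognition"],
}

def code_segment(segment: str) -> list[str]:
    """Return every predefined code whose keywords appear in the segment."""
    text = segment.lower()
    return [code for code, keywords in codebook.items()
            if any(kw in text for kw in keywords)]

segment = ("My colleagues are supportive, but constant overtime "
           "leaves me exhausted.")
print(code_segment(segment))
# -> ['Work Overload', 'Positive Social Dynamics']
```

Because the codebook is fixed in advance, the analysis stays aligned with the researcher's framework; an inductive pass would instead derive the labels from the segments themselves.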
Analysis of quantitative data
• Data Preparation: Clean and organize the data.
• Descriptive Statistics: Summarize the data using
measures like mean, median, mode, and standard
deviation.
• Exploratory Data Analysis: Explore patterns and
relationships visually.
• Interpretation: Understand and discuss the
significance of the results.
• Reporting: Present findings clearly using tables, charts,
and graphs.
The benefits of quantitative
vs. qualitative data
• Quantitative data (what):
• Enables you to measure behaviors, opinions, and trends
through close-ended questions
• Provides numerical and statistical data to analyze patterns,
averages, and correlations
• Allows generalization of findings by collecting data from large
sample sizes
• Establishes statistical significance, tracking metrics over time
and benchmarking against goals
• Qualitative data (why):
• Gives context to behaviors, motivations, and attitudes
through open-ended feedback
• Captures more subjective insights like feelings,
opinions, and unique perspectives
• Enables the discovery of more intangibles like company
culture and unmet needs
• Allows new ideas and themes to emerge organically
from participants
Differences between qualitative and
quantitative data
Evaluating Designs
Heuristic Evaluation
Experimental evaluation method
Heuristic Evaluation:-book
• Proposed by Nielsen and Molich.
• usability criteria (heuristics) are identified
• design examined by experts to see if these are
violated
• Example heuristics
– system behaviour is predictable
– system behaviour is consistent
– feedback is provided
• Heuristic evaluation `debugs' design.
Nielsen’s ten heuristics are:
1. Visibility of system status Always keep users informed about what is going on,
through appropriate feedback within reasonable time. For example, if a system
operation will take some time, give an indication of how long and how much is
complete.
2. Match between system and the real world The system should speak the user’s
language, with words, phrases and concepts familiar to the user, rather than
system-oriented terms. Follow real-world conventions, making information
appear in natural and logical order.
3. User control and freedom Users often choose system functions by mistake
and need a clearly marked ‘emergency exit’ to leave the unwanted state without
having to go through an extended dialog. Support undo and redo.
4. Consistency and standards Users should not have to wonder whether words,
situations or actions mean the same thing in different contexts. Follow platform
conventions and accepted standards.
5. Error prevention Make it difficult to make errors. Even better than good error
messages is a careful design that prevents a problem from occurring in the first
place.
6. Recognition rather than recall Make objects, actions and options visible. The
user should not have to remember information from one part of the dialog to
another. Instructions for use of the system should be visible or easily retrievable
whenever appropriate.
7. Flexibility and efficiency of use Allow users to tailor frequent actions.
Accelerators – unseen by the novice user – may often speed up the interaction
for the expert user to such an extent that the system can cater to both
inexperienced and experienced users.
• 8. Aesthetic and minimalist design Dialogs should not contain information that
is irrelevant or rarely needed. Every extra unit of information in a dialog competes
with the relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose and recover from errors Error messages should
be expressed in plain language (no codes), precisely indicate the problem, and
constructively suggest a solution.
10. Help and documentation Few systems can be used with no instructions so it
may be necessary to provide help and documentation. Any such information
should be easy to search, focussed on the user’s task, list concrete steps to be
carried out, and not be too large.
ratings
• 0 = I don’t agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is
available on project
2 = Minor usability problem: fixing this should be given low priority
3 = Major usability problem: important to fix, so should be given high
priority
4 = Usability catastrophe: imperative to fix this before product can be
released (Nielsen)
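Several evaluators typically rate each problem independently; averaging the ratings then gives a fix-priority ordering. A small sketch, with hypothetical problems and ratings:

```python
import statistics

# hypothetical usability problems with severity ratings (0-4)
# from three independent evaluators
ratings = {
    "No feedback during long save operation": [3, 4, 3],
    "Inconsistent button labels across screens": [2, 2, 1],
    "Decorative splash animation delays startup": [1, 0, 1],
}

# rank problems by mean severity, worst first, to set fix priority
by_severity = sorted(ratings.items(),
                     key=lambda item: statistics.mean(item[1]),
                     reverse=True)

for problem, scores in by_severity:
    print(f"{statistics.mean(scores):.2f}  {problem}")
```

The save-feedback problem averages above 3 (major), so on Nielsen's scale it would be fixed first, while the splash animation (below 1, cosmetic) can wait.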
Application of Nielsen’s 10
Heuristics on a portal (for
reference)
• List down Principles
• List down ratings (0-4)
• Visibility of system status (SVKM portal):
• Selection of a course from the drop-down menu – rating 0,
because all the courses in which I was added as a
faculty member are visible to me.
• Match between system and the real world – rating 4,
because a voice recognition option is not available on the
portal, and different views or language options are not
available either, so this feature needs to be added,
mainly for disabled users.
• Continue this for all 10 heuristic
Evaluating through user
Participation
1.Styles of evaluation
• Lab study
• Field study
Laboratory studies
• Advantages:
– specialist equipment available
– uninterrupted environment
• Disadvantages:
– lack of context
– difficult to observe several users cooperating
• Appropriate
– if the system location is dangerous or impractical, for
constrained single-user systems, or to allow controlled
manipulation of use
Field Studies
• Advantages:
– natural environment
– context retained (though observation may alter it)
– longitudinal studies possible
• Disadvantages:
– distractions
– noise
• Appropriate
– where context is crucial, or for longitudinal studies
2.Empirical methods:
experimental evaluation
• Participants
• If not real users, participants should be selected on the
basis of age and literacy, which should be close to
those of the intended users
• Sample size: the sample size must be large enough
to be considered representative of the population
Experimental factors
• Participant
– who – representative, sufficient sample
• Variables
– things to modify and measure
• Hypothesis
– what you’d like to show
• Experimental design
– how you are going to do it
• Test
Variables
• independent variable (IV)
characteristic changed to produce different
conditions
e.g. interface style, number of menu items
• dependent variable (DV)
characteristics measured in the experiment
e.g. time taken, number of errors.
Hypothesis
• prediction of outcome
– framed in terms of IV and DV
e.g. “error rate will increase as font size decreases”
• null hypothesis:
– states no difference between conditions
– aim is to disprove this
e.g. null hyp. = “no change with font size”
Experimental design
• within groups design (repeated measures)
– one participant performs experiment under each condition/each UI
– Within-subjects (or repeated-measures) study design: the same person
tests all the conditions (i.e., all the user interfaces).
– transfer of learning possible
– less costly and less likely to suffer from user variation
– fewer users required
– e.g. one user can test all features of the app
• between groups design
– each participant performs under different condition
– Between-subjects (or between-groups) study design: different people test
different conditions, so each person is exposed to only a single user
interface
– no transfer of learning
– more users required
– variation can bias results.
– Multiple users can each test a single condition of the app.
Analysis of data
• Before you start to do any statistics:
– look at data
– save original data
• Choice of statistical technique depends on
– type of data
– information required
• Type of data
– discrete - finite number of values
– a discrete variable can only take a finite number of values or levels,
for example, a screen color that can be red, green or blue
– continuous - any value
– continuous data can be grouped into discrete bands: e.g. divide heights
into short (<5 ft (1.5 m)), medium (5–6 ft (1.5–1.8 m)) and tall
(>6 ft (1.8 m))
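Grouping a continuous variable into discrete bands, as in the height example, can be sketched as:

```python
def height_band(feet: float) -> str:
    """Map a continuous height (in feet) to a discrete band."""
    if feet < 5:
        return "short"
    elif feet <= 6:
        return "medium"
    else:
        return "tall"

heights = [4.8, 5.5, 6.3]
print([height_band(h) for h in heights])  # ['short', 'medium', 'tall']
```

Note that banding discards information: once heights are recorded only as categories, statistics that need the raw values (such as the mean) can no longer be computed.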
Parametric test
• Parametric Tests:
• Assumption: Parametric tests assume that the data
follow a specific probability distribution, usually the
normal distribution.
• Data Type: Parametric tests are appropriate for interval
or ratio data.
• Examples: t-tests, analysis of variance (ANOVA),
Pearson correlation, linear regression.
non-parametric tests.
• Nonparametric Tests:
• Assumption: Nonparametric tests do not assume a specific
probability distribution for the data.
• Data Type: Nonparametric tests are suitable for ordinal or
nominal data, as well as for data that violate parametric
assumptions.
• Examples: Mann-Whitney U test, Wilcoxon signed-rank test,
Kruskal-Wallis test, Spearman correlation.
Test
• Discrete IV with two levels (two
conditions): t-test
• Discrete IV with more than two levels, or more
than one discrete independent variable: ANOVA
(Analysis of Variance)
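The two-condition case can be illustrated by computing an independent-samples t statistic from first principles with the standard library. The task-completion times below are invented for illustration; in practice a library function such as scipy.stats.ttest_ind would also report the p-value:

```python
import math
import statistics

def t_statistic(a: list[float], b: list[float]) -> float:
    """Independent-samples t statistic with pooled variance."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = math.sqrt(pooled * (1 / na + 1 / nb))
    return (statistics.mean(a) - statistics.mean(b)) / se

# hypothetical task-completion times (seconds) under two interface conditions
old_ui = [52.0, 48.0, 55.0, 50.0]
new_ui = [41.0, 44.0, 39.0, 46.0]
print(t_statistic(old_ui, new_ui))
# a t value far from zero suggests a real difference between conditions
```

The resulting t value would then be compared against the t distribution (with na + nb - 2 degrees of freedom) to decide whether to reject the null hypothesis.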
Test example
• Comparing Two Versions of Microsoft Word:
– If you're comparing two different versions of Microsoft Word (e.g., Word 2016 vs. Word
2021), you could use an independent samples t-test.
– You would collect data from users performing specific tasks or answering questions
related to usability, satisfaction, or efficiency in both versions of Word.
– The t-test would help determine if there's a significant difference in user performance or
preferences between the two versions.
• Comparing Multiple Versions or Features within Microsoft Word:
– If you're comparing more than two versions of Microsoft Word or specific features within
Word (e.g., comparing the performance of different editing interfaces), you might use
ANOVA.
– ANOVA would help determine if there are significant differences in user performance or
preferences across the multiple versions or features being compared.
– If ANOVA indicates a significant difference, follow-up (post hoc) comparisons
can be used to identify which specific versions or features differ
significantly from each other.
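The multi-condition case can likewise be sketched by computing a one-way ANOVA F statistic from first principles. The 1–7 satisfaction scores for three versions are invented for illustration; a real analysis would use a library such as scipy.stats.f_oneway and report the p-value:

```python
import statistics

def one_way_anova_f(groups: list[list[float]]) -> float:
    """One-way ANOVA F statistic: between-group vs within-group variance."""
    all_scores = [x for g in groups for x in g]
    grand_mean = statistics.mean(all_scores)
    k, n = len(groups), len(all_scores)
    # variation of group means around the grand mean
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                     for g in groups)
    # variation of scores around their own group mean
    ss_within = sum((x - statistics.mean(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical 1-7 satisfaction scores for three interface versions
scores = [[1, 2, 3], [2, 3, 4], [5, 6, 7]]
print(one_way_anova_f(scores))
# F well above 1 suggests the group means genuinely differ
```

A large F says the differences between version means are big relative to the scatter within each version, which is exactly the evidence ANOVA needs before post hoc tests pinpoint which versions differ.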
Design an experiment to test
whether adding color coding to
an interface will improve
accuracy.
Identify your hypothesis,
participant group, dependent
and independent variables,
experimental
design, task and analysis
approach.
• Hypothesis:
– Our hypothesis is that incorporating color coding into
the interface will enhance accuracy in completing a
specific task.
• Participant Group:
– We will recruit participants from a diverse pool,
including individuals with varying levels of computer
literacy and familiarity with color-coded interfaces.
– Ideally, we should aim for a sample size of at least
30 participants to achieve statistical significance.
• Dependent Variable (DV):
– The dependent variable is accuracy. We will measure how accurately
participants complete a specific task using the interface.
• Independent Variable (IV):
– The independent variable is color coding. We will have two conditions:
• Color-Coded Interface: Participants interact with an interface where relevant
elements (e.g., buttons, icons, text) are color-coded.
• Non-Color-Coded Interface: Participants interact with an interface without any
color coding.
• Experimental Design:
– We will use a between-subjects design (each participant experiences
only one condition).
– Randomly assign participants to either the color-coded or non-color-coded
group.
– Balance group sizes and participant characteristics across the two groups;
order counterbalancing is not needed, since each participant sees only one
condition.
Task:
Participants will perform a specific task using the interface.
For example, they might be asked to find and click on a
specific button or navigate through a menu.
The task should be relevant to the interface’s purpose (e.g.,
completing an online purchase, searching for information).
Analysis Approach:
Compare the accuracy scores between the two groups (color-
coded vs. non-color-coded).
Use inferential statistics (e.g., t-test or ANOVA) to determine
if the difference in accuracy is statistically significant.
If the color-coded group performs significantly better, we can
conclude that color coding improves accuracy.
• Perform an experimental evaluation
to test the design of icons.
• Hypothesis:
– Our hypothesis is that different icon designs (color, shape,
simplicity) will influence user performance in completing a specific
task.
– Specifically, we expect that well-designed icons will lead to faster
and more accurate task completion.
• Participant Group:
– We will select participants from a diverse pool, including
individuals with varying levels of familiarity with digital interfaces.
– Ideally, aim for a sample size of at least 30 participants to achieve
statistical significance.
• Dependent Variable (DV):
– The dependent variable is user performance, which includes
both accuracy (correct task completion) and speed (time taken
to complete the task).
• Independent Variable (IV):
– The independent variable is icon design. We will manipulate this
variable by creating different versions of the same interface with
varying icon designs.
• Color-Coded Icons: Icons with distinct colors.
• Monochromatic Icons: Icons with grayscale or single-color design.
• Simplified Icons: Minimalistic, easy-to-recognize icons.
• Experimental Design:
– We will use a within-subjects design (each participant experiences all conditions).
– Participants will interact with the interface under three different conditions (color-coded,
monochromatic, and simplified icons).
– Counterbalance the order of the three conditions (e.g., using a Latin
square) to control for order effects.
• Task:
– Participants will perform a specific task using the interface. For example, they might be
asked to find and click on a specific icon (e.g., “Save,” “Print,” “Search”).
– The task should be relevant to the interface’s purpose (e.g., document editing software,
e-commerce website).
• TEST :
– Compare user performance (accuracy and speed) across the three icon design
conditions.
– Use a repeated-measures ANOVA, since each participant experiences all
three conditions.

Auto Affiliate AI Earns First Commission in 3 Hours..pdf
Auto Affiliate  AI Earns First Commission in 3 Hours..pdfAuto Affiliate  AI Earns First Commission in 3 Hours..pdf
Auto Affiliate AI Earns First Commission in 3 Hours..pdf
 
Spring into AI presented by Dan Vega 5/14
Spring into AI presented by Dan Vega 5/14Spring into AI presented by Dan Vega 5/14
Spring into AI presented by Dan Vega 5/14
 
GraphSummit Milan - Visione e roadmap del prodotto Neo4j
GraphSummit Milan - Visione e roadmap del prodotto Neo4jGraphSummit Milan - Visione e roadmap del prodotto Neo4j
GraphSummit Milan - Visione e roadmap del prodotto Neo4j
 
COMPUTER AND ITS COMPONENTS PPT.by naitik sharma Class 9th A mittal internati...
COMPUTER AND ITS COMPONENTS PPT.by naitik sharma Class 9th A mittal internati...COMPUTER AND ITS COMPONENTS PPT.by naitik sharma Class 9th A mittal internati...
COMPUTER AND ITS COMPONENTS PPT.by naitik sharma Class 9th A mittal internati...
 
From Knowledge Graphs via Lego Bricks to scientific conversations.pptx
From Knowledge Graphs via Lego Bricks to scientific conversations.pptxFrom Knowledge Graphs via Lego Bricks to scientific conversations.pptx
From Knowledge Graphs via Lego Bricks to scientific conversations.pptx
 
The Strategic Impact of Buying vs Building in Test Automation
The Strategic Impact of Buying vs Building in Test AutomationThe Strategic Impact of Buying vs Building in Test Automation
The Strategic Impact of Buying vs Building in Test Automation
 
From Theory to Practice: Utilizing SpiraPlan's REST API
From Theory to Practice: Utilizing SpiraPlan's REST APIFrom Theory to Practice: Utilizing SpiraPlan's REST API
From Theory to Practice: Utilizing SpiraPlan's REST API
 
BusinessGPT - Security and Governance for Generative AI
BusinessGPT  - Security and Governance for Generative AIBusinessGPT  - Security and Governance for Generative AI
BusinessGPT - Security and Governance for Generative AI
 
The mythical technical debt. (Brooke, please, forgive me)
The mythical technical debt. (Brooke, please, forgive me)The mythical technical debt. (Brooke, please, forgive me)
The mythical technical debt. (Brooke, please, forgive me)
 
The Evolution of Web App Testing_ An Ultimate Guide to Future Trends.pdf
The Evolution of Web App Testing_ An Ultimate Guide to Future Trends.pdfThe Evolution of Web App Testing_ An Ultimate Guide to Future Trends.pdf
The Evolution of Web App Testing_ An Ultimate Guide to Future Trends.pdf
 
Abortion Pill Prices Germiston ](+27832195400*)[ 🏥 Women's Abortion Clinic in...
Abortion Pill Prices Germiston ](+27832195400*)[ 🏥 Women's Abortion Clinic in...Abortion Pill Prices Germiston ](+27832195400*)[ 🏥 Women's Abortion Clinic in...
Abortion Pill Prices Germiston ](+27832195400*)[ 🏥 Women's Abortion Clinic in...
 
Abortion Pill Prices Jane Furse ](+27832195400*)[ 🏥 Women's Abortion Clinic i...
Abortion Pill Prices Jane Furse ](+27832195400*)[ 🏥 Women's Abortion Clinic i...Abortion Pill Prices Jane Furse ](+27832195400*)[ 🏥 Women's Abortion Clinic i...
Abortion Pill Prices Jane Furse ](+27832195400*)[ 🏥 Women's Abortion Clinic i...
 
Workshop - Architecting Innovative Graph Applications- GraphSummit Milan
Workshop -  Architecting Innovative Graph Applications- GraphSummit MilanWorkshop -  Architecting Innovative Graph Applications- GraphSummit Milan
Workshop - Architecting Innovative Graph Applications- GraphSummit Milan
 
Community is Just as Important as Code by Andrea Goulet
Community is Just as Important as Code by Andrea GouletCommunity is Just as Important as Code by Andrea Goulet
Community is Just as Important as Code by Andrea Goulet
 
Weeding your micro service landscape.pdf
Weeding your micro service landscape.pdfWeeding your micro service landscape.pdf
Weeding your micro service landscape.pdf
 
Prompt Engineering - an Art, a Science, or your next Job Title?
Prompt Engineering - an Art, a Science, or your next Job Title?Prompt Engineering - an Art, a Science, or your next Job Title?
Prompt Engineering - an Art, a Science, or your next Job Title?
 
Microsoft365_Dev_Security_2024_05_16.pdf
Microsoft365_Dev_Security_2024_05_16.pdfMicrosoft365_Dev_Security_2024_05_16.pdf
Microsoft365_Dev_Security_2024_05_16.pdf
 
Software Engineering - Introduction + Process Models + Requirements Engineering
Software Engineering - Introduction + Process Models + Requirements EngineeringSoftware Engineering - Introduction + Process Models + Requirements Engineering
Software Engineering - Introduction + Process Models + Requirements Engineering
 
A Deep Dive into Secure Product Development Frameworks.pdf
A Deep Dive into Secure Product Development Frameworks.pdfA Deep Dive into Secure Product Development Frameworks.pdf
A Deep Dive into Secure Product Development Frameworks.pdf
 
Abortion Clinic In Springs ](+27832195400*)[ 🏥 Safe Abortion Pills in Springs...
Abortion Clinic In Springs ](+27832195400*)[ 🏥 Safe Abortion Pills in Springs...Abortion Clinic In Springs ](+27832195400*)[ 🏥 Safe Abortion Pills in Springs...
Abortion Clinic In Springs ](+27832195400*)[ 🏥 Safe Abortion Pills in Springs...
 

e3_chapter__5_evaluation_technics_HCeVpPLCvE.ppt

• 6. Introduction Quantitative
• In short, quantitative user research is research that yields numerical results, while qualitative research results in data that you can't as easily slot into a calculation.
• The type of research you conduct is very much reliant on what your research objectives are and what kind of data will best help you understand your users' needs.
• Quantitative data is data that is in the form of numbers, or that can easily be translated into numbers. For example, the number of years' experience the interviewees have, the number of projects a department handles at a time, or the number of minutes it takes to perform a task.
• 7. Quantitative
• Quantitative user research is the process of collecting and analyzing objective, measurable data from various types of user testing.
• Quantitative data is almost always numerical and focuses on the statistical, mathematical, and computational analysis of data. As the name suggests, quantitative user research aims to produce results that are quantifiable.
• Examples of quantitative data
• Quantitative data answers questions of:
– How many?
– How often?
– How much?
• Quantitative data is also often simple to collect, quicker to analyze, and easier to present in the form of pie charts, bar graphs, etc.
• 8. Quantitative
• For example, in describing a population, an analysis might conclude that the average person is 5 feet 11 inches tall, weighs 180 pounds, and is 45 years old.
• Qualitative analysis focuses on the nature of something and can be represented by themes, patterns, and stories. For example, in describing the same population, a qualitative analysis might conclude that the average person is tall, thin, and middle-aged.
• 9. Statistical analysis of quantitative data
• The three common measures are: mean, median, and mode.
• The mean is the commonly understood interpretation of average: add together all the figures and divide by the number of figures you started with.
• The median is the middle value of the data when the numbers are ranked.
• The mode is the most commonly occurring number.
• For example, in the data set (2, 3, 4, 6, 6, 7, 7, 7, 8), the median is 6 and the mode is 7, while the mean is 50/9 = 5.56. In this case, the difference between the different averages is not that great.
• However, consider the set (2, 2, 2, 2, 450). Now the median is 2, the mode is 2, and the mean is 458/5 = 91.6!
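The averages above can be checked with Python's standard `statistics` module; a minimal sketch reproducing the slide's two example data sets:

```python
import statistics

data = [2, 3, 4, 6, 6, 7, 7, 7, 8]
print(round(statistics.mean(data), 2))  # 50/9 -> 5.56
print(statistics.median(data))          # 6
print(statistics.mode(data))            # 7

# A single outlier drags the mean far from the median and mode.
skewed = [2, 2, 2, 2, 450]
print(statistics.mean(skewed))          # 458/5 -> 91.6
print(statistics.median(skewed))        # 2
print(statistics.mode(skewed))          # 2
```

The second set shows why the median is often preferred for skewed data such as task-completion times.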
• 10. Qualitative user research
• Qualitative user research is the process of collecting and analyzing non-numerical data in the form of opinions, comments, behaviors, feelings, or motivations. Qualitative data aims to give an in-depth look at human behavioral patterns.
• Examples of qualitative data
• Qualitative data cannot be as easily counted and funnelled into a calculation as its quantitative cousin. Where quantitative research often gives an overarching view, qualitative research takes a deeper dive into the why.
• 11.
• Qualitative research often takes the form of user surveys, interviews, observations, heuristic analysis, and focus groups.
• Qualitative data is not expressed in numerical terms. For example, qualitative data includes descriptions, quotes from interviewees, vignettes of activity, and images.
• 12. Qualitative vs Quantitative
• For example, on a questionnaire, questions about the participant's age or the number of software packages they use a day will result in quantitative data, while any comment fields will result in qualitative data.
• In an observation, quantitative data you may record includes the number of people involved in a project, or how many hours a participant spends trying to sort out a problem they encounter,
• while notes about feelings of frustration, or the nature of interactions between team members, are qualitative data.
• 13. Qualitative Data Collection Techniques
• In-depth interviewing
• Focus groups
• Participant observations
• Ethnographic studies
• Projective techniques
– Interviews: one-on-one conversations that go deeper into the topic at hand.
– Case studies: collections of client stories from in-depth interviews.
– Expert opinions: highly researched information from well-informed sources.
– Focus groups: in-person or online conversations with small groups of people to hear their views.
– Open-ended survey questions: a survey response that lets respondents express their thoughts.
– Observational research: observing people using a product or service in daily life.
• 14. Quantitative Data Collection Techniques
• Closed-ended questions
• Multiple choice questions
• e.g. closed-ended questions:
– Did you finish your homework?
– Have you met John before?
– Do you like pizza?
– Did you receive my email?
– Are you feeling better today?
• e.g. MCQ:
– A researcher is conducting a study on consumer behavior in a specific industry. They have gathered qualitative data through interviews with participants. Which approach to coding would the researcher most likely use if they want to allow themes to emerge directly from the data without predetermined categories?
A) Deductive coding
B) Comparative coding
C) Inductive coding
D) Grounded theory coding
• 15. Qualitative analysis process
• Data collection, data analysis, and the development and verification of relationships and conclusions are an interrelated and interactive set of processes.
• Allows the researcher to recognise important themes, patterns and relationships while collecting data.
• Allows you to re-categorise existing data to see whether themes, patterns and relationships exist in the data already collected.
• Identifying the null hypothesis and its validation.
• Inductive and deductive coding.
• 16. Identifying Recurring Patterns or Themes
• Themes:
– Word repetitions
– Compare and contrast
– Searching for missing information
• In qualitative research, researchers often analyze data to identify common patterns, themes, or trends that emerge from the data. This process involves carefully examining the data, such as interview transcripts, observational notes, or textual data, to identify similarities, differences, and meaningful patterns within the dataset.
• 17. Categorizing Data
• Transcripts of meetings, interviews, or think-aloud protocols can be analyzed at a high level of detail, such as identifying stories or themes, or at a fine level of detail in which each word, phrase, utterance, or gesture is analyzed. Either way, elements identified in the data are usually categorized first using a categorization scheme.
• Categorizing data involves organizing the information collected during research into meaningful groups or categories. Researchers may develop a coding scheme or framework to systematically categorize data based on themes, concepts, or topics relevant to the research objectives. This helps to organize the data in a way that facilitates analysis and interpretation.
• 18. Inductive coding
• Ground-up coding: inductive coding is a ground-up approach where you derive your codes from the data. You don't start with preconceived notions of what the codes should be, but allow the narrative or theory to emerge from the raw data itself.
• This is great for exploratory research, or times when you want to come up with new theories, ideas or concepts.
• Inductive coding involves identifying patterns, themes, or categories in the data that emerge directly from the data itself, without any predetermined categories or theoretical framework.
• 19. Deductive Coding
• Definition: deductive coding involves applying pre-existing categories, concepts, or theories to the data to organize and analyze it.
• Top-down coding: deductive coding is a top-down approach where you start by developing a codebook with your initial set of codes.
• This set could be based on your research questions or an existing research framework or theory.
• You then read through the data and assign excerpts to codes. At the end of your analysis, your codes should still closely resemble the codebook that you started off with.
• This is good when you have a pre-determined structure for how you need your final findings to be.
• 21. Example
• Research topic: job satisfaction among employees in the hospitality industry.
• 1. Identifying themes and patterns: the researcher conducts interviews with employees and reads through the transcripts to identify recurring topics, ideas, or phrases that emerge organically from the data.
– Example: recurring themes such as "Work Environment," "Recognition and Rewards," "Work-Life Balance," and "Job Autonomy" are identified through careful reading of the interviews.
• 2. Categorization: the researcher categorizes the themes identified inductively into broader categories such as "Organizational Factors," "Individual Factors," "Social Factors," and "Job Characteristics," aligning with existing literature on job satisfaction.
• 22.
• 3. Inductive coding: the researcher codes specific segments of the interview transcripts using descriptive labels that emerge directly from the data, without pre-existing categories or frameworks.
– Example: in inductively coding the transcripts, segments discussing positive interactions with colleagues are labeled as "Positive Social Dynamics," while segments discussing feelings of burnout are labeled as "Work Overload."
• 4. Deductive coding: the researcher applies predefined codes or categories derived from existing theories or research questions to the data.
– Example: using deductive coding, segments discussing the impact of leadership style on job satisfaction are labeled with predefined codes such as "Transformational Leadership" or "Transactional Leadership," aligning with established theories in organizational behavior.
• 23. Analysis of quantitative data
• Data preparation: clean and organize the data.
• Descriptive statistics: summarize the data using measures like mean, median, mode and standard deviation.
• Exploratory data analysis: explore patterns and relationships visually.
• Interpretation: understand and discuss the significance of the results.
• Reporting: present findings clearly using tables, charts, and graphs.
• 24. The benefits of quantitative vs. qualitative data
• Quantitative data (what):
– Enables you to measure behaviors, opinions, and trends through close-ended questions
– Provides numerical and statistical data to analyze patterns, averages, and correlations
– Allows generalization of findings by collecting data from large sample sizes
– Establishes statistical significance, tracking metrics over time and benchmarking against goals
• 25.
• Qualitative data (why):
– Gives context to behaviors, motivations, and attitudes through open-ended feedback
– Captures more subjective insights like feelings, opinions, and unique perspectives
– Enables the discovery of intangibles like company culture and unmet needs
– Allows new ideas and themes to emerge organically from participants
• 28. Heuristic Evaluation
• Proposed by Nielsen and Molich.
• Usability criteria (heuristics) are identified.
• The design is examined by experts to see if these are violated.
• Example heuristics:
– system behaviour is predictable
– system behaviour is consistent
– feedback is provided
• Heuristic evaluation 'debugs' design.
• 29. Nielsen's ten heuristics are:
• 1. Visibility of system status: always keep users informed about what is going on, through appropriate feedback within reasonable time. For example, if a system operation will take some time, give an indication of how long and how much is complete.
• 2. Match between system and the real world: the system should speak the user's language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in natural and logical order.
• 3. User control and freedom: users often choose system functions by mistake and need a clearly marked 'emergency exit' to leave the unwanted state without having to go through an extended dialog. Support undo and redo.
• 4. Consistency and standards: users should not have to wonder whether words, situations or actions mean the same thing in different contexts. Follow platform conventions and accepted standards.
• 5. Error prevention: make it difficult to make errors. Even better than good error messages is a careful design that prevents a problem from occurring in the first place.
• 6. Recognition rather than recall: make objects, actions and options visible. The user should not have to remember information from one part of the dialog to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
• 7. Flexibility and efficiency of use: allow users to tailor frequent actions. Accelerators – unseen by the novice user – may often speed up the interaction for the expert user to such an extent that the system can cater to both inexperienced and experienced users.
• 30.
• 8. Aesthetic and minimalist design: dialogs should not contain information that is irrelevant or rarely needed. Every extra unit of information in a dialog competes with the relevant units of information and diminishes their relative visibility.
• 9. Help users recognize, diagnose and recover from errors: error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
• 10. Help and documentation: few systems can be used with no instructions, so it may be necessary to provide help and documentation. Any such information should be easy to search, focussed on the user's task, list concrete steps to be carried out, and not be too large.
• 31. Ratings (Nielsen)
• 0 = I don't agree that this is a usability problem at all
• 1 = Cosmetic problem only: need not be fixed unless extra time is available on project
• 2 = Minor usability problem: fixing this should be given low priority
• 3 = Major usability problem: important to fix, so should be given high priority
• 4 = Usability catastrophe: imperative to fix this before product can be released
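A small sketch of how findings from a heuristic evaluation might be triaged by these severity ratings; the example problems are invented for illustration.

```python
# Toy triage of heuristic-evaluation findings by Nielsen severity (0-4).
# The findings themselves are hypothetical examples.
findings = [
    ("Low-contrast footer text", 1),
    ("No undo after deleting a document", 4),
    ("Inconsistent button labels across screens", 2),
]

# Fix the most severe problems first.
for problem, severity in sorted(findings, key=lambda f: f[1], reverse=True):
    print(f"severity {severity}: {problem}")
```

Sorting by severity makes the release-blocking "catastrophe" items (rating 4) surface at the top of the report.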
• 32. Application of Nielsen's 10 heuristics on a portal, for reference
• List down the principles.
• List down the ratings (0–4).
• Visibility of system status (SVKM portal): selection of course from the drop-down menu – rating 0, because all the courses in which I was added as a faculty are visible to me.
• Match between system and the real world – rating 4, because a voice recognition option is not available on the portal, and different views or language options are also not available, so it is required to add this feature, mainly for disabled users.
• Continue this for all 10 heuristics.
• 34. 1. Styles of evaluation
• Lab study
• Field study
• 35. Laboratory studies
• Advantages:
– specialist equipment available
– uninterrupted environment
• Disadvantages:
– lack of context
– difficult to observe several users cooperating
• Appropriate:
– if system location is dangerous or impractical
– for constrained single-user systems
– to allow controlled manipulation of use
• 36. Field Studies
• Advantages:
– natural environment
– context retained (though observation may alter it)
– longitudinal studies possible
• Disadvantages:
– distractions
– noise
• Appropriate:
– where context is crucial
– for longitudinal studies
• 37. 2. Empirical methods: experimental evaluation
• Participants: if not real users, participants should be selected to match the real users on characteristics such as age and literacy.
• Sample size: the sample size must be large enough to be considered representative of the population.
• 38. Experimental factors
• Participants – who: representative, sufficient sample
• Variables – things to modify and measure
• Hypothesis – what you'd like to show
• Experimental design – how you are going to do it
• Test
• 39. Variables
• Independent variable (IV): characteristic changed to produce different conditions, e.g. interface style, number of menu items.
• Dependent variable (DV): characteristics measured in the experiment, e.g. time taken, number of errors.
• 40. Hypothesis
• Prediction of outcome, framed in terms of IV and DV, e.g. "error rate will increase as font size decreases".
• Null hypothesis: states no difference between conditions; the aim is to disprove this, e.g. null hyp. = "no change with font size".
• 41. Experimental design
• Within-groups design (repeated measures):
– one participant performs the experiment under each condition/each UI
– within-subjects (or repeated-measures) study design: the same person tests all the conditions (i.e., all the user interfaces)
– transfer of learning possible
– less costly and less likely to suffer from user variation
– fewer users needed
– e.g. one user can test all features of the app
• Between-groups design:
– each participant performs under a different condition
– between-subjects (or between-groups) study design: different people test different conditions, so that each person is only exposed to a single user interface
– no transfer of learning
– more users required
– variation can bias results
– e.g. multiple users each test a single feature of the app
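A minimal sketch of how the two designs differ in assigning participants to conditions; the participant IDs and condition names are placeholders, and the round-robin assignment stands in for the randomization a real study would use.

```python
# Hypothetical participants and UI conditions.
participants = ["P1", "P2", "P3", "P4", "P5", "P6"]
conditions = ["UI-A", "UI-B"]

# Between-groups: each participant sees exactly one condition
# (round-robin here; real studies randomize the assignment).
between = {p: conditions[i % len(conditions)]
           for i, p in enumerate(participants)}

# Within-groups: every participant sees every condition, with the
# presentation order counterbalanced to control for learning effects.
within = {p: conditions if i % 2 == 0 else conditions[::-1]
          for i, p in enumerate(participants)}

print(between["P1"], between["P2"])  # UI-A UI-B
print(within["P2"])                  # ['UI-B', 'UI-A']
```

Note the trade-off from the slide in miniature: `between` needs all six people to cover two conditions once each, while `within` gets both conditions from every person but must manage order effects.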
• 42. Analysis of data
• Before you start to do any statistics:
– look at the data
– save the original data
• Choice of statistical technique depends on:
– type of data
– information required
• Type of data:
– discrete – a discrete variable can only take a finite number of values or levels, for example, a screen color that can be red, green or blue
– continuous – can take any value, though it may be binned into discrete levels, e.g. divide heights into short (<5 ft (1.5 m)), medium (5–6 ft (1.5–1.8 m)) and tall (>6 ft (1.8 m))
• 43. Parametric tests
• Assumption: parametric tests assume that the data follow a specific probability distribution, usually the normal distribution.
• Data type: parametric tests are appropriate for interval or ratio data.
• Examples: t-tests, analysis of variance (ANOVA), Pearson correlation, linear regression.
• 44. Non-parametric tests
• Assumption: nonparametric tests do not assume a specific probability distribution for the data.
• Data type: nonparametric tests are suitable for ordinal or nominal data, as well as for data that violate parametric assumptions.
• Examples: Mann-Whitney U test, Wilcoxon signed-rank test, Kruskal-Wallis test, Spearman correlation.
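To illustrate the rank-based idea behind nonparametric tests, here is a minimal Mann-Whitney U statistic for two small samples. This toy version assumes no tied values and stops at the U statistic; a real analysis would use a library such as SciPy, which also handles ties and reports a p-value.

```python
def mann_whitney_u(a, b):
    """U statistic for two samples with no tied values (toy version)."""
    pooled = sorted(a + b)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # rank 1 = smallest
    r1 = sum(rank[v] for v in a)                     # rank sum of sample a
    n1, n2 = len(a), len(b)
    u1 = n1 * n2 + n1 * (n1 + 1) / 2 - r1
    return min(u1, n1 * n2 - u1)

# Completely separated samples give the extreme value U = 0.
print(mann_whitney_u([1, 2, 3], [4, 5, 6]))  # 0.0
# Interleaved samples give a larger U, closer to n1*n2/2.
print(mann_whitney_u([1, 3, 5], [2, 4, 6]))  # 3.0
```

Because only ranks enter the calculation, the same U would result from wildly skewed raw values, which is exactly why the test needs no distributional assumption.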
• 45. Test
• Discrete IV with two levels (two groups to compare): t-test.
• Discrete IV with more than two levels (three or more groups): ANOVA (analysis of variance).
• 46. Test example
• Comparing two versions of Microsoft Word:
– If you're comparing two different versions of Microsoft Word (e.g., Word 2016 vs. Word 2021), you could use an independent samples t-test.
– You would collect data from users performing specific tasks or answering questions related to usability, satisfaction, or efficiency in both versions of Word.
– The t-test would help determine if there's a significant difference in user performance or preferences between the two versions.
• Comparing multiple versions or features within Microsoft Word:
– If you're comparing more than two versions of Microsoft Word or specific features within Word (e.g., comparing the performance of different editing interfaces), you might use ANOVA.
– ANOVA would help determine if there are significant differences in user performance or preferences across the multiple versions or features being compared.
– If ANOVA indicates a significant difference, a post-hoc test can be used to identify which specific versions or features differ significantly from each other.
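The two-group case can be illustrated by computing an independent-samples t statistic by hand with the standard `statistics` module. The task-time data are invented for the example, and a real analysis would use a statistics library to obtain the p-value as well.

```python
import statistics

def pooled_t(a, b):
    """Independent-samples t statistic with a pooled variance estimate."""
    n1, n2 = len(a), len(b)
    m1, m2 = statistics.mean(a), statistics.mean(b)
    # Pooled sample variance across the two groups.
    sp2 = ((n1 - 1) * statistics.variance(a) +
           (n2 - 1) * statistics.variance(b)) / (n1 + n2 - 2)
    return (m1 - m2) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5

# Hypothetical task-completion times (seconds) in two Word versions.
word_2021 = [12, 11, 14, 10, 13]
word_2016 = [18, 17, 19, 16, 20]
print(pooled_t(word_2021, word_2016))  # -6.0
```

The large negative t reflects that the first group finished the task much faster; its significance would be judged against the t distribution with n1 + n2 − 2 degrees of freedom.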
• 47. Design an experiment to test whether adding color coding to an interface will improve accuracy. Identify your hypothesis, participant group, dependent and independent variables, experimental design, task and analysis approach.
• 48.
• Hypothesis: our hypothesis is that incorporating color coding into the interface will enhance accuracy in completing a specific task.
• Participant group:
– We will recruit participants from a diverse pool, including individuals with varying levels of computer literacy and familiarity with color-coded interfaces.
– Ideally, we should aim for a sample size of at least 30 participants to achieve statistical significance.
• 49.
• Dependent variable (DV): the dependent variable is accuracy. We will measure how accurately participants complete a specific task using the interface.
• Independent variable (IV): the independent variable is color coding. We will have two conditions:
– Color-coded interface: participants interact with an interface where relevant elements (e.g., buttons, icons, text) are color-coded.
– Non-color-coded interface: participants interact with an interface without any color coding.
• Experimental design:
– We will use a between-subjects design (each participant experiences only one condition).
– Randomly assign participants to either the color-coded or non-color-coded group.
– Balance the two groups in size and participant characteristics to control for individual differences.
• 50.
• Task: participants will perform a specific task using the interface. For example, they might be asked to find and click on a specific button or navigate through a menu. The task should be relevant to the interface's purpose (e.g., completing an online purchase, searching for information).
• Analysis approach: compare the accuracy scores between the two groups (color-coded vs. non-color-coded). Use inferential statistics (e.g., a t-test) to determine if the difference in accuracy is statistically significant. If the color-coded group performs significantly better, we can conclude that color coding improves accuracy.
• 51. Perform experimental evaluation analysis to test the design of icons.
• 52.
• Hypothesis:
– Our hypothesis is that different icon designs (color, shape, simplicity) will influence user performance in completing a specific task.
– Specifically, we expect that well-designed icons will lead to faster and more accurate task completion.
• Participant group:
– We will select participants from a diverse pool, including individuals with varying levels of familiarity with digital interfaces.
– Ideally, aim for a sample size of at least 30 participants to achieve statistical significance.
• 53.
• Dependent variable (DV): the dependent variable is user performance, which includes both accuracy (correct task completion) and speed (time taken to complete the task).
• Independent variable (IV): the independent variable is icon design. We will manipulate this variable by creating different versions of the same interface with varying icon designs:
– Color-coded icons: icons with distinct colors.
– Monochromatic icons: icons with grayscale or single-color design.
– Simplified icons: minimalistic, easy-to-recognize icons.
• 54.
• Experimental design:
– We will use a within-subjects design (each participant experiences all conditions).
– Participants will interact with the interface under three different conditions (color-coded, monochromatic, and simplified icons).
– Counterbalance the order of the three conditions to control for order effects.
• Task:
– Participants will perform a specific task using the interface. For example, they might be asked to find and click on a specific icon (e.g., "Save," "Print," "Search").
– The task should be relevant to the interface's purpose (e.g., document editing software, e-commerce website).
• Test:
– Compare user performance (accuracy and speed) across the three icon design conditions.
– ANOVA test.
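The three-condition comparison above calls for a one-way ANOVA. As a sketch, the F statistic can be computed from between-group and within-group variance; the timing data are invented, and a real analysis would also look up the p-value (and a post-hoc test if F is significant).

```python
def one_way_anova_f(*groups):
    """F = between-group mean square / within-group mean square."""
    k = len(groups)                          # number of conditions
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n  # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical task times (seconds) under three icon designs.
color_coded = [1, 2, 3]
monochrome = [2, 3, 4]
simplified = [3, 4, 5]
print(one_way_anova_f(color_coded, monochrome, simplified))  # 3.0
```

An F near 1 would mean the condition means differ no more than chance variation within conditions; larger F values are evidence that icon design affects performance.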