These slides are for the following session presented at the UX STRAT Online 2021 Conference:
"How to Measure Design Quality"
Jessa Parette
Capital One: Head of Design - Strategy, Research & Systems
3. What we will learn

1. We can measure design quality in objective, quantifiable ways (but it’s hard)
2. Design leaders have a responsibility to influence the data being built in the product lifecycle
3. Data and design have an undeniable future together, and automation of measuring quality needs to be part of it
JESSA PARETTE | UX STRATEGY 2021
4. Why this feels important

MEASURING QUALITY IS HARD: Measurement is incredibly hard, and will demand new types of talent joining the ranks of design teams.
TECHNOLOGY IS ONLY GOING TO GET MORE COMPLEX: How we connect systems, including design systems and tools, becomes a vital and complex part of quality.
DESIGN DIGITIZES FASTER THAN WE CAN MEASURE: If we ignore the need to automate measurement of design quality at scale, design becomes robotic.
5. Key takeaways in this talk

1. We often overlook objectively measuring quality as we develop products, but opportunity exists
2. We can apply a formulaic framework to help identify low-hanging fruit
3. Influencing the development process with the right type of experience design data gets easier when design leaders learn new skills
9. This is not about what you already do

Specifically crafting a view that will tell us about the design and experience quality as a byproduct of a user’s interaction with the system as a whole
15. SO WHAT IS THE PROBLEM?

Even if there are a few, there is usually no systematic place to track them.
The cost: device telemetry & instrumentation lack design-data standards as part of a requirements package
22. Even objective measures of usability face challenges

1. There are few guidelines or standardized models on the definitions and usage of usability metrics that developers can apply consistently
2. Most frameworks are missing design-centric data requirements and are not well integrated into current development practices
3. We (probably) didn’t become designers because we loved math, data and statistics
23. DESIGN IS EVOLVING

Once upon a time… there was only graphic design and a technical architect
24. BE PART OF THE EVOLUTION

Evolution is easier than revolution
A designer’s role has grown to include responsibility for knowing more than just design
26. WHAT I HOPE INSPIRES

What if we could express, in numbers, the impact of design?
27. DESIGN QUALITY IS NOT ONE THING

There are subjective and objective means of managing the user experience
We can separate types of quality measurement in design by looking at the categorical difference between subjective & objective measures

OBJECTIVE: USABILITY (today’s focus)
SUBJECTIVE: USER EXPERIENCE
28. CATEGORIZE THE FACTORS OF DESIGN QUALITY
Right now, there are generally 10 groups

First, determine the design factors that make up quality:
EFFICIENT
EFFECTIVE
SATISFYING
PRODUCTIVE
LEARNABLE
SAFE
TRUSTWORTHY
ACCESSIBLE
UNIVERSAL
USEFUL
29. BREAK DOWN THE DESIGN FACTORS
Identify the criteria that encompass each design quality factor

If you want something to be: EFFICIENT
You measure its encompassing criteria:
TIME
FEEDBACK
MINIMAL MEMORY
NAVIGATION
MINIMAL ACTION
RESOURCE USE
OPERABLE
UNDERSTANDABLE
30. BREAK DOWN THE DESIGN FACTORS
Identify the criteria that encompass each design quality factor

If you want something to be: LEARNABLE
You measure its encompassing criteria:
USER GUIDANCE
FAMILIAR
MINIMAL MEMORY
SELF-DESCRIBING
MINIMAL ACTION
SIMPLICITY
CONSISTENT
UNDERSTANDABLE
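The factor-to-criteria breakdowns above can be held as a simple lookup table. This is an illustrative Python sketch, not a prescribed schema; the factor and criteria names come from the slides, while the structure and function are assumptions:

```python
# Illustrative sketch: the factor-to-criteria mapping from the slides,
# held as a plain lookup table. The structure is an assumption.
DESIGN_FACTOR_CRITERIA = {
    "EFFICIENT": [
        "TIME", "FEEDBACK", "MINIMAL MEMORY", "NAVIGATION",
        "MINIMAL ACTION", "RESOURCE USE", "OPERABLE", "UNDERSTANDABLE",
    ],
    "LEARNABLE": [
        "USER GUIDANCE", "FAMILIAR", "MINIMAL MEMORY", "SELF-DESCRIBING",
        "MINIMAL ACTION", "SIMPLICITY", "CONSISTENT", "UNDERSTANDABLE",
    ],
}

def criteria_for(factor: str) -> list[str]:
    """Return the measurable criteria that encompass a design quality factor."""
    return DESIGN_FACTOR_CRITERIA.get(factor.upper(), [])

print(criteria_for("efficient"))
```

A table like this is what makes the framework formulaic: each factor resolves to criteria, and (as the next slides show) each criterion resolves to countable metrics.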
31. BREAK DOWN THE FACTORS
Identify the criteria that encompass each factor

If you want something to be: LEARNABLE
You measure its encompassing criteria:
USER GUIDANCE

• Do our field forms have proper indication?
• How many actions can be canceled by the user after they have started?
• What is the rate of error for our most popular task flows?
32. Definition of usability attributes is key

TIME: Time spent performing some usable task
UNDERSTANDABLE: The capability of the software or product to convey its purpose and give clear user assistance in its operation
FEEDBACK: The capability of the system, product or service to respond to user actions
33. Definition of usability attributes is key

UNDERSTANDABLE: The capability of the software or product to convey its purpose and give clear user assistance in its operation

This is different from the user self-reporting whether they understand.
Do our components have the right microcopy?
Can we measure how many times a user clicks on a help link?
These are back-end indicators or benchmarks that signal a problem.
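A back-end indicator like "how many times does a user click a help link" can be counted straight from interaction logs. A minimal sketch, assuming hypothetical event records with `session_id` and `target` fields (the field names are made up for illustration):

```python
from collections import Counter

# Hypothetical interaction-log records; the field names are assumptions.
events = [
    {"session_id": "s1", "target": "help-link"},
    {"session_id": "s1", "target": "submit"},
    {"session_id": "s1", "target": "help-link"},
    {"session_id": "s2", "target": "help-link"},
]

# Help-link clicks per session: a spike here is a benchmark that
# signals the interface may not be self-describing.
help_clicks = Counter(
    e["session_id"] for e in events if e["target"] == "help-link"
)
print(dict(help_clicks))  # {'s1': 2, 's2': 1}
```

The count itself says nothing about *why* users reach for help, but tracked over time it flags where understandability is degrading without any self-reporting.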
35. HERE IS WHERE IT GETS FUN

Things like time, productivity & learning are all made up of something, and that something can be quantified (to an extent)
36. FIND THE NUMERIC VALUE FOR YOUR DESIGN CRITERIA
Because it exists somewhere (probably)

If you are measuring: MINIMAL ACTION (part of measuring ‘efficiency’)
It is made up of measurements like:
• Task time
• Number of commands
• Task concordance
• Completion rate
• Layout appropriateness
37. FIND THE NUMERIC VALUE
Because it exists somewhere (probably)

If you are measuring: TIME (part of measuring ‘efficiency’)
It is made up of measurements like:
• Time on task
• Completion rate
• Throughput
• Response time
• Time-based efficiency
38. IDENTIFY THE METRICS FOR CRITERIA
Breaking them into numeric values

If you are measuring: TIME
Definitions of the metrics are:

Throughput: How many tasks can be successfully performed over a given period of time?
Throughput (mean throughput): What is the average number of concurrent tasks the system can handle over a set unit of time?
Throughput (worst-case throughput ratio): What is the absolute limit on the system in terms of the number and handling of concurrent tasks as throughput?
Response time: What is the time taken to complete a specified task?
Time-based efficiency: The time taken (in seconds or minutes) for users to complete a task
Overall relative efficiency: The ratio of the time taken by the users who successfully completed the task in relation to the total time taken by all users
Time on task: How much time does it take a user to complete a task?
39. METRICS CAN BE DESCRIBED AS FORMULAS OR COUNTABLE DATA
The output is a numeric value that summarizes the status

Throughput: How many tasks can be successfully performed over a given period of time?
Formula: X = A / T (A = number of completed events/tasks; T = observation time period)

Throughput (mean throughput): What is the average number of concurrent tasks the system can handle over a set unit of time?
Formula: X = Xmean / Rmean, where Xmean = Σ(Xi) / N, Rmean = required mean throughput, and Xi = Ai / Ti (Ai = number of concurrent tasks observed over the set period of time for the i-th evaluation; Ti = set period of time for the i-th evaluation; N = number of evaluations)

Throughput (worst-case throughput ratio): What is the absolute limit on the system in terms of the number and handling of concurrent tasks as throughput?
Formula: X = Xmax / Rmax, where Xmax = MAX(Xi) for i = 1 to N (the maximum among evaluations), Rmax = required maximum throughput, and Xi = Ai / Ti as above

Response time: What is the time taken to complete a specified task?
Formula: T = (time of gaining the result) - (time of command entry finished)

Time-based efficiency: The time taken (in seconds or minutes) for users to complete a task

Overall relative efficiency: The ratio of the time taken by the users who successfully completed the task in relation to the total time taken by all users

Time on task: How much time does it take a user to complete a task?
Formula: total time between task start and task completion
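These formulas translate directly into countable code. A sketch of two of them in Python, with names following the slide and made-up sample numbers for illustration:

```python
def throughput(completed_tasks: float, observation_time: float) -> float:
    """X = A / T: tasks successfully completed per unit of observation time."""
    return completed_tasks / observation_time

def overall_relative_efficiency(successful_times: list[float],
                                all_times: list[float]) -> float:
    """Time spent by users who completed the task, over time spent by all users."""
    return sum(successful_times) / sum(all_times)

# 12 tasks completed in a 4-minute observation window -> 3.0 tasks per minute
print(throughput(12, 4.0))
# Successful users took 30s and 45s; all users together took 100s -> 0.75
print(overall_relative_efficiency([30, 45], [30, 45, 25]))
```

The point is not the arithmetic; it is that once the definitions are pinned down, each metric becomes a small, automatable function over product telemetry.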
41. WE LOOK FOR EXISTING PRODUCT DATA
(Analysts are your BFF)

Design can compose new views of the user experience by rearranging existing product interaction data
(If it exists. And that’s a big if.)
42. WE LOOK FOR EXISTING PRODUCT DATA
(Analysts are your BFF)

BASIC LIST OF PRODUCT ANALYTICS
43. IT’S KIND OF LIKE ARCHEOLOGY
(Without the epic John Williams soundtrack)

To measure efficiency, we need:
- ServerTime
- SessionID
- UserID
- InteractionID
- PageLoadTime

But here is what we have:
- ServerTime
- SessionID
- ———-
- InteractionID
- PageLoadTime

We need to code for:
- ServerTime
- SessionID
- UserID
- InteractionID
- PageLoadTime

You go on a journey to see what you need
Use data to map where your design measures are
Fill in the missing pieces
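Mechanically, the archeology step is a set difference: the fields a design measure needs, minus the fields the product already captures. A sketch using the field names from the slide (the slide’s lists line up so that UserID is the gap in its example):

```python
# Map where your design measures are: diff the telemetry fields a
# measure needs against what the product already captures.
# Field names come from the slide.
needed = {"ServerTime", "SessionID", "UserID", "InteractionID", "PageLoadTime"}
available = {"ServerTime", "SessionID", "InteractionID", "PageLoadTime"}

missing = needed - available  # what we still need to code for
print(sorted(missing))  # ['UserID']
```

Running a diff like this per design criterion gives you a concrete instrumentation backlog to hand to engineering, instead of a vague request for "more design data."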
44. THIS IS NOT NEW

Even in design, this approach is part of our DNA
45. WE DO THIS ALL THE TIME

Usually in interface definition, and saying ‘atoms’ instead of ‘metrics’