Measuring & Evaluating
Your DesignOps Practice
Dave Malouf (@daveixd) 

#designOps #amplifyDesign
10 April, 2019
The Overture
Why does DesignOps exist?
To amplify
the value of
designing?
Better designs come from
better designing
What is Design
Operations
#DesignOps
… the tooling,
grease, and rails that
amplify the value of
a design team.
• Mission
• Vision
• Values
• Principles
• Processes
• Methods
• Craft
• Workflow
• People
• Governance
• Tools & Infrastructure
Goals of Design Operations:
1. Set up your team for success
2. Increase the value of your
organization’s investment in design
Why DesignOps (now)?
SCALE
What is scaling?
• Impact to org & customers
• Team Sizes
• Number of Teams
• Number of Systems
• Types and number of integrations
The largest obstacle to
design success is the
misalignment of the
value proposition that
design itself provides to
an organization.
Value …
… answers, “why should I come to you?”
… justifies investment through perceived return.
… suggests what should be measured to
understand return on that investment.
But how do we value
DESIGNING?
A proposed value of design
• Driving Understanding & Empathy
• Creating Clarity & Behavioral Fit
• Exploration
• Envisioning
Some skills to create that value
• Storytelling
• Visual Thinking
• Information Presentation
• Workshop Facilitation
• Prototyping/Simulations
Disciplines and their value
Research: move past surface symptoms towards framing problems and needs.
Facilitation: align understanding utilizing tools, frameworks, and visualizations.
Interaction Design: convert understood problems and needs into flows and
interactivity models that not just meet needs but fit behaviors.
Information Architecture: Brings clarity by converting data sets into
information spaces that help people gain insights, navigate smoothly, make
better decisions.
Visual Design: Creates handles, buttons, and information visualizations that
allow people to understand possibilities, make better decisions, act within a
system with confidence, and know, just as confidently, how the system will react.
Pieces of
DesignOps
•Human Resources
•Community
•Communications
•Tools
•Infrastructure
•Workflow / Design API
•Governance
Measuring DesignOps &
Measuring DesignOps Success
Different types of metrics
Types
•Quantitative data
•Qualitative data, quantified.
Collection Methods
•Self-reported
•Gathered through automated instrumentation
Duration Variable
• Moment in time
• Comparative
• Trend
Causation
vs.
Correlation
Cascading use of data
Metric
What is available to measure?
Correlation
If we compare the original metric to another metric can
that help understand the original hypothesis?
Interpretation
What does the correlation tell us?

Which direction produces the desired effect?
Desired Threshold
What measure of the metric will tell us we reached an
otherwise qualitative goal?
Trend
How do the prime metric and correlated metric compare over time? 

How strong of a correlation in the trend would be significant? (differential)
Milestone
What can we map against a timeline to help us understand and interpret
possible moments of cause and effect? (such as releases, ship dates)
Baseline
The value of a metric or the combination of metrics at
beginning of any initiative.
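The cascade above can be sketched in code: take a prime metric and a candidate correlate, record the baseline, compute a correlation, and read the trend. This is a minimal illustration, not a prescribed implementation; the metric names and weekly values are hypothetical.

```python
# Minimal sketch of the cascading metric workflow: baseline, correlation,
# and trend for a prime metric vs. a correlated metric. Sample data is invented.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

def trend(series):
    """Average week-over-week change; positive means the metric is rising."""
    return mean(b - a for a, b in zip(series, series[1:]))

# Hypothetical weekly series: hours spent designing vs. a usability score.
time_designing = [12, 14, 15, 18, 20, 22]
usability_score = [61, 63, 66, 70, 72, 75]

baseline = time_designing[0]  # value at the start of the initiative
r = pearson(time_designing, usability_score)
print(f"baseline={baseline}, correlation={r:.2f}, trend={trend(time_designing):.1f}")
```

A strong correlation here would not prove causation (see above); mapping milestones such as ship dates against the same timeline is what helps separate the two.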
Setting up your measurement*
*partially adapted from a case study by Intuit
Data Type: Quantitative

Time Designing
Collected by: Self-reporting

Using a tool such as Harvest
Desired Outcome

Increase design quality &/or
designer engagement
Hypothesis:

Increasing time designing will
increase design quality &/or
designer engagement
Measuring Desired Outcomes:

NPS, Heuristics, Usability
Testing, Customer
Satisfaction, etc.
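One way to make this measurement setup concrete is to write each metric down as an explicit record before collection starts, so the hypothesis and outcome measures are agreed in advance. The field values mirror the Intuit-inspired example above; the class itself is a hypothetical convenience, not an existing tool's API.

```python
# A measurement plan as an explicit record: metric, data type, collection
# method, hypothesis, and the outcome measures that will test the hypothesis.
from dataclasses import dataclass, field

@dataclass
class MeasurementPlan:
    metric: str                 # what is being measured
    data_type: str              # "quantitative" or "qualitative, quantified"
    collected_by: str           # self-reporting vs. automated instrumentation
    hypothesis: str             # the expected cause-and-effect statement
    outcome_measures: list = field(default_factory=list)

plan = MeasurementPlan(
    metric="Time designing",
    data_type="quantitative",
    collected_by="self-reporting (e.g. a time-tracking tool such as Harvest)",
    hypothesis=("Increasing time designing will increase design quality "
                "and/or designer engagement"),
    outcome_measures=["NPS", "heuristic review", "usability testing",
                      "customer satisfaction"],
)
print(plan.metric, "->", ", ".join(plan.outcome_measures))
```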
How can you measure quality?
Is the product organization aligned in their understanding of the
value of your design(ing) to the business & their customers?
1. There is no alignment across the product organization.
2. There have been gains in alignment seen by open trials of design and
research activities and processes.
3. Alignment is growing, as seen by more non-designers participating in design
activities.
4. Design value is well understood and consistently articulated across the
product organization.
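A rubric like the one above becomes measurable once several stakeholders each pick a level and the average is tracked over time. The level descriptions come from the slide; the survey mechanics are a hypothetical sketch.

```python
# The 4-level alignment rubric, made operational as a simple stakeholder survey.
ALIGNMENT_RUBRIC = {
    1: "No alignment across the product organization.",
    2: "Gains in alignment via open trials of design/research activities.",
    3: "Alignment growing; non-designers participate in design activities.",
    4: "Design value well understood and consistently articulated.",
}

def alignment_score(responses):
    """Average rubric level across stakeholder responses (each 1-4)."""
    if any(r not in ALIGNMENT_RUBRIC for r in responses):
        raise ValueError("responses must be rubric levels 1-4")
    return sum(responses) / len(responses)

# Hypothetical survey of six stakeholders; repeat quarterly to get a trend.
print(round(alignment_score([2, 2, 3, 2, 3, 2]), 2))
```

Comparing this quarter's average against the baseline turns an otherwise qualitative goal ("alignment is growing") into a trend you can watch.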
Evaluating the
DesignOps Practice.
Vital Signs
Vital Signs
for DesignOps
Top 3-5 metrics that tell you
something might be wrong,
or everything is ok.
Possible Examples
• Number of UX stories
that started in a sprint’s
backlog, but didn’t get
deployed to production.
• Attrition rate within a
design team compared to
the whole organization.
• Time spent designing/
researching.
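The vital-signs idea can be sketched as a handful of named metrics, each with a healthy range; anything out of range gets flagged for a closer look. The metric names echo the examples above, but the current values and thresholds here are invented for illustration.

```python
# Vital signs as named metrics with healthy ranges; out-of-range values are flagged.
VITAL_SIGNS = {
    # metric name: (current value, healthy minimum, healthy maximum)
    "ux_stories_dropped_per_sprint": (4, 0, 2),
    "design_attrition_vs_org_ratio": (1.1, 0.0, 1.5),
    "hours_designing_per_week": (14, 12, 40),
}

def check_vitals(signs):
    """Return the names of metrics whose current value is out of range."""
    return [name for name, (value, lo, hi) in signs.items()
            if not (lo <= value <= hi)]

print(check_vitals(VITAL_SIGNS))  # → ['ux_stories_dropped_per_sprint']
```

An empty list means everything looks ok; a non-empty one says something *might* be wrong and is worth investigating, not that a cause is known.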
Regular periodic
evaluations
DesignOps Set of Questions
People

-Does recruitment lead to best-in-class talent being hired?
-Is the team engaged and growing professionally?
-Are the team’s values being upheld?
-Are diversity & inclusion upheld as important values?
Workflow
-Are teams meeting the needs of stakeholders?
-How much of the total design process are team
members being encouraged &/or allowed to do?
-Are designs regularly being included in shipped goods?
Communications
-Does the team have line of sight into the team &
business?
-Is the signal:noise ratio being managed?
Tools

-Is the team able to get the tools they need to
be successful & productive?
-Are tools easily integrated to each other, and to
the broader set of stakeholders (where
appropriate)?
Governance 
-Are the mission & vision in place and well
understood?
-Are the team’s principles being used to
evaluate the quality of design work?
-Are decision-making processes understood
and acted upon?
BusinessOps
-Is the DesignOps team creating and
maintaining relationships with key BusinessOps
teams to ensure smooth DxD functioning?
ResearchOps Set of Questions
Inclusion

-Who is included in all the stages of research?
-Is there proper representation of appropriate subjects?
-Is data coming from many sources in the
organization?
Diversity
-How is diversity ensured during research?
-What is the current state of diversity during research?
-Is there a diverse set of data types?
Empathy
-How much is empathy spread through the
organization for customers & users of products and
services?
-Who in the organization can share stories of
customers & users that can express their emotional and
cognitive mental models?
Holism

-Is research done in a holistic manner?
-Is the total journey of the user understood?
Synthesis 
-Is collected data aggregated, and synthesized
into models, prototypes, and visions?
Rigor
-Is data gathered in ways to keep data clean and
to avoid wrongful conclusions?
How do you know if you are
measuring the right things?
Measuring the right things …
Your value to you
• How do you want to be valued by your
peers and stakeholders?
• What proof do you have that you are
valuable in these ways or that you can
provide value if allowed?
Measuring the right things …
Your value to others
• When others come to you, what job(s) do
they ask you to do?
• What ways do they have to understand
your value to them? To the business?
Measuring the right things …
Align design team & stakeholders
• If you don’t create this alignment,
everything will be much harder, if not
impossible for you.
• This is hard work, and requires that the team
and stakeholders take the time to do it.
Measuring the right things …
Understand business/org goals.
• If you don’t understand your org’s goals,
you’ll never be successful.
• So interview executives across domains in
the organization, even/especially those
outside of the Product & Engineering teams.
Measuring the right things …
A vision that describes success
• Can you tell a story that describes what
your world will be like if you were
successful?
• Do stakeholders like this story?
Measuring the right things …
Activity path to success
• Your story, hopefully, had a series of
activities that led from now to success.
Outline these, and put them into a plan.
• Along the way create milestones that tell
you what/when you measure.
Measuring the right things …
Gather, monitor, compare, share
• How will you gather data?
• How will you monitor the right data?
• How will you find strong correlations?
• How will you share data across the org?
The Finale
For designOps to succeed …
1. Resist being reactive to stakeholders.
2. Use your skills as a design team.
3. Understand/Coach your value to the business.
4. Have a vision & a plan for achieving it.
5. Measure, or evaluate your performance.
What I do …
What do I do?
1. I coach designers & design leaders to help them
reach personal & professional goals.
2. I consult for & with design teams to help them
evaluate, create, and maintain their designOps
practice.
3. I teach workshops on designOps, storytelling,
design strategy, and design studio culture.
Learn more at …
https://www.designbetter.co/
designops-handbook
https://rosenfeldmedia.com/
designops-community/
http://designops-conference.com/
http://designopssummit.com/
Dave Malouf
me@davemalouf.com
@daveixd || @Des_Ops
http://medium.com/@daveixd

Assessing Your Current DesignOps Practice: A Heuristic Model - Dave Malouf
