How to Make User Experience Research Robust
When You’re Working Agile
Just Enough for Innovation
November 2015
Josie Scott
2
User Experience Design is the work of creating a digital space that is immediately usable and recognizable to the people who use it. This is only possible through a deep knowledge of their goals, motivations, and needs.
UX Research is pivotal!
3
Serendipity Brings Me Here
Career
Volunteer (Mostly)
Professional Organizations
4
A Commitment to Innovation
5
Agile – Principles
• Individuals and interactions over processes and tools
• Working software over comprehensive documentation
• Collaboration over contract negotiation
• Responding to change over following a plan
6
Agile – Process
The Great UX/Agile Dilemma: How do we design in Agile?
• No Design/Everyone Designs
• Big Design Up Front (i.e., Waterfall)
• Sprint Ahead Design
7
Agile – Process
No Design/Everyone Designs
PROS
• Very Agile
• Fast, responsive, and in sync with the project
• Allows developers to be creative
CONS
• Chaotic design
• Inconsistent
• Not focused on users
• Frequently: NO RESEARCH!
8
Agile – Process
Big Design Up Front (BDUF)
PROS
• Consistent design
• Allows time for research
• Requires fewer decisions
CONS
• Not very Agile
• Time-consuming
• Non-collaborative; over-the-wall design
• No failure allowed
9
Agile – Process
Sprint Ahead Design
PROS
• Consistent design
• Allows time for research
• Iterative
• Flexible and collaborative
CONS
• Stretches Agile
• Asynchronous
• Not everyone designs
10
What About Research???
They need answers. We have methods:
• Observation/Interviews
• Personas
• Innovation “Games”
• Card Sorts
• Diary Studies and Surveys
• Usability Testing
How do we support Agile?
11
Here’s What They Need
Upfront research and persona development, then a series of small validation tests throughout development, finishing with an end validation.
12
Answer: Just Enough Testing -- JET
A regular research event, roughly once a month:
• 8-10 users over 1 to 3 days
• Lower-fidelity mockups – even paper
• Usually focused on a single interaction
• Any user-focused activity
• Possible update mid-test
• We all watch/participate – scorecard (a sketch follows this list)
• Rapid debrief/action plan
• Just enough reporting
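
For illustration only, here is a minimal Python sketch of what a shared observer scorecard and rapid-debrief tally could look like; the field names and example tasks are assumptions, not the deck's actual template.

    from collections import Counter

    # One row per observation; every watcher adds rows during the session.
    # Field names here are hypothetical -- adapt them to whatever your team tracks.
    scorecard = [
        {"participant": "P1", "task": "find branch locator", "outcome": "pass", "note": ""},
        {"participant": "P1", "task": "submit form", "outcome": "fail", "note": "missed the ZIP field"},
        {"participant": "P2", "task": "find branch locator", "outcome": "pass", "note": ""},
        {"participant": "P2", "task": "submit form", "outcome": "fail", "note": "label unclear"},
    ]

    def rapid_debrief(rows):
        """Tally pass/fail per task -- 'just enough reporting' for the action plan."""
        tally = {}
        for row in rows:
            tally.setdefault(row["task"], Counter())[row["outcome"]] += 1
        for task, counts in tally.items():
            print(f"{task}: {counts['pass']}/{sum(counts.values())} passed")

    rapid_debrief(scorecard)
    # find branch locator: 2/2 passed
    # submit form: 0/2 passed  -> first item on the action plan

The point is not the tooling; it is that the whole team records observations in one place during the test, so the debrief can happen the same day.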
13
JET Process Timeline
A Monday-to-Friday grid counting down the six weeks before each test:
• WEEK 6: Compadres meet and determine the hypothesis to test.
• WEEKS 5-2: Researcher creates the test plan, organizes the recruiter and facilities, and prepares prototypes, products, and equipment.
• WEEK 2: Researcher does a test run-through.
• WEEK 1: Team set-up / Test day 1 / Debrief, then Test day 2 / Debrief; the researcher posts videos and the report.
14
Answer: Just Enough Testing -- JET
Sprint cycle: Design → Test → Iterate
• Individuals and interactions
• Working…prototypes
• Collaboration
• Responding to change
15
Dispelling Myths
• JET ≠ lack of rigor!
  • Smaller samples (see the sketch after this list)
  • Same usability metrics
• JET ≠ qualitative only
  • But wow!
• JET can’t …….. (fill in the blank)
  • If you have a design question, we can answer it
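
The deck doesn't show the math behind the small-sample claim, so as a hedged aside: one standard way to reason about it is the problem-discovery model, in which the share of usability problems seen by n users is 1 - (1 - p)^n, where p is the average chance that a single user hits a given problem. With the commonly cited p ≈ 0.31, a JET-sized sample of 8-10 users surfaces the large majority of problems. A quick Python check:

    # Problem-discovery model: share of problems found = 1 - (1 - p)^n.
    # p = 0.31 is the often-quoted average per-user detection rate; it is an
    # assumption here, not a number from this deck -- substitute your own estimate.
    def share_of_problems_found(n_users, p=0.31):
        return 1 - (1 - p) ** n_users

    for n in (5, 8, 10):
        print(f"{n} users: ~{share_of_problems_found(n):.0%} of problems observed")
    # 5 users: ~84%, 8 users: ~95%, 10 users: ~98% (under the model's assumptions)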
16
Oh, the Places We Go… with JET
• 12 JETs so far:
  • Online and channel funnels
  • Functions like geolocation
  • Forms innovation
  • Navigation, card sorting
• 50 questions so far!
• Multiple platforms (PC to tablet to phone)
• Analysis may include (see the sketch after this list):
  • Timing
  • Errors
  • Work (number of swipes)
  • Satisfaction ratings & qualitative feedback
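
The editor's notes credit Morae for this analysis; purely as an illustration of the same measures, here is a plain-Python sketch with an invented session-log format (the field names seconds, errors, swipes, and sat are assumptions).

    from statistics import mean

    # Hypothetical log: one row per participant per task, filled in after each session.
    sessions = [
        {"participant": "P1", "task": "geolocate a branch", "seconds": 42.0, "errors": 1, "swipes": 6, "sat": 4},
        {"participant": "P2", "task": "geolocate a branch", "seconds": 55.5, "errors": 0, "swipes": 9, "sat": 5},
        {"participant": "P3", "task": "geolocate a branch", "seconds": 93.0, "errors": 3, "swipes": 14, "sat": 2},
    ]

    def summarize(rows, task):
        """Timing, errors, work (swipes), and satisfaction for a single task."""
        rows = [r for r in rows if r["task"] == task]
        return {
            "mean_seconds": round(mean(r["seconds"] for r in rows), 1),
            "total_errors": sum(r["errors"] for r in rows),
            "mean_swipes": round(mean(r["swipes"] for r in rows), 1),
            "mean_sat": round(mean(r["sat"] for r in rows), 2),
        }

    print(summarize(sessions, "geolocate a branch"))
    # {'mean_seconds': 63.5, 'total_errors': 4, 'mean_swipes': 9.7, 'mean_sat': 3.67}
    # Qualitative notes and video clips still get reviewed by hand.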
17
Questions?
Thank you!

Editor's Notes

  • #4 I have been lucky: everything I’ve done brought me here, today. Every step of the way, something I learned contributed to where I am now. It began with a journalism program that taught me a skill that will never fail me. I trained with and learned from some top folks, starting at DTE Energy and with Theo Mandel; that’s where I caught the bug, I think. I attended great conferences and had great people teach me. Even my Master’s degree program was eclectic enough to teach me to research and gather data properly. During my UX career: I worked as a volunteer with Sarah to develop a training program for teaching the uninitiated to conduct usability testing, and we trained hundreds with the LEO kit. I worked for six years conducting research for TechSmith’s Morae, learning how all sorts of folks conduct their programs; we went agile while I was there, so we had to pioneer agile methods, and Morae really supports that. Now I’m working with a great team focused on innovation for a giant in the financial industry. At Synchrony Financial we have a focus on innovation, and we are moving to Agile as well.
  • #5 Who here is involved with Agile? Lean? These are incredibly powerful and empowering statements. They created a dilemma for us: how do we support these new principles with research that we can put our names to? We have roles, and our participants have rights, and we need to safeguard them. Honestly, Agile caused the entire UX world to conduct a pretty deep self-examination. Who remembers the old ways of analog video? Logging with Excel (which, in itself, was a step forward). Painful tracking of each event, hoping we got it live. No matter what you were doing, you sometimes quoted months to do all the analysis. You just can’t do that with Agile. We needed to respond to change.
  • #10 Here’s what I learned: there are lots of flavors, and you need to move fast; time is of the essence. The old system: 2-3 months per study, with analog analysis. Wait! Agile means you can’t wait for lengthy studies, can’t do BDUF, and can’t take your time with findings, and you need to share everything: "everyone is responsible for design."
  • #11 Who remembers the old ways of analog video? Logging with Excel (which, in itself, was a step forward). Painful tracking of each error, each action, each adjustment…often live. No matter what you were doing, you sometimes quoted months to do all the analysis. You just can’t do that with Agile. You can’t write big reports to throw over the wall, either. We needed to respond to change.
  • #17 Analysis remains robust. We can measure all the usual measures. My secret sauce: Morae…which does so much.