Data and Design:
BFFs or Frenemies?
Steve Mulder
@muldermedia
UX & Design
Research
Testing
Analytics
Trap #1
We design by blindly following data
Trap #2
We measure the wrong thing
What we measure
becomes
what we reward
What we reward
becomes
what we do
and
what we are
Trap #3
We become optimizers instead of designers
Optimization via A/B or usability testing
The danger of optimizing to a local maximum
1. Data is the platform from which we leap
For NPR, Arbitron ratings data tells us that
audience is up but listening time is down
Third-party research validates trends in media
fragmentation and digital adoption
Channels proliferate, presenting both
opportunity and challenge
“Radio isn’t going away,
it’s going everywhere.”
Our own research and personas reveal
changing consumer behaviors & opportunities
It’s not about building things that users know
to ask for
New digital interfaces in familiar locations
(connected car)
Experiments with new listening experiences
Exploring experiences that meet needs users can’t yet articulate
2. Data helps us know
how high we’ve jumped
and whether to keep going
The Facebook experiment: Local station stories
geo-targeted on NPR’s Facebook stream
Local stories saw consistently higher
engagement and grew local audience
Pivot and double-down: We created a
workflow tool so we can scale this offering
3. It’s less about achieving goals and more about continuous learning
Yes, this sounds like Lean Startup thinking
Assumption/hypothesis
Minimum Viable Product
Analytics, research, testing
Designing choice in the new NPR app
Original design: Give users more control/choices
Hypothesis to test: Fewer immediate choices + simplicity = longer listening
Results: Listening time up 8%
Usability testing (and third-party research)
on the infamous hamburger nav
Walking the narrow path

Data and Design: BFFs or Frenemies?

For some UX/design teams, anything that comes out of analytics, research, and testing is suspect – like a mugger that could rob us of creativity and innovation. There are traps to avoid so that data doesn’t imprison design. So how do we avoid the traps and instead use data for good? Come find out how NPR thinks about the intersection of data and design, and how we are using lean, data-informed processes to experiment with the future of digital listening.

Published in: Design, Technology
Notes
  • Qualitative and quantitative. User interviews, surveys, third-party research. Usability testing, A/B testing, concept testing. http://www.flickr.com/photos/chicagobart/4650478963/
  • UX and design community: fundamental fear and skepticism of research & metrics. Can feel like data’s purpose is to mug design and steal creativity from us, restrict what we do. http://www.flickr.com/photos/spencerbbclark/6483073463/
  • http://www.flickr.com/photos/kerryburnout/3826128002/
  • False: Research = doing what users tell you to do. Driving directions, prescriptive, restrictive. Lycos marketing survey on colors.
  • Sony boombox color. What users say isn’t always what they do.
  • Or measuring only one thing. Everything hinges on what you decide to measure. http://www.adprint.ro/files/executions/8/356_topinterior_mag_sofa_low.jpg_900.jpg
  • Wisconsin grocery chain decided to use IPM (items per minute) as the key metric for cashier productivity. Faster checkout should mean shorter lines, happier customers, more profit. But IPM is only tracked when a lane is open and a cashier’s terminal is ready. So cashiers learned to put their terminals in a secure state to stop the clock – between customers, while bagging, etc. The IPM score goes up, but switching the terminal on and off slows down actual checkout times. *Unintended consequences* http://innovationedge.com/2010/05/10/dangerous-metrics-for-innovation/
  • Former Soviet Union: managers of glass plants were rewarded by tons of sheet glass produced. Most plants produced glass so thick it was nearly opaque. The metric changed to square meters of glass, which led to glass so thin it was easily broken. For us: session conversion vs longer-term loyalty effects, upsells. Leads to false optimization, taking us in the wrong direction. http://compliance.saiglobal.com/community/know/blogs/item/4392-unintended-consequences-and-perverse-incentives
  • If something can be counted, it’s more likely to be used to measure success. We choose what we measure – and what metrics we look at and use the most. http://www.flickr.com/photos/60852569@N00/2204909532/
  • Incentives drive behavior change and culture. They affect what we work on and who we become. What gets counted in your organization? What numbers do you praise people for or chastise them for? How does that change behavior and impact what you do and who you are? http://www.flickr.com/photos/jon_tucker/2446567442/
  • Instead of using our creative skills, it’s just about optimization and making very small adjustments. Becoming machinists, tending to the robots that make decisions. The cult of A/B testing. Doug at Google on testing blues. http://www.flickr.com/photos/seattlemunicipalarchives/4089729743/
  • What is it about A/B testing that makes it so easy to fall into this trap?
  • It’s the baseline we start from, not the end. It provides context and constraints that focus us before freeing us. http://www.flickr.com/photos/bfishadow/4407857237/
  • Opportunity: reaching new audience where they live. Challenge: more competitors for “earshare” than ever before.
  • Channel usage by segment. Likelihood to consider public radio. Brand perception and affinity: NPR vs station vs program. BUT all this data doesn’t tell us exactly what to do.
  • Smartphone app works with Ford Sync, voice control
  • App in private beta. Mobile version of NPR Infinite Player. Remixing public radio content, algorithms and editorial curation for a lean-back or lean-forward experience.
  • Measure how high we’ve jumped (and whether it’s worth continuing investment). It should inspire us to reach higher, without overly limiting our design choices. http://www.flickr.com/photos/10710442@N08/8265087493/in/photolist-dAmJyv-aoYjU9-aoYk3q-LJAUR
  • Initially: manual process – concierge MVP
  • Mindset: the goal is learning. There is no “finished”. http://www.deviantart.com/morelikethis/artists/198956566?view_mode=2
  • Flips the traditional product development model (build first, then figure out what to measure) on its head.
  • Findability and discoverability. Testing alone didn’t answer the question, but it gave us insights so WE could answer it.
  • Falling into the cult of data vs falling into the cult of ego. Without data, design is just guessing. Why is it so hard for teams to walk the narrow path?
  • Data doesn’t give us the answers, it helps us ask better questions