Developer's Guide to
 Feedback Driven
   Development
      Marty Haught
       @mghaught
Did you get the memo?
We’re starting a new project
Do we have requirements?
  Project release party?
Nope, not like you’re thinking
Huh?
Yeah, we’ll just build it out
      for a few weeks
That’s all they want?!?
Oh no, it’s just to get started
So how can we plan it out?
I’ll send you the memo
How do you build a product
   when you only have a
vision and a starting place?
FbDD

 Feedback
  Driven
Development
Session roadmap

1. What is feedback driven development?

2. Role in development

3. Techniques

4. Integration
Boulder Ruby




Longmont, Colorado
1. FbDD, what is it?

“A collection of techniques for
measuring progress based on
    customer interaction.”
Via Lean Startup
          by Eric Ries

“translating your startup vision into a
  successful business as quickly and
        efficiently as possible”


 http://startuplessonslearned.com
Business-Technical


Technical process that touches business use

Used to guide product direction

Best when business unknowns exist
Where do you use this?

  New projects where you are unsure of your
  market

  Haven’t figured out a pricing strategy

  Don’t know what users really need

  Don’t want to write software no one uses
What is progress?


Completing stories?

Hitting project milestones?

Making profit?
None of the above?


Going in the wrong direction

More about traction than profits

Finding product-market fit
Validated learning


“Using collected data to prove key
risk factors are being addressed by
   your product's development”
Focus on adoption

Customer signups

User engagement

Long-term retention

Viral coefficient
What does FbDD look
       like?

 Don’t spec out the entire project

 Keep long-term strategy in mind

 Tight iterative loops once you collect feedback
Minimum Viable Product

 “A version of a new product which
allows a team to collect the maximum
 amount of validated learning about
   customers with the least effort.”
A starting place

Simplest implementation of your core idea

Just build enough for validated learning

Early adopters will see the potential

Not finished, should be embarrassing
Some examples



Landing page, signup page and AdWords

Rails Rumble or Startup Weekend
MVP Interaction



Questions, comments?
2. Role in development
Life of a story
An idea is born
Start with the usual client-developer conversation

Ask three additional questions:

1. How do we measure progress?

2. What impact do we expect this story to have
   on our metrics?

3. How long do we need to measure results?
Grows up

Implement the simplest version that enables learning

Strive for the least effort that gets feedback soonest

Push off unneeded decisions and commitments

Ensure we have a way of measuring results in place
Leaves home


Deploy code to production

Measure results for the agreed-upon duration

Though deployed, the story is not done
Makes its mark

Analyze results, good, bad or indifferent

Is it a keeper?

Possibly remove it?

Need to tweak the approach some?
The pivot

Shifting your product direction based on
validated learning

Used when progress is no longer being made

Leverage what you’ve built
Types of pivots

Segment - go after different customers

Problem - solve a different problem for existing
customers

Feature - pick one specific feature and center the
company around it
Dev loop interaction



Questions, comments?
3. Techniques

Starting techniques

 1. Tracking usage

 2. A/B testing

 3. Net promoter score

 4. Direct feedback
Tracking usage




“Know when you’re making progress”
Pirate metrics
Not vanity metrics
Actionable is key


Should help you make decisions

Gives insight into your problem domain
AARRR!!
                      by Dave McClure

          acquisition

          activation

          retention

          referral

          revenue

http://www.slideshare.net/dmc500hats/startup-metrics-for-pirates-long-version
flingr’s metrics

acquisition - site visits

activation - signups

retention - repeat use

referral - fling backs

revenue - upgrade to pro account
Drop-in service

Google Analytics (good for content/pages)

MixPanel (easy integration)

KISSMetrics (strong on funnel analysis)

StatsMix (flexible, combines many sources)
Roll your own


1. Your own event-driven metrics table

2. Use existing application data
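A minimal sketch of option 1, a hand-rolled event table (the metric_events
name, columns and calls below are illustrative assumptions, not the talk's
actual code):

  # db/migrate/xxx_create_metric_events.rb
  class CreateMetricEvents < ActiveRecord::Migration
    def change
      create_table :metric_events do |t|
        t.string  :name, :null => false   # e.g. "signup", "fling_back"
        t.integer :user_id                # who triggered it, if known
        t.timestamps
      end
      add_index :metric_events, [:name, :created_at]
    end
  end

  # record an event wherever something interesting happens
  MetricEvent.create!(:name => "signup", :user_id => user.id)

  # count signups per day for a simple chart
  MetricEvent.where(:name => "signup").group("date(created_at)").count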
Existing data
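Option 2 needs no new tables at all; rough queries against data the app
already stores (the model and column names here are assumptions):

  # pirate metrics pulled straight from existing tables
  activation = User.where("created_at > ?", 1.week.ago).count
  retention  = User.where("last_login_at > ?", 1.week.ago).
                    where("created_at < ?", 1.week.ago).count
  revenue    = Account.where(:plan => "pro").count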
DIY dashboard
Why DIY?

Very simple to get started

You already have this data

Easily add or change what you track

Since you own the data, you can analyze it differently

Good enough
A/B testing

“Presenting two or more variations
 and measuring user behavior to
         determine value”
An improvement?
Anatomy of a split


           A/Bingo
        by Patrick McKenzie

http://www.bingocardcreator.com/abingo
Identify the user
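With A/Bingo this is typically a before_filter that gives every visitor a
stable identity; roughly, following the plugin's documented pattern:

  # app/controllers/application_controller.rb
  before_filter :set_abingo_identity

  def set_abingo_identity
    # logged-in users keep their id; anonymous visitors get a sticky random id
    Abingo.identity = current_user ?
      current_user.id : (session[:abingo_identity] ||= rand(10 ** 10))
  end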
Invoke a split test
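In a view or controller, ab_test picks an alternative and keeps it consistent
for that identity (the test name, copy and route helper are made up):

  <%# app/views/home/index.html.erb %>
  <% copy = ab_test("signup_button_copy", ["Sign up free", "Start flinging"]) %>
  <%= link_to copy, new_user_path, :class => "signup-button" %>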
Record conversion
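and the conversion is scored with bingo! wherever the desired action
completes (a sketch of a signup controller):

  # app/controllers/users_controller.rb
  def create
    @user = User.new(params[:user])
    if @user.save
      bingo!("signup_button_copy")   # credits whichever alternative this visitor saw
      redirect_to root_path
    else
      render :new
    end
  end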
Tracking results
Benefits


Best mechanism to measure change

Both variations run at the same time, so outside influences affect them equally

Can answer internal debates
Pitfalls


Don’t A/B test everything

Can lead to a mess if not guided by your vision

May not get conclusive results
Net Promoter Score




“How much do your users like your
          product?”
How likely are you to
  recommend ...?
Simple follow-up
What’s the point?

Lightweight way to track customer perceptions

You want to see your score improve over time

Easy way to get additional feedback

Can do small sample sizes at regular intervals
How it works

Tracks 3 groups:

 1. Promoters (9-10)

 2. Passives (7-8)

 3. Detractors (0-6)
Formula

   NPS = P% - D%
10 responses:
5 promoters (50%)
3 passives (30%)
2 detractors (20%)

NPS = 50% - 20% = 30
Triggering

Offer the questionnaire every other week or month,
or trigger it manually

Only need a small sample size, such as 50-100
per round

Only present it to each user once every 3-6 months
My preference

Survey a small sample each week

Ask users three weeks after they sign up

Only ask each user once every 6 months

Prefer an in-site lightbox modal to minimize
interruption
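That policy fits in a few lines of code (the last_nps_asked_at column and the
sample size are assumptions layered on the preferences above):

  # only survey established users, and never more than twice a year
  def eligible_for_nps?(user)
    user.created_at < 3.weeks.ago &&
      (user.last_nps_asked_at.nil? || user.last_nps_asked_at < 6.months.ago)
  end

  # naive weekly job: pick a small sample of eligible users for the lightbox
  sample = User.all.select { |u| eligible_for_nps?(u) }.sample(50)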
Database
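The backing table can stay tiny; a sketch (table and column names assumed):

  class CreateNpsResponses < ActiveRecord::Migration
    def change
      create_table :nps_responses do |t|
        t.integer :user_id
        t.integer :score, :null => false   # the 0-10 answer
        t.text    :comment                 # optional follow-up
        t.timestamps
      end
    end
  end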
Base model
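with a base model that turns raw scores into the formula from earlier (again
a sketch, not the talk's actual code):

  class NpsResponse < ActiveRecord::Base
    scope :promoters,  lambda { where("score >= 9") }
    scope :passives,   lambda { where(:score => 7..8) }
    scope :detractors, lambda { where("score <= 6") }

    # NPS = % promoters - % detractors
    def self.nps
      total = count
      return nil if total.zero?
      ((promoters.count - detractors.count) * 100.0 / total).round
    end
  end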
Following results
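Following results is then the same calculation scoped to a time window, for
example month by month:

  # NPS for each of the last six months, oldest first, to watch the trend
  (1..6).to_a.reverse.map do |n|
    t = n.months.ago
    NpsResponse.where(:created_at => t.beginning_of_month..t.end_of_month).nps
  end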
Benefits


Very easy to use and implement

Not too annoying for users when done right

Allows you to track customer perception
Direct user feedback
Guidelines

Make it easy to tell you what they think

Ever-present feedback button on all pages

Shouldn’t interrupt user flow

Fewer clicks is better
Simple version


Lightbox modal with text box and email

Start by sending results via email

Graduate to a DB table for managing them
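A sketch of that simple version, with the lightbox posting to a tiny
controller that just emails the team (every name here is an assumption):

  # app/controllers/feedback_controller.rb
  class FeedbackController < ApplicationController
    def create
      FeedbackMailer.feedback(current_user, params[:message], request.referer).deliver
      render :json => { :ok => true }   # the lightbox shows a thank-you and closes
    end
  end

  # app/mailers/feedback_mailer.rb
  class FeedbackMailer < ActionMailer::Base
    default :to => "team@example.com", :from => "feedback@example.com"

    def feedback(user, message, page)
      who = user ? user.email : "an anonymous visitor"
      mail :subject => "Feedback from #{who} on #{page}", :body => message
    end
  end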
Full featured


Get Satisfaction or UserVoice

Better at collecting and managing feedback

Downside is more clicks and extra registration
Live interaction
Snap-a-bug: http://www.snapabug.com/

   Offers live chat or an offline email form

   Can record what’s going on with the user’s
   browser

   Integrates with existing CRMs, help desk
   and bug tracking solutions

   Advanced remote screen capture
Implementation
   interaction



Questions, comments?
4. Integration
With current process

 Plays nice with agile methodologies

 Need to consider an extra phase for learning
 once metrics are collected

 Half-implemented stories will need follow-up
 to build out
When to use
MVP: from the start

Metrics: pick 3-5 core metrics from start

A/B testing: prefer major feature rollouts or
contentious enhancements

NPS: once you get over 100 users

Direct feedback: from the start
Limitations

Challenging with low volume

Be prepared for a lack of significant results

Can’t replace vision, only give some guidance
on effectiveness
Takeaway

Feedback can be used for smarter product
development

It’s easier than you think to get started

Knowing how your customers use your product
can help you avoid writing software no one uses
Questions? Thanks!

      Marty Haught
       @mghaught
    mghaught@gmail.com
   http://martyhaught.com
