
Experimentation from the real world; move from outputs to outcomes


by Andrew Rusling

Outcomes such as “subscriptions increased by 20%” or “complaints regarding the upload feature reduced to zero” are what make a real difference in our customers’ lives, and hence to the company’s bottom line. When a team is delivering outcomes like that, there is no denying its performance, and hence its value, to the company.

Delivering outcomes comes from understanding our customers, producing an output that may result in an outcome, and then validating whether we have achieved the desired outcome. At the very least, each of these cycles produces knowledge. The Lean Startup by Eric Ries clearly explained this cycle; unfortunately, it did not clearly explain how we should design, set up, run or analyse our experiments. I have met many people who agree we should follow the Lean Startup approach; however, there is rarely any consensus on the experimentation approach that will make it a reality.

In 2017 Australia’s largest independent game studio, Halfbrick Studios, embarked upon a mission to better understand their customers and experiment their way to renewed success; Fruit Ninja Fight is one of the results of that approach. In 2018 Australia’s largest telco, Telstra, focused on “co-creation” with their customers through a series of experiments, delivering improved customer satisfaction and faster results than ever before.

This presentation shares my experiences of working with those two Scrum-based organisations as they sought to improve their outcomes through experimentation.



  1. Experimentation: Outcomes over Outputs, from the real world @AndrewRusling
  2. Performance @AndrewRusling
  3. Output Outcome Impact $
  4. WHY HOW
  5. Belief Frequency Rigor Learning Practice Clarity Focus Effort Outcomes Achieved
  6. Product Lifecycle Phases: Spark, Problem Validation, Product Validation, Scale, Sunset
  7. Lesson learnt: Mindset is crucial. Photo reference: https://www.flickr.com/photos/simmogl/2449850215
  8. Listen to your customers; give them what they want, not what they ask for.
  9. Customer interviews: good at finding a spark, terrible for validating the worth of that spark. Image reference: http://happilyhaverland.blogspot.com/2013/01/
  10. Lessons learnt: Interviews • Pre-qualify your interviewees • Face to face • Ask to record the session • Try whole team, then scale back • Start open, narrow in • Discuss your product last. Image reference: https://www.flickr.com/photos/wocintechchat/22518583822
  11. Your understanding vs. what the user sees. Photo reference: https://www.flickr.com/photos/eggrole/7524458398
  12. Observational Testing: “Nothing is quite so humbling as being forced to watch in silence as some poor play-tester stumbles around your level for 20 minutes, unable to figure out the ‘obvious’ answer that you now realize is completely arbitrary and impossible to figure out.”
  13. Customer Testing
  14. Observational Testing: Process: 1. Provide an objective. 2. Observe them and their screen (no guidance; video record). 3. Ask them to explain their thinking.
  15. Observational Testing: Benefits • Challenge your design approach • Validate hypotheses • Dramatically increase usability
  16. Illusion of choice
  17. Telstra Open APIs
  18. Surveys: Open over Closed. CLOSED: “Please prioritise these albums, 1 to 5”: Yield (Pearl Jam), AC DC Live (AC DC), Meddle (Pink Floyd), Garbage (Garbage), Wish You Were Here (Pink Floyd). OPEN: “Please enter your top 3 albums”: CD 1, CD 2, CD 3.
  19. • Alt. background colour • Alt. background • Alt. eye direction • Alt. Barry direction • Alt. Barry image • Just letters • Just jetpack • Overlay text • Etc.
  20. Always set a hypothesis • Yes, it will be a guess! • Guessed hypotheses still educate
  21. Baseline: messaging anytime. Hypothesis: messaging during prime time will increase open rate. Validated or invalidated?
  22. Photo by https://www.deviantart.com/romanjones, licensed under Creative Commons
  23. Hypotheses can only be disproven
  24. Versions released each week: 1. Baseline version, just the basic game, no progression 2. Improved tutorial 3. UI/UX tweaks 4. First trial of a progression system 5. Second trial of a different progression system 6. Third trial of a different progression system
  25. [Chart: Total Retention; active users (0–500) by end of week (1–7) for the Base, Tutorial, UI/UX, Progress 1, Progress 2 and Progress 3 cohorts]
  26. [Chart: Total Retention with Cohort Size; active users (0–500) by end of week (1–7) for the same cohorts]
  27. [Chart: Retention by Cohort; % of active users by cohort (0–80%) by end of week (1–7) for the same cohorts]
  28. Key lessons I learnt • Mindset is crucial • Listen to your customers • Customer interviews • Observational testing • Surveys need to be open • Always set a hypothesis • Negative hypotheses • Cohort analysis
  29. bit.ly/OutputsToOutcomes • Blog: www.journey-to-better.com • bit.ly/FreeAgilePosters • bit.ly/ScrumRolesGame • bit.ly/Weekend_Escape • andrewrusling@hotmail.com • @andrewrusling • Thank you
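Slides 21–27 walk through the two analysis techniques the talk leans on: stating a hypothesis up front and then comparing retention by cohort rather than by raw active-user counts (which mislead when weekly cohorts differ in size). A minimal sketch of both ideas follows. All numbers, cohort names and function names here are illustrative assumptions of mine, not data from the talk, and the test uses the standard two-proportion z-test (normal approximation) rather than any method the slides specify.

```python
import math

# Hypothetical week-7 figures per weekly release cohort (illustrative only):
# how many users each version acquired, and how many were still active at week 7.
cohorts = {
    "Base":      {"acquired": 500, "retained_w7": 40},
    "Tutorial":  {"acquired": 450, "retained_w7": 54},
    "Progress3": {"acquired": 300, "retained_w7": 51},
}

def retention_rate(cohort):
    """Retention as a fraction of the cohort's own size (slide 27's view).

    Dividing by cohort size is what makes differently sized weekly
    cohorts comparable, unlike the raw counts on slide 25.
    """
    return cohort["retained_w7"] / cohort["acquired"]

def two_proportion_z(a, b):
    """Two-proportion z-test: does cohort b's retention differ from a's?

    Returns (z, two-sided p-value). A small p-value rejects the null
    hypothesis of 'no difference'; it never proves the alternative --
    hypotheses can only be disproven (slide 23).
    """
    n1, x1 = a["acquired"], a["retained_w7"]
    n2, x2 = b["acquired"], b["retained_w7"]
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (x2 / n2 - x1 / n1) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

for name, cohort in cohorts.items():
    print(f"{name:10s} week-7 retention: {retention_rate(cohort):.1%}")

z, p = two_proportion_z(cohorts["Base"], cohorts["Progress3"])
print(f"Base vs Progress3: z={z:.2f}, p={p:.4f}")
```

With these made-up numbers the third progression system retains a much larger share of its (smaller) cohort than the baseline, which the raw active-user counts alone would understate.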
