A/B Testing - How to Break Things and Fail Fast... Without Breaking Things

Everyone wants to optimize their web property, especially if that web property is their only sales channel. Tools like Optimizely let marketers run tests on their own in an effort to "break things and fail fast", then take those learnings to make the site better. Unfortunately, the risk of severely breaking things can be significant, and it isn't the marketer who gets the call in the middle of the night when something breaks.
In this talk, Mark explores how Atlassian, the creator of JIRA and Confluence, has used Magnolia CMS and other tools to operationalize A/B testing and continually optimize Atlassian’s website. Using this process, the team balances failing fast with keeping things running smoothly.



  1. MARK HALVORSON • HEAD OF INTERACTIVE • ATLASSIAN SOFTWARE • @halv0112 • A/B Testing: HOW TO BREAK THINGS AND FAIL FAST… WITHOUT BREAKING THINGS
  2. About Atlassian • Products include: JIRA, Confluence, Bitbucket, HipChat… • Founded in 2002 • Over 35,000 customers • Over 700 employees globally • No sales people • We help great teams build better software, together.
  3. "Think big, act small, fail fast, learn rapidly." ("LEAN SOFTWARE DEVELOPMENT" BY MARY AND TOM POPPENDIECK, 2003)
  4. Atlassian A/B testing is cross-functional. Central Growth Hacking Team to support: • Email • Product Management • Customer Platform • Customer Advocates • Internal Systems • Leads • Product Marketing • Support • Bitbucket • HipChat
  5. Our Process: COLLECT GREAT IDEAS → TRIAGE BASED ON POTENTIAL IMPACT → BUILD OUT GOOD CANDIDATES → SCHEDULE TO AVOID CONFLICTS → RUN EXPERIMENTS → ANALYZE RESULTS
  6. Our Process: COLLECT GREAT IDEAS → TRIAGE BASED ON POTENTIAL IMPACT → BUILD OUT GOOD CANDIDATES → SCHEDULE TO AVOID CONFLICTS → RUN EXPERIMENTS → ANALYZE RESULTS
  7. Where do experiment ideas come from?
  8. Anywhere.
  9. What are you trying to improve?
  10. Blimp Dashboard
  11. JIRA – Ticket
  12. Types of tests • Pebbles: change in copy or CTA presentation • Rocks: new page (e.g. no Dev Tools family page) • Boulders: simplified order form
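
The pebbles/rocks/boulders split above feeds the "triage based on potential impact" step of the process. As a rough illustration of how such triage could be scored, here is a small sketch; the interface, field names, effort weights, and ticket keys are all hypothetical, not Atlassian's actual JIRA setup:

```typescript
// Hypothetical triage helper: rank candidate experiments by a simple
// impact-vs-effort score before they go into the prioritized backlog.
interface TestIdea {
  ticket: string;          // e.g. a JIRA issue key, so everyone refers to the same test
  size: "pebble" | "rock" | "boulder";
  expectedLift: number;    // estimated relative lift on the target metric (0.02 = 2%)
  trafficPerWeek: number;  // visitors the page under test receives per week
}

// Rough effort weights per test size; purely illustrative values.
const effort: Record<TestIdea["size"], number> = {
  pebble: 1,
  rock: 3,
  boulder: 8,
};

// Higher score = more potential impact per unit of build effort.
function triageScore(idea: TestIdea): number {
  return (idea.expectedLift * idea.trafficPerWeek) / effort[idea.size];
}

const backlog: TestIdea[] = [
  { ticket: "GHT-101", size: "pebble", expectedLift: 0.01, trafficPerWeek: 50000 },
  { ticket: "GHT-102", size: "boulder", expectedLift: 0.05, trafficPerWeek: 20000 },
];

// Sort descending so the most promising candidates are built first.
backlog.sort((a, b) => triageScore(b) - triageScore(a));
console.log(backlog.map((idea) => `${idea.ticket}: ${triageScore(idea).toFixed(0)}`));
```
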
  13. Examples
  14. Reduce Friction
  15. Increase Visibility
  16. Test drive campaigns
  17. Simplify
  18. Our Process: COLLECT GREAT IDEAS → TRIAGE BASED ON POTENTIAL IMPACT → BUILD OUT GOOD CANDIDATES → SCHEDULE TO AVOID CONFLICTS → RUN EXPERIMENTS → ANALYZE RESULTS
  19. What are you trying to improve?
  20. Blimp Dashboard
  21. JIRA – Prioritized Backlog
  22. JIRA – Ticket
  23. JIRA – Ticket
  24. Experiment Illuminati
  25. Our Process: COLLECT GREAT IDEAS → TRIAGE BASED ON POTENTIAL IMPACT → BUILD OUT GOOD CANDIDATES → SCHEDULE TO AVOID CONFLICTS → RUN EXPERIMENTS → ANALYZE RESULTS
  26. Optimizely – Build
  27. Optimizely – Build
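
The two "Optimizely – Build" slides show the experiment variations being built in the Optimizely editor. Under the hood, a variation is a small piece of JavaScript that rewrites the page for visitors bucketed into it. A minimal sketch of that kind of variation code, written as TypeScript here; the selectors and copy are invented for illustration and are not the actual experiment from the talk:

```typescript
// Sketch of variation code: swap the call-to-action copy and hide a
// secondary link to reduce friction on the page (selectors hypothetical).
function applyVariation(): void {
  const cta = document.querySelector<HTMLAnchorElement>(".try-button");
  if (cta) {
    cta.textContent = "Start your free trial";
  }
  const secondaryLink = document.querySelector<HTMLElement>(".compare-plans");
  if (secondaryLink) {
    secondaryLink.style.display = "none";
  }
}

// Run once the DOM is available, since the experiment snippet may load early.
if (document.readyState === "loading") {
  document.addEventListener("DOMContentLoaded", applyVariation);
} else {
  applyVariation();
}
```
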
  28. Prepare for results.
  29. Confluence – Corresponding Test Page
  30. Confluence – Corresponding Test Page
  31. Confluence – Corresponding Test Page
  32. Confluence – Corresponding Test Page
  33. Confluence – Corresponding Test Page
  34. Confluence – Corresponding Test Page
  35. Confluence – Corresponding Test Page
  36. Confluence – Corresponding Test Page
  37. Confluence – Corresponding Test Page
  38. Confluence – Corresponding Test Page
  39. Confluence – Corresponding Test Page
  40. Confluence – Corresponding Test Page
  41. Our Process: COLLECT GREAT IDEAS → TRIAGE BASED ON POTENTIAL IMPACT → BUILD OUT GOOD CANDIDATES → SCHEDULE TO AVOID CONFLICTS → RUN EXPERIMENTS → ANALYZE RESULTS
  42. JIRA – Prioritized Backlog
  43. Our Process: COLLECT GREAT IDEAS → TRIAGE BASED ON POTENTIAL IMPACT → BUILD OUT GOOD CANDIDATES → SCHEDULE TO AVOID CONFLICTS → RUN EXPERIMENTS → ANALYZE RESULTS
  44. Optimizely – Start Experiment
  45. Optimizely – Project Code
  46. Magnolia CMS – Include Optimizely Checkbox
  47. Magnolia CMS – Include Optimizely Checkbox
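
The "Include Optimizely" checkbox in Magnolia CMS controls whether a page renders the Optimizely project snippet at all, so only pages that are actually under test load the experiment JavaScript. In the talk this is a page property evaluated when the Magnolia template renders; the sketch below only illustrates the idea as a client-side helper, with a hypothetical project ID and flag:

```typescript
// Illustrative only: include the Optimizely project snippet on a page
// only when the page has been flagged for testing. In the talk this flag
// is a checkbox in the page's Magnolia CMS properties, checked server-side
// at render time; here it is just a boolean for demonstration.
const OPTIMIZELY_PROJECT_ID = "1234567890"; // hypothetical project ID

function includeOptimizelySnippet(pageHasActiveTest: boolean): void {
  if (!pageHasActiveTest) {
    return; // pages without experiments never load the snippet
  }
  const script = document.createElement("script");
  script.src = `https://cdn.optimizely.com/js/${OPTIMIZELY_PROJECT_ID}.js`;
  document.head.appendChild(script);
}

includeOptimizelySnippet(true);
```
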
  48. Confluence – Corresponding Test Page
  49. Our Process: COLLECT GREAT IDEAS → TRIAGE BASED ON POTENTIAL IMPACT → BUILD OUT GOOD CANDIDATES → SCHEDULE TO AVOID CONFLICTS → RUN EXPERIMENTS → ANALYZE RESULTS
  50. Optimizely – Results
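
Optimizely's results screen reports each variation's conversion rate and whether the difference from the control is statistically significant. For context on what sits behind a readout like that, here is a generic two-proportion z-test sketch with made-up numbers; it is not Optimizely's exact statistics engine:

```typescript
// Two-proportion z-test sketch for an A/B result (illustrative numbers).
function zTest(convA: number, visitorsA: number, convB: number, visitorsB: number): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / standardError; // z-score; |z| > 1.96 ≈ 95% confidence (two-sided)
}

// Control: 400 conversions out of 10,000 visitors; variation: 460 out of 10,000.
const z = zTest(400, 10000, 460, 10000);
console.log(`z = ${z.toFixed(2)}`); // ~2.09, above the 1.96 threshold
```
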
  51. What are you trying to improve?
  52. Blimp Dashboard
  53. Optimizely – Results
  54. Confluence – Corresponding Test Page
  55. Confluence – Results Archive
  56. Our Test Philosophy • Ideas are everywhere • Everything affecting conversion is tested • No traffic wasted • Be aggressive • Regular cadence • Never run a test that you wouldn’t want to win
  57. Key Takeaways • Ideas are everywhere: collect as many as you can and prioritize based on impact. • Use the ticket number everywhere so everyone is referring to the same test. • Involve stakeholders and developers… and get approvals.
  58. Thank you! MARK HALVORSON • HEAD OF INTERACTIVE • ATLASSIAN SOFTWARE • @halv0112
