
Agile Analytics

Talk on agile analytics, definition of success, multi-disciplinary teams, and transparency. Presented at MeasureCamp XII, London (2018).

  1. Agile Analytics: What can we learn from agile methodologies? @SimoAhava from @ReaktorNow
  2. Agile Analytics: What can we learn from agile methodologies? @SimoAhava from @ReaktorNow
  3. Simo Ahava. Senior Data Advocate, Reaktor. Google Developer Expert, Google Analytics. Blogger and developer: www.simoahava.com. Twitter: @SimoAhava. Google+: +SimoAhava
  4. Manifesto for Agile Software Development (http://agilemanifesto.org/):
     "Individuals and interactions over processes and tools
     Working software over comprehensive documentation
     Customer collaboration over contract negotiation
     Responding to change over following a plan
     That is, while there is value in the items on the right, we value the items on the left more."
  5. Agile works in analytics because it promotes reaction over sticking to plans
  6. Agile works in analytics because it promotes multi-disciplinary teams over silos. [Diagram: overlapping disciplines: business design, software development, analytics, CRO, UX]
  7. Agile works in analytics because it promotes increments over massive releases. [Diagram: backlog items such as a contact form, server-side validation, and auto-complete shipped incrementally across Sprints 1-4]
  8. Agile works in analytics because it promotes feedback loops over tunnel vision. [Diagram: the same sprint timeline, with inline validation added based on feedback from A/B tests, UX and peer reviews, heatmaps, and session recordings]
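
A feedback loop only closes if shipped experiments actually get evaluated. As a minimal sketch of that step (plain Python, standard library only; the variant and all conversion counts are invented for illustration, not taken from the talk), the result of an A/B test such as the inline-validation change could be checked with a two-proportion z-test:

    from math import sqrt, erf

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Two-sided p-value for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0: no difference
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Normal-approximation p-value from the standard normal CDF.
        return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

    # Hypothetical result: control vs. the inline-validation variant.
    p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2380)
    print(f"p = {p:.3f}")  # ~0.021 here; small enough to treat the lift as real

With fewer sessions the same difference would not clear the bar, which is why evaluation belongs in the sprint rhythm rather than being left to chance.
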
  9. "Data is the lifeblood of the organization. It flows through all departments, across job titles, permeating the very fabric of the organization, reinforcing its foundations for growth. It cannot and should not be contained in one vector (a dedicated analyst) alone." Reaktor Blog: 10 Truths About Data (https://goo.gl/r6pdXB)
  10. Idea #1: Collaboration
  11. Idea #1: Collaboration. Daily meetings:
     - A call / stand-up meeting with all team members
     - Discuss key metrics via a dashboard next to the board
     - Discuss any questions about the analytics implementation for the tasks being worked on
     - Avoid "peeking" at ongoing experiments and making radical, unplanned changes to data collection (see the sketch below for why peeking is risky)
  12. Idea #1: Collaboration. Sprint planning:
     - Choose stories for the upcoming sprint
     - Discuss whether it's important to measure these with analytics tools
     - Discuss whether it's necessary to start A/B tests, etc.
     - Choose stories based on potential value
  13. Idea #1: Collaboration. Backlog grooming:
     - Split / evaluate stories waiting in the backlog
     - Discuss what the Definition of Success is for these stories
     - Discuss how to measure the success of these stories
     - Discuss what the potential value of each story is (one way to make this concrete is sketched after this list)
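
The talk doesn't prescribe a scoring model, but one common way to make "potential value" concrete during grooming is a RICE-style heuristic: reach times impact times confidence, divided by effort. A minimal sketch, with hypothetical stories and numbers:

    from dataclasses import dataclass

    @dataclass
    class Story:
        name: str
        reach: int         # users touched per month
        impact: float      # expected effect size (0.25 = minor ... 2 = massive)
        confidence: float  # 0..1, how much we trust the reach/impact estimates
        effort: float      # person-weeks

        @property
        def score(self) -> float:
            return self.reach * self.impact * self.confidence / self.effort

    backlog = [
        Story("Contact form", reach=8000, impact=1.0, confidence=0.8, effort=2.0),
        Story("Auto-complete", reach=5000, impact=0.5, confidence=0.5, effort=1.0),
        Story("Inline validation", reach=8000, impact=0.25, confidence=0.9, effort=0.5),
    ]
    # Print stories in priority order for the grooming discussion.
    for story in sorted(backlog, key=lambda s: s.score, reverse=True):
        print(f"{story.name:<20} {story.score:>8.0f}")

The score is only a conversation starter; the point of grooming is that the whole team debates the inputs, not that the spreadsheet decides.
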
  14. Idea #1: Collaboration. Weekly / demo:
     - Show what's been done so far (and is ready for shipping)
     - Discuss ongoing experiments and data collection
     - Show previous test / data collection results via a dashboard and/or a prepared presentation
     - Discuss the role of data in the current project with the whole organization
  15. Idea #2: Definition of Done
     Definition of Done:
     1. Test coverage extended to cover the feature
     2. Feature documented (only when necessary)
     3. Feature passes code/UX/peer review
  16. Idea #2: Definition of Done
     - The Definition of Done is a (living) list of items that each developed feature must pass for the sprint to be deemed successful
     - Adding "analytics" to the DoD is a great way to keep measurement top-of-mind when developing features
     - It's important to phrase the item as "is discussed" rather than "is done", because not all features can or should be measured or tested
     Definition of Done:
     1. Test coverage extended to cover the feature
     2. Feature documented (only when necessary)
     3. Feature passes code/UX/peer review
     4. Feature measurement / testing is discussed
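
One way to keep the DoD a living list rather than a forgotten wiki page is to make it machine-checkable, for example by verifying a pull-request checklist in CI. The checklist text below mirrors the slide; the PR-body format and the check itself are assumptions for illustration:

    DEFINITION_OF_DONE = [
        "Test coverage extended to cover the feature",
        "Feature documented (only when necessary)",
        "Feature passes code/UX/peer review",
        "Feature measurement / testing is discussed",
    ]

    def unmet_items(pr_body):
        """Return DoD items whose '[x]' checkbox is not ticked in the PR body."""
        body = pr_body.lower()
        return [item for item in DEFINITION_OF_DONE
                if f"[x] {item.lower()}" not in body]

    # Hypothetical pull-request description with one unticked item.
    pr_body = """
    - [x] Test coverage extended to cover the feature
    - [x] Feature documented (only when necessary)
    - [x] Feature passes code/UX/peer review
    - [ ] Feature measurement / testing is discussed
    """
    for item in unmet_items(pr_body):
        print(f"DoD not met: {item}")  # a CI step could fail the build here
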
  17. Idea #3: Multi-disciplinary teams. [Diagram: IT, UX, Data, Biz, and Marketing as separate silos around The Project.] WORST: lack of involvement, lack of participation, lack of collaboration, lack of trust.
  18. Idea #3: Multi-disciplinary teams. [Diagram: the same disciplines, now communicating with The Project.] BETTER: good communication, but still a lack of shared goals and a lack of involvement.
  19. Idea #3: Multi-disciplinary teams. [Diagram: one blended team around The Project.] BEST: transparency, involvement, shared knowledge and skills, constant learning.
  20. Idea #3: Multi-disciplinary teams. "Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure." (Mel) Conway's law
  21. A good consultant should work in a manner that inevitably makes them redundant. (https://goo.gl/aPvVjb)
  22. Idea #4: Hire to educate, not to delegate. BAD: experts are hired as extra pairs of hands and as reporting tools.
  23. Idea #4: Hire to educate, not to delegate. BEST: experts work with the team, teaching, facilitating, and building processes.
  24. A feature is completed when it is done successfully. (https://www.simoahava.com/analytics/definition-of-success/)
  25. Success is subjective.
  26. Success is temporal.
  27. Success is non-binary.
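
Those three properties can be made tangible in how success is recorded. A minimal sketch of a Definition of Success that is non-binary (a graded score rather than pass/fail), temporal (tied to an evaluation window and re-run later), and subjective (stakeholder-chosen weights); every metric, target, and weight here is hypothetical:

    from dataclasses import dataclass

    @dataclass
    class SuccessCriterion:
        metric: str
        target: float  # value that counts as full success
        actual: float  # observed value over the evaluation window
        weight: float  # stakeholder-assigned importance (the subjective part)

    def success_score(criteria):
        """Weighted score in 0..1; each metric is capped at its target so
        overshooting one metric can't hide a miss on another."""
        total_weight = sum(c.weight for c in criteria)
        return sum(min(c.actual / c.target, 1.0) * c.weight
                   for c in criteria) / total_weight

    # Evaluated 30 days after release; re-run later, since success is temporal.
    form_redesign = [
        SuccessCriterion("form completion rate", target=0.30, actual=0.27, weight=3),
        SuccessCriterion("validation error rate drop", target=0.50, actual=0.60, weight=2),
        SuccessCriterion("support tickets avoided / wk", target=20, actual=12, weight=1),
    ]
    print(f"Success: {success_score(form_redesign):.0%}")  # prints "Success: 88%", not yes/no
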
  28. Definition of Done: Are we doing things right? Definition of Success: Are we doing the right things?
  29. Definition of Done: Are we doing things right? Definition of Success: Are we doing the right things?
  30. simo.ahava@reaktor.com | www.simoahava.com | Twitter: @SimoAhava | Google+: +SimoAhava
