
Best Effort Agile

Scrum with Outsourced and Distributed Teams


  1. Best Effort Agile: Scrum with Outsourced and Distributed Teams. Mark Sawers, Alexandra Ramin. June 2017
  3. Scrum Review
     • Theory: transparency, inspection, adaptation
     • Values: commitment, courage, focus, openness and respect
     • Artifacts: product backlog, increments, definition of done
     • Ceremonies: sprint, planning, daily scrum, review, retro
     • Roles: product owner, product team, scrum master
  4. What works?
     • Most of the role aspects:
       • Single P.O. who owns backlog, priority and releases
       • P.O. clearly expresses PBIs/test criteria, and is the final accepter
       • Product team owns estimates and sprint commitment
       • Scrum Master is servant-leader, coach and process enforcer
     • And in particular:
       • Small team
       • Small batch sizes
       • Iterative/incremental everything
       • Definition of Done
  5. What doesn’t work?
     • Co-located team
     • Self-organizing team
     • Flat, cross-functional team
     • Neutral Scrum Master
  6. Agenda
     • About Us
     • Context
     • Lessons Learned
       • People
       • Process
       • Product
  7. About Us
     • Mark Sawers
       • 20+ years in software engineering
       • Dev Mgr @ Starwood Hotels (Marriott), CTO @ WebomateS
       • Certified ScrumMaster, practicing Scrum ~2 yrs
       • mark at sawers dot com
     • Alexandra Ramin
       • 20 years in marketing operations and project management
       • Marketing Operations Mgr @ Starwood Hotels (Marriott)
       • Certified Scrum Product Owner, practicing Scrum ~2 yrs
       • alexandra dot ramin at starwoodhotels dot com
  8. Context
     • Product used daily by all Starwood guests, on-property associates and marketing staff representing 1K+ properties
     • ~20-person product development team, split into 3 teams
     • Adopted Scrum ~2 years ago
     • Outsource partner staffs 100% of talent: 80% IST, 2 US EST locations
     • Results: doubled release frequency, halved defect density, integrated releases
     • Challenges: predictability, staff turnover (not new)
  9. Lessons Learned: People
  10. Small Teams, Big Wins
      • Small batch sizes, narrower focus
      • Bridge teams with epic-level grooming, pre-planning and scrum of scrums
      • Have one ScrumMaster lead coordinated activities, e.g. the tax cycle
  11. Right People, Right Attitude
      • Engaged business owner
      • Developers/testers who work well without detailed requirements
      • For each location, a senior developer and tester to lead juniors
      • Engaged product technical manager
      • Disciplined, principle-focused and product-knowledgeable Scrum Master
  12. Lessons Learned: Process
  13. Story != Task
      • The project-oriented planning habit is very hard to kick
      • Decomposition preference, in order:
        1. By user-visible function
        2. By architectural component
        3. By activity
  14. How to Miss Commitments
      • If there isn’t enough information on a user story, don’t add details or acceptance criteria during grooming
      • Develop first, then ask questions
      • Assume an existing feature was broken by some story and raise a defect
      • “Good enough” is never OK; make it perfect
      • Change stories mid-sprint
      • Add stories mid-sprint without removing comparable ones that have not been started
      • Finish the story regardless of a change in business context
      • Keep to the sprint commitment, regardless of business or technical impediments or changes
      • If it’s not working, keep doing it
      • Run experiments but ignore feedback
  15. Iterate and Increment Each Story
      • Grooming: medium-sized stories, reviewed by the team, with good acceptance criteria
      • Acceptance criteria early; test plan reviews early (day 1-3) and with the whole team
      • Iterative business previews for early feedback
      • Spikes for uncertainty
      • Spikes to surface dependencies, e.g. on a shared services team
  16. Over-plan Your Sprint
      • Prevent team overloading by mapping the story points to a schedule
      • Dates for each story: dev complete, test plan complete, test exec start, UAT start
      • Aim for UAT start by day 9 at the latest
      • Stagger test execution and UAT starts
      • Lightweight tasks: dev, test plan, test exec, UAT start, each with a day #
      • If one team is over capacity and another is underutilized, split the work, but assign a lead
      • Limit changes to sprint scope mid-sprint; if you need to add, reprioritize and move items out, don’t just add work
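The point-to-schedule mapping on this slide can be sketched in a few lines. The story names, the one-point-per-dev-day pacing, and the exact milestone offsets below are illustrative assumptions, not values from the talk; only the 10-day sprint, the day-9 UAT deadline, and the day 1-3 test plan reviews come from the deck.

```python
# Sketch: map each story's estimate to milestone days inside a 10-day sprint,
# so overloading is visible before the sprint starts. Pacing (1 point/dev-day)
# and the milestone offsets are assumptions for illustration.

UAT_DEADLINE = 9  # slide rule: aim for UAT start by day 9 at the latest

def schedule(stories, points_per_day=1.0):
    """Assign dev-complete / test-plan / test-exec / UAT-start days per story."""
    day = 0.0
    plan = []
    for name, points in stories:
        dev_done = day + points / points_per_day
        milestones = {
            "story": name,
            "dev_complete": round(dev_done),
            "test_plan_complete": min(round(dev_done), 3),  # reviews by day 1-3
            "test_exec_start": round(dev_done) + 1,
            "uat_start": round(dev_done) + 2,
        }
        milestones["overloaded"] = milestones["uat_start"] > UAT_DEADLINE
        plan.append(milestones)
        day = dev_done
    return plan

# Hypothetical sprint backlog: (story, points)
for row in schedule([("Login audit", 3), ("Tax report", 4), ("Export CSV", 3)]):
    print(row)
```

Run before sprint start: any story flagged `overloaded` is a candidate to move out, per the slide's "reprioritize and move items, don't just add work."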
  17. Reality-check the Release
      • Go/no-go meeting on day 8 of the last implementation sprint
      • Only promote stories that are QA complete and can reasonably be UAT complete by day 10
      • Be realistic about what can make it into the release, so that code doesn’t have to be backed out
      • Stick to your decisions after the go/no-go; don’t squeeze in more stories when they are completed
  18. Keep Retros Fresh
      • Change up the format/tools periodically
      • Actively engage team members who are not participating
  19. Over-communicate Process
      • Operating agreements: release patterns, tooling, entry/exit criteria, PBI types, states, usage
      • Docs, presentations, KTs
      • Reinforce process during daily standups (e.g. is the capacity correct, are task hours estimated and spent updated, is the task/story status correct, was the definition of done met?)
      • Publish schedules for multi-day efforts
  20. Schedule Carefully
      • Find the best available slots for all time zones, and revisit twice a year at the daylight savings switches
      • Line up meetings so as many people as possible can attend (P.O., tech mgr, tech leads, BA, etc.)
      • Prioritize if there are conflicts
      • For each ceremony type, keep the start times the same
  21. Use Multiple Information Radiators
      • Agile planning tool
      • Daily email with per-story, sprint and release status
      • Objective status measures/colors: Green-Yellow-Red
      • Daily email during the tax cycle
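The "objective status measures" idea above can be sketched as a tiny function for the daily radiator email: color comes from measurable burn data, not gut feel. The linear ideal-burn model and the 20% Yellow band are assumptions for illustration; the deck names only the Green-Yellow-Red scale.

```python
# Sketch: compute an objective Green/Yellow/Red status for the daily radiator
# email. Thresholds (linear ideal burn, 20% tolerance) are illustrative
# assumptions, not values from the talk.

def sprint_status(points_done, points_committed, days_elapsed, sprint_days=10):
    """Compare actual burn to the ideal linear burn and color accordingly."""
    expected = points_committed * days_elapsed / sprint_days
    if points_done >= expected:
        return "Green"
    if points_done >= 0.8 * expected:   # within 20% of plan
        return "Yellow"
    return "Red"

print(sprint_status(5, 10, 5))   # on the ideal line -> Green
print(sprint_status(3, 10, 5))   # well behind plan  -> Red
```

Because the rule is mechanical, the same email generated in IST and EST shows the same color, which is the point of an objective measure.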
  22. Lessons Learned: Product
  23. Automate to Reduce Cycle Times
      • No one should lift a finger to get code from their IDE to a server
      • Automate the hell out of your regression suite to reduce the tax cycle
      • You can’t test everything:
        • Use business input to test the most important / most used features, preferably using product usage data, not guesses
        • Periodically review these, since product usage typically changes over time
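The "usage data, not guesses" point can be sketched as a small prioritizer: rank features by real usage counts and cover the top of the list first. The feature names and view counts below are made up for illustration; substitute your analytics export.

```python
# Sketch: prioritize regression coverage by product usage data instead of
# guesses. Feature names and page-view counts are hypothetical.

from collections import Counter

# Hypothetical page-view counts pulled from an analytics export
usage = Counter({"booking": 120_000, "check_in": 45_000,
                 "reporting": 9_000, "admin_export": 400})

def regression_priority(usage, coverage_budget=3):
    """Return the most-used features the automated suite must cover first."""
    return [feature for feature, _ in usage.most_common(coverage_budget)]

print(regression_priority(usage))   # ['booking', 'check_in', 'reporting']
```

Re-running this against fresh usage data each quarter implements the slide's "periodically review these" advice with no extra process.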
  24. Design In Product Agility
      • Flexibility: feature toggles for staggered multi-subsystem releases
      • Configurability: DB-stored app properties
      • Extensibility: data-driven business rules
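The three patterns above can be shown in miniature. The table name, flag name, and rule key below are assumptions for illustration; the deck names only the patterns (feature toggles, DB-stored properties, data-driven rules), not an implementation.

```python
# Sketch: feature toggle backed by DB-stored app properties, plus a
# data-driven business rule. Table/flag/rule names are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_properties (name TEXT PRIMARY KEY, value TEXT)")
conn.execute("INSERT INTO app_properties VALUES ('new_checkout_flow', 'off')")

def toggle_enabled(name):
    """Flip the row to stage a multi-subsystem release without redeploying."""
    row = conn.execute(
        "SELECT value FROM app_properties WHERE name = ?", (name,)).fetchone()
    return row is not None and row[0] == "on"

# Data-driven business rule: the threshold lives in data, not in code
rules = {"min_nights_for_discount": 3}

def discount_applies(nights):
    return nights >= rules["min_nights_for_discount"]

print(toggle_enabled("new_checkout_flow"))  # False until ops flips the row
print(discount_applies(4))                  # True
```

The design choice in all three cases is the same: push the thing most likely to change (release timing, environment settings, business thresholds) out of code and into data.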
  25. Thank You
      mark at sawers dot com
      alexandra dot ramin at starwoodhotels dot com