
Being a Tester in Scrum


In this webinar Carsten will explore the role of the tester in a Scrum team. He will examine where the tester plays an important role in Scrum and how you can contribute to a team's performance.

Published in: Software


  1. Being a tester in SCRUM EuroSTAR Huddle webinar by Carsten Feilberg, Managing Director and consultant, House of Test ApS, Denmark web: Twitter: @carsten_f
  2. About me and House of Test Managing Director, Consultant, Speaker, Coach / Mentor, M.Sc. Contracting Consultants: Sweden, Denmark, South Africa, United Kingdom, Switzerland Communities: Context-Driven, Agile Father of two Cooks a lot @carsten_f PSL graduate All things testing CSM
  3. Agenda • The tester role • Mindset • The actual testing • Stress inducers • A bit about dashboards • When document-driven testing is demanded • Questions
  4. The tester role and the Developer title Product Owner Scrum Master Developer
  5. The tester role and the Developer title Testing Programming Architecting Operating Designing Problem solving
  6. The Mindset of Agile Not agile: Plan rigorously from start to deadline. Estimates may be revised but the plan is locked. Delays are dealt with by adding time. Agile: Stay clear on what's important at the deadline. As knowledge of capacity grows, we put it to use. Delays are dealt with by navigating after each sprint.
  7. The Mindset of Agile • Feature-specific mindset • Looks at features only – adding more is better • Sprints are frustrating as they stop you half-way through a feature • Relies heavily on estimates and knowing exactly when to stop • Essentially waterfall/silo thinking • Agile mindset • Looks at the whole product – adds only what is valuable and safe • Sprints provide rhythm, calmness and cleanliness • Reacts to feedback
  8. The actual testing • A lot of it is the usual technical investigation of the product, with the purpose of evaluating it against a number of quality characteristics and project goals. • The constraints are: • The product and features evolve organically, not pre-planned and spec'd • Risks emerge along the way • Testing needs to be prioritized • Testing happens continuously • Everyone participates in testing
  9. The testing workload in a typical sprint Preparation Tune-ups Pairing Test stories Test Product Automate Test Product Finish automation
  10. Stress Inducers 1. The team is too ambitious 2. Bad luck is a real thing 3. No obvious way to measure quality / no-one really cares 4. The team has split into factions
  11. 1. Taking on ambitions • All teams get too ambitious at certain points • Learn through sprint review and retrospective • Velocity is a team metric for internal use only • Focus on the value of the product helps. Focus on story points doesn't.
  12. 2. Tackling problems • All teams wrestle with problems – that's what we do • Sprint reviews are for tech problems • Retrospectives are for what keeps us from solving tech problems • Build the team – learn everything you can about each other • Voice your observations as dispassionately as you can • Take the lead in team events: daily scrum, retrospectives
  13. 3. Quality as a secondary goal • Use the backlog for tests that are really hard to do -> visibility • Argue for the risks and needs as you see them -> advocacy • The Product Owner ultimately decides
  14. 3. Quality as a secondary goal • “We can always fix the bugs later” • Dealing with bullies • Why are they opposed to agile and prefer waterfalling? • Why don't they care about feedback? • Safety first
  15. 4. Team is splitting up • Code Contributors vs. Product Care-takers
  16. A few notes on boards: 2 DO | WIP | TEST | DONE
  17. A few notes on boards: 2 DO | WIP | TEST | DONE, annotated “Def of Done”??? and “I'm idle”
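The “Def of Done” annotation on the TEST column can be made concrete: a card only moves to DONE when the whole Definition-of-Done checklist is ticked. A minimal sketch, with column names from the slide and an invented checklist:

```python
# Minimal sketch of the four-column board; the DoD items are illustrative,
# not from the slides. A card may only move to DONE when every
# Definition-of-Done item is checked.

BOARD_COLUMNS = ["2 DO", "WIP", "TEST", "DONE"]

DEFINITION_OF_DONE = [
    "code reviewed",
    "checks pass",
    "tested on the product, not just the story",
]

def can_move_to_done(card):
    """A card is done only when its DoD checklist is fully ticked."""
    return all(card["dod"].get(item, False) for item in DEFINITION_OF_DONE)

card = {
    "title": "Login feature",
    "column": "TEST",
    "dod": {"code reviewed": True, "checks pass": True,
            "tested on the product, not just the story": False},
}

print(can_move_to_done(card))  # False: one DoD item is still open
```

The point of the gate is the tester's leverage: “I'm idle” disappears when the DoD makes testing part of every card rather than a separate column someone else owns.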
  18. Low-Tech Dashboard Read more about the low-tech dashboard on
  19. Document-driven testing in SCRUM? • Organisation is not agile + business as usual = documents • Problems: • Huge test plans never really helped us in any case • We haven't got time for large docs • Not much to work from – the product is emerging and changing • So, what do we do?
  20. Test plan(ning) • Plan per sprint only • Mindmaps of what we can foresee we need to cover (and why) • Models of the different coverages (different quality characteristics) • Share with the team – get input – get modifications
  21. Test report(ing) 1. What was tested? 2. What actually works, and how well does it work? 3. What can't we do with the product (compared to expectations)? 4. What should have been tested (but wasn't)? 5. Will we meet the deadline? Are we in control?
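The five reporting questions can be kept as a reusable skeleton so that every sprint report answers the same things. A minimal sketch; the field names are my own, not from the talk:

```python
from dataclasses import dataclass, field

@dataclass
class SprintTestReport:
    """Skeleton mirroring the five reporting questions on the slide."""
    tested: list = field(default_factory=list)       # 1. what was tested
    works: list = field(default_factory=list)        # 2. what works, and how well
    limitations: list = field(default_factory=list)  # 3. what the product can't do
    not_tested: list = field(default_factory=list)   # 4. what should have been tested
    on_track: bool = True                            # 5. deadline / control status

    def render(self) -> str:
        return "\n".join([
            f"Tested: {', '.join(self.tested) or '-'}",
            f"Works: {', '.join(self.works) or '-'}",
            f"Can't do: {', '.join(self.limitations) or '-'}",
            f"Not tested: {', '.join(self.not_tested) or '-'}",
            f"On track: {'yes' if self.on_track else 'no'}",
        ])

report = SprintTestReport(tested=["checkout"],
                          not_tested=["load under peak traffic"])
print(report.render())
```

Question 4 is the one that usually goes unreported; making it a mandatory field keeps untested risk visible to the Product Owner.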
  22. Test strategy • If this is required as a deliverable through a contract, put it in the backlog and work on it as a team. • Shorter is typically better • DO NOT GIVE IN TO A SPRINT-ZERO for planning and docs! • Refine it sprint by sprint – it doesn't take long to do and you get wiser all the time
  23. And the test cases? • First: assess whether you really need them • Do you want to re-perform them? • Do you want to capture how to test XYZ? • Do you need to deliver test material? • Do you need to automate? • The best test cases are written after you have tested • By then you obviously know the best way to test it
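“Written after you have tested” can mean capturing what exploration taught you directly as an automated check. A hedged sketch: `apply_discount` is a hypothetical stand-in for whatever product code was explored, and the boundary values are the kind of risky inputs exploration surfaces:

```python
# Hypothetical example: after exploring a discount feature, capture the
# behaviour you learned as an automated check. apply_discount stands in
# for the product code under test.

def apply_discount(price, percent):
    """Return price reduced by percent, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Test case written *after* testing: by then we know which inputs matter.
def test_discount_boundaries():
    assert apply_discount(100.0, 0) == 100.0    # no discount
    assert apply_discount(100.0, 100) == 0.0    # full discount
    assert apply_discount(200.0, 25) == 150.0   # an ordinary mid-range case
    try:
        apply_discount(10.0, 150)               # out-of-range input
        assert False, "expected ValueError"
    except ValueError:
        pass

test_discount_boundaries()
```

The check is small precisely because it is written last: exploration already narrowed the infinite input space down to the few cases worth re-performing every sprint.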
  24. That’s all folks – questions? web: Twitter: @carsten_f