Integrated Dev And Qa Team With Scrum


This presentation shares our experience forming an integrated Development/QA team on Perficient projects using Scrum, along with some of our best practices for securing high quality.

Published in: Technology, Design


  1. Integrated Development and QA Team with Scrum
     Ethan Huang, Perficient GDC, China
  2. Challenges We Were Facing
     At the beginning of the project, the team was suffering from:
     - No testers
       - Our resource plan could not support a fully structured team with dedicated QA roles
     - A fixed schedule
       - We had only 90 calendar days in total as a fixed delivery timeline
       - So we did not have much buffer time for 'stabilization', i.e. bug fixing
     - Unclear requirements
       - We had no 'finalized' requirements, even late in the development phase
       - And we could not afford the documentation/communication effort of writing both requirement documents and test case documents
     - Not enough communication with the client
  3. The Way We Go
     The team decided to resolve these problems and secure high quality by:
     - Forming an integrated Dev/QA team
       - The developers agreed that we should secure high quality ourselves
       - Which meant the developers would do the testing themselves
     - Running short Sprints (one week)
       - Short Sprints helped reduce risk
       - And secured high-quality code through frequent inspections
     - Running Test Case Driven Requirements
       - We used test cases to represent and confirm requirements directly
  4. Detailed Actions
     Ramping up in the first couple of Sprints:
     - We designed the team model and made the developers' responsibilities clear
     - We designed our test case template
     - We defined and posted a whiteboard diagram to highlight our process
     In the later Sprints we continuously improved our approach:
     - One member acted as the "QA Goalkeeper"
     - For each User Story, a volunteer became the "Story Owner"
     - We built several quality principles
     - We set up 4 Quality Gates
  5. Ramp-up – Forming the Team
  6. Ramp-up – Defining the Sprint Approach
  7. Practice – Test Case Driven Requirement
     - Developers wrote test cases before coding
     - And used the test cases as the tool to confirm the requirements
       - This made a significant difference in helping the team understand the business
       - It also scaled down the back-and-forth effort of confirming requirements
     One possible format for the test cases: (shown as a table on the original slide)
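The deck's actual test case template was shown as an image, so here is only a hypothetical sketch of the idea: a requirement captured directly as an executable test case. The feature, the function name, and the amounts are all invented for illustration.

```python
def apply_discount(total_cents, is_member):
    """Toy implementation of an invented pricing requirement, in integer cents."""
    if is_member and total_cents >= 100_00:
        return total_cents * 90 // 100  # members get 10% off orders of 100.00+
    return total_cents

# Hypothetical test case TC-001: "Members receive a 10% discount on
# orders of 100.00 or more." Each step doubles as a requirement check.
assert apply_discount(120_00, is_member=True) == 108_00   # step 1: member, 120.00
assert apply_discount(120_00, is_member=False) == 120_00  # step 2: non-member
assert apply_discount(99_00, is_member=True) == 99_00     # step 3: below threshold
```

Written this way, the test case is both the requirement confirmation and the acceptance check the developer runs before coding is considered done.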
  8. Practice – Conducting Tests by Developers
     - Developers write code following the test cases they developed themselves
     - Everyone takes both development and testing tasks
       - Which may mean testing others' code – "buddy testing"
     Two IMPORTANT principles that all team members agreed on:
     - No bugs get checked in to the repository – run enough tests locally first
     - Do a daily verification after the daily build, and resolve today's issues today
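The "run enough tests locally before check-in" principle can be scripted as a simple local gate. This is only an illustrative sketch; the deck does not describe the team's actual tooling, and the commands shown are stand-ins.

```python
import subprocess
import sys

def local_gate(test_command):
    """Run the local test suite; return True only if it passes.

    A hypothetical pre-check-in gate: check in only when this returns True.
    """
    result = subprocess.run(test_command)
    return result.returncode == 0

# In practice test_command would be the project's real runner, e.g.
# ("mvn", "test") or ("python", "-m", "pytest"); here we simulate a
# passing and a failing suite with trivial commands.
assert local_gate((sys.executable, "-c", "pass")) is True
assert local_gate((sys.executable, "-c", "import sys; sys.exit(1)")) is False
```

Hooking such a gate into a pre-commit step makes the "no bugs get checked in" principle mechanical rather than a matter of discipline.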
  9. Practice – The 4 Q-gates Before We Deliver
     - Q-gate #1 – Local Verification + Continuous Code Review (CCR) before code check-in
       - The Story Owner runs the test step by step in the local dev environment
       - The identified reviewer reviews the local code
       - Exit criteria: all issues resolved
     - Q-gate #2 – Daily Build and Daily Verification on the dev server
       - At least one daily build, sometimes multiple builds
       - Run unit tests on the servers; track the Sonar report for variations
       - The QA Goalkeeper and Story Owners test the new features after the build
       - Exit criteria: all issues found that day resolved
     - Q-gate #3 – Team Verification on the test server before Sprint closure
       - The whole team gathers in one room and verifies all the integrated features one by one
       - Exit criteria: all 'Major'-and-above issues resolved; the rest logged in Jira
     - Q-gate #4 – Formal Code Review (FCR) every one or two Sprints
       - Exit criteria: all findings logged in Jira have been fixed
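The gates differ mainly in their exit criteria, which can be sketched as predicates over the issues found at each gate. The severity scale and the issue records below are invented for illustration; only the gate rules follow the slide.

```python
# Invented severity ordering; the deck only distinguishes 'Major' and above.
SEVERITY = {"minor": 1, "major": 2, "critical": 3}

def all_resolved(issues):
    """Exit criteria for Q-gates #1, #2 and #4: every finding is resolved."""
    return all(issue["resolved"] for issue in issues)

def major_and_above_resolved(issues):
    """Exit criteria for Q-gate #3: 'Major'-and-above issues are resolved;
    the remaining minor ones are logged into Jira instead of blocking."""
    return all(issue["resolved"]
               for issue in issues
               if SEVERITY[issue["severity"]] >= SEVERITY["major"])

issues = [
    {"severity": "critical", "resolved": True},
    {"severity": "major", "resolved": True},
    {"severity": "minor", "resolved": False},  # would be logged into Jira
]
assert not all_resolved(issues)          # an unresolved minor blocks gates 1/2/4
assert major_and_above_resolved(issues)  # but gate 3 can still be passed
```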
  10. Results – 12 Weeks of Development vs. 1 Week of Stabilization
      - We ran nearly 13 one-week Sprints (89 calendar days)
      - 2 were RED Sprints, in which we failed to deliver high enough quality
      - But we finished integrating all the planned features in Sprint #11
      - In Sprint #12 we spent one whole week on regression testing and bug fixing
      - In Sprint #13 we finished the clean-up
      (Slide timeline: Development + Testing → Final Testing → Cleaning Up)
  11. Quality Statistics – UT Coverage / Test Success (from the Sonar report)
  12. Quality Statistics – Code Summarization (from the Sonar report)
  13. Quality Statistics – Code Complexity (from the Sonar report)
  14. Quality Statistics – Violations (from the Sonar report)
  15. Quality Statistics – Time Machine (from the Sonar report)
  16. Best Practices
      - Local Verification / Buddy Test before code check-in
        - It is the first step of Continuous Code Review
        - The Story Owner does the verification together with the developer
        - The developer walks the Story Owner through the test case step by step
        - All findings should be fixed, followed by another round of verification
      - This activity happened daily and caught most functional defects, and even potential defects, as early as possible
      - One verification takes about 30 minutes for the author and the reviewer combined – much less effort than the regular testing activities that happen after integration
  17. Best Practices
      - Daily Verification of the new build on the dev server
        - Our daily build happens at 12:00 PM every day
        - The QA Goalkeeper manually verifies all the newly integrated features (sometimes the author prefers to walk the whole team through the new feature)
        - It takes about 30–60 minutes; the findings are logged in a spreadsheet and posted on the whiteboard
        - The issue owners fix them before the end of the day
        - Another manual build is done if necessary to verify the fixed bugs
        - If some issues cannot be resolved that day, they are treated as the most important tasks: we do not start new implementation before fixing them
      - This activity happened daily and caught most integration issues as early as possible
      - More importantly, team members committed to resolving the issues before leaving, so every build had solid quality and no debt
  18. Best Practices
      - Team Verification before the Sprint Review
        - As the exit criteria of the entire Sprint, the team sits in one room and performs verification / manual functional testing of all the features developed within the Sprint
        - The authors run the tests following the test cases
        - All findings are logged in Jira as the highest-priority tasks for the next Sprint
        - Sprint exit criteria: no 'Major'-and-above issues found in the session – the key factor in deciding whether the Sprint status is Green or Red
      - This activity gives the whole team a chance to inspect the full set of Sprint deliverables
      - Since most defects have already been inspected and fixed during day-to-day Sprint work, Team Verification helps more with finding the highest-priority/severity issues and getting a quality overview for the Sprint
  19. Best Practices
      - Simplify the way we log/manage defects
        - The tools we used to log/manage defects: yellow stickies, spreadsheet/txt files plus the whiteboard, and Jira
        - Yellow stickies: to log Local Verification findings
        - Spreadsheet/txt files: to log Daily Verification findings
        - Whiteboard: where we posted all the yellow stickies and spreadsheets
        - Jira: to log Team Verification and regression test findings
  20. Further Improvements
      - FCR should happen more frequently
      - Test automation – it would reduce the regression test effort significantly
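The proposed regression automation could start as simply as replaying the recorded test cases and reporting failures. Everything below (the case list, the toy operations under test) is an invented stand-in, not the team's actual suite.

```python
def run_case(case):
    """Execute one recorded test case; return True when the actual
    result matches the expected one."""
    return case["run"]() == case["expected"]

# Hypothetical recorded cases; in practice "run" would drive the real system.
cases = [
    {"name": "TC-001 uppercase", "run": lambda: "qa".upper(), "expected": "QA"},
    {"name": "TC-002 sum", "run": lambda: sum([1, 2, 3]), "expected": 6},
]

failures = [case["name"] for case in cases if not run_case(case)]
assert failures == []  # a clean automated regression run
```

Automating the cases that already exist from Test Case Driven Requirements would directly shrink the one-week manual regression pass described in the Results slide.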
  21. Several Keywords to Highlight
      - Integrated Dev/QA team
      - Story Owner
      - QA Goalkeeper
      - Local Verification
      - Daily Verification
      - Team Verification
  22. Thanks!
      Thank you for watching!