Integrated Dev and QA Team with Scrum

This presentation shares our experience of forming an integrated Development/QA team on Perficient projects using Scrum, along with some of our best practices for securing high quality.

Published in: Technology, Design

Transcript

  • 1. Integrated Development and QA Team with Scrum – Ethan Huang, Perficient GDC, China
  • 2. Challenges We Were Facing
    • At the beginning of the project, the team was suffering from:
    • No testers
      • Our resource plan could not support a fully structured team with dedicated QA roles
    • A fixed schedule
      • We had only 90 calendar days in total as a fixed delivery timeline
      • So we had little buffer time for a ‘stabilization’ phase of bug fixing
    • Unclear requirements
      • We had no ‘finalized’ requirements even in the late development phase
      • And we could not afford the documentation/communication effort of writing both requirement and test case documents
    • Not enough communication with the client
  • 3. The Way We Go
    • The team decided to resolve these problems and secure high quality by:
    • Forming an Integrated Dev/QA team
      • The developers agreed that we should secure high quality ourselves
      • Which meant the developers would do the testing themselves
    • Running short Sprints (one week)
      • Short Sprints helped reduce risk
      • And secured high-quality code through frequent inspections
    • Running Test Case Driven Requirements
      • We used test cases to represent/confirm requirements directly
  • 4. Detailed Actions
    • Ramping up in the first couple of Sprints:
      • Designed the team model and clarified the developers’ responsibilities
      • Designed our test case template
      • Defined and posted a whiteboard diagram to highlight our process
    • In the later Sprints we continuously improved our approach:
      • Had one member act as the “QA Goalkeeper”
      • For each User Story, had a volunteer act as the “Story Owner”
      • Established several quality principles
      • Set up 4 Quality Gates
  • 5. Ramp up – Forming the Team
  • 6. Ramp up – Defining the Sprint Approach
  • 7. Practice – Test Case Driven Requirement
    • Developers wrote test cases before coding
    • And used the test cases as the tool to confirm the requirements
      • This made a significant difference in helping the team understand the business
      • It also reduced the back-and-forth effort of confirming requirements
    One possible format for the test cases is sketched below:
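The original slide showed the template as a graphic that did not survive the transcript. As a hypothetical illustration only (the feature, class names, and business rule below are invented, and the team's actual template was a written test case rather than executable code), a test case drafted before the implementation might look like this in JUnit:

```java
import org.junit.Test;
import static org.junit.Assert.*;

// Hypothetical example only: the feature and rule are invented. The point is
// the shape of a test case written before coding: precondition, steps, and
// expected result, mirroring the columns of a manual test case template.
public class ShippingFeeTest {

    // Minimal stand-in for the production code so the sketch compiles;
    // in the real flow this class would be written after the test case.
    static class ShippingFeeCalculator {
        // Invented rule: orders of 100.00 or more ship free, otherwise a flat 8.00 fee.
        double feeFor(double orderTotal) {
            return orderTotal >= 100.00 ? 0.0 : 8.0;
        }
    }

    @Test
    public void ordersOverThresholdShipFree() {
        // Precondition: a calculator configured with the default rules
        ShippingFeeCalculator calculator = new ShippingFeeCalculator();

        // Step: ask for the fee on a 120.00 order
        double fee = calculator.feeFor(120.00);

        // Expected result: no shipping fee is charged
        assertEquals(0.0, fee, 0.001);
    }

    @Test
    public void smallOrdersPayFlatFee() {
        // Precondition / Step / Expected result for the below-threshold case
        ShippingFeeCalculator calculator = new ShippingFeeCalculator();
        assertEquals(8.0, calculator.feeFor(40.00), 0.001);
    }
}
```

Written this way, the case doubles as the requirement: its steps and expected results are what the team confirms with the client before any production code exists.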
  • 8. Practice – Conducting tests by developers
    • Developers write code following the test cases they developed themselves
    • Everyone takes on both development and testing tasks
      • This can mean testing someone else’s code – the buddy test
    Two IMPORTANT principles that all team members agreed on:
    • No bugs are to be checked into the repository – run enough tests locally first (a runner sketch follows this list)
    • Do daily verification after the daily build, and resolve today’s issues today
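A minimal sketch of the "run enough tests locally" principle, reusing the hypothetical ShippingFeeTest class from the earlier example; in practice the team would simply run the project's full build/test target before check-in:

```java
import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

// Hypothetical local "check-in gate": run the suites you touched and refuse to
// check in while anything is red. The test class listed here is the invented
// example from above; a real gate would run the whole project test target.
public class PreCheckInGate {

    public static void main(String[] args) {
        Result result = JUnitCore.runClasses(
                ShippingFeeTest.class   // add the suites relevant to your change
        );

        for (Failure failure : result.getFailures()) {
            System.err.println("FAILED: " + failure.toString());
        }

        if (!result.wasSuccessful()) {
            System.err.println("Tests failing locally - do not check in yet.");
            System.exit(1);   // non-zero exit so scripts/hooks can block the check-in
        }
        System.out.println("All local tests green - OK to check in.");
    }
}
```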
  • 9. Practice – The 4 Q-gates before we deliver
    • Q-gate #1 – Local Verification + Continuous Code Review (CCR) before code check-in
      • The Story Owner runs the test step by step in the local dev environment
      • The designated Reviewer reviews the local code
      • Exit Criteria: Resolve all the issues
    • Q-gate #2 – Daily Build and Daily Verification on the Dev Server
      • At least one daily build, sometimes multiple builds
      • Run unit tests on the servers and track the Sonar report for variations (a fetch sketch follows this list)
      • The QA Goalkeeper and Story Owners test the new features after the build
      • Exit Criteria: Resolve all the issues found on that day
    • Q-gate #3 – Team Verification on the Test Server before Sprint closure
      • The whole team gathers in one room and verifies all the integrated features one by one
      • Exit Criteria: Resolve all issues of ‘Major’ severity or above and log the rest in Jira
    • Q-gate #4 – Formal Code Review (FCR) every one or two Sprints
      • Exit Criteria: All findings logged in Jira have been fixed
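As a rough sketch of tracking the Sonar report for variations, assuming a current SonarQube server that exposes the measures web API (the project in this talk used an older Sonar version whose API differed; the server URL and project key below are invented):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical daily check against the Sonar report: fetch coverage and test
// metrics for the project and print them so the QA Goalkeeper can spot drops.
public class SonarDailyCheck {

    public static void main(String[] args) throws Exception {
        String server = "http://sonar.example.local:9000";          // invented
        String projectKey = "com.example:my-project";                // invented
        URL url = new URL(server + "/api/measures/component?component=" + projectKey
                + "&metricKeys=coverage,tests,test_success_density");

        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream()))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line);
            }
            // Raw JSON is enough for a quick comparison against yesterday's numbers;
            // a fuller version would parse it and flag any drop in coverage.
            System.out.println(body);
        }
    }
}
```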
  • 10. Results - 12 weeks development vs. 1 week stabilization
    • We ran nearly 13 one-week Sprints (89 calendar days)
    • 2 RED Sprints, where we failed to deliver high enough quality
    • But we finished integrating all the planned features in Sprint #11
    • In Sprint #12 we spent a whole week on regression testing and bug fixing
    • In Sprint #13 we finished the clean-up
    (Timeline phases from the slide graphic: Development + Testing, Final Testing, Cleaning Up)
  • 11. Quality Statistics – UT Coverage/Test Success (from Sonar report)
  • 12. Quality Statistics – Code Summarization (from Sonar report)
  • 13. Quality Statistics – Code Complexity (from Sonar report)
  • 14. Quality Statistics – Violations (from Sonar report)
  • 15. Quality Statistics – Time Machine (from Sonar report)
  • 16. Best Practices
    • Local Verification / Buddy Test before code check-in
      • It’s the first step of Continuous Code Review
      • The Story Owner does the verification together with the developer
      • The developer walks the Story Owner through the test case step by step
      • All findings should be fixed, and then the verification is done again
    • This activity happened daily and caught most functional defects, and even potential defects, as early as possible
    • One verification takes about 30 minutes from both the author and the reviewer – much less effort than the regular testing activities that happen after integration
  • 17. Best Practices
    • Daily Verification on the Dev Server for the new build
      • Our daily build happens at 12:00 PM every day
      • The QA Goalkeeper manually verifies all the newly integrated features (sometimes the author prefers to walk the whole team through the new feature)
      • It takes about 30–60 minutes, and the findings are logged in a spreadsheet and posted on the whiteboard
      • The issue owners fix them before the end of the day
      • Another manual build is done if necessary to verify the fixed bugs
      • If some issues cannot be resolved that day, they are treated as the most important tasks: we do not start new implementation before we fix them
    • This activity happened daily and caught most integration issues as early as possible
    • More importantly, team members committed to resolving the issues before leaving, so that every build has solid quality and no debt
  • 18. Best Practices
    • Team Verification before the Sprint Review
      • As the exit criterion of the entire Sprint, the team sits in one room and does verification/manual functional testing of all the features developed within the Sprint
      • Authors run the tests following the test cases
      • All findings are logged in Jira as the highest-priority tasks for the next Sprint
      • Sprint exit criterion: no issues of Major severity or above found in the session – the key factor in deciding whether the Sprint status is Green or Red
    • This activity gives the whole team a chance to inspect the full set of Sprint deliverables.
    • Since most defects have already been found and fixed during day-to-day Sprint work, Team Verification is more about finding the highest priority/severity issues and getting a quality overview for the Sprint
  • 19. Best Practices
    • Simplify the way we log/manage defects
      • Tools we used to log/manage defects: yellow stickies, spreadsheet/txt files plus a whiteboard, and Jira
      • Yellow stickies: to log Local Verification findings
      • Spreadsheet/txt files: to log Daily Verification findings
      • Whiteboard: where we post all the yellow stickies and spreadsheets
      • Jira: to log Team Verification and Regression Test findings (a sketch of logging an issue through Jira’s REST API follows this list)
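A rough sketch of what logging a Team Verification finding in Jira could look like through its REST API, assuming a recent Jira server with the REST API enabled; the server URL, project key, summary text, and credentials are invented, and at the time the team may simply have entered issues through the Jira UI:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical sketch: create a Bug in Jira for a Team Verification finding.
public class LogDefectInJira {

    public static void main(String[] args) throws Exception {
        // Invented issue payload; project key and field values are placeholders.
        String json = "{ \"fields\": {"
                + " \"project\": { \"key\": \"DEMO\" },"
                + " \"summary\": \"Team Verification: search result page shows stale totals\","
                + " \"description\": \"Found during the Sprint Team Verification session.\","
                + " \"issuetype\": { \"name\": \"Bug\" }"
                + " } }";

        URL url = new URL("http://jira.example.local/rest/api/2/issue");   // invented host
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("POST");
        connection.setDoOutput(true);
        connection.setRequestProperty("Content-Type", "application/json");
        String auth = Base64.getEncoder()
                .encodeToString("user:password".getBytes(StandardCharsets.UTF_8));
        connection.setRequestProperty("Authorization", "Basic " + auth);

        try (OutputStream out = connection.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
        // HTTP 201 Created indicates the issue was logged
        System.out.println("Jira responded with HTTP " + connection.getResponseCode());
    }
}
```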
  • 20. Further Improvements
    • FCR should happen more frequently
    • Test automation – to reduce the regression test effort significantly
  • 21. Several keywords to be highlighted
    • Integrated Dev/QA team
    • Story Owner
    • QA Goalkeeper
    • Local Verification
    • Daily Verification
    • Team Verification
  • 22.
    • Thanks!
    • Thank you for watching! (谢谢观赏!)