Software Defect Prevention via Continuous Inspection


Research and guidance for reducing software development risk and cost while improving speed, quality and maintainability by applying review at all levels.



  1. Avoid the Zone of Chaos: Economics of Quality and Productivity via Code Review. Reducing software development risk and cost while improving speed, quality and maintainability by applying review at all levels. Presented by: Joshua Gough, Atlanta ALT.NET Meetup, http://www.meetup.com/AtlAltDotNet, 6/19/2012
  2. Topic Outline
     ● Avoiding the Ultimate Risk
     ● Software Development Processes
     ● Risks associated with poor code review and lack of defect prevention
     ● Automated .NET tools to support "continuous inspection", code review, and defect prevention
     ● Demo of static source-code analysis with Visual Studio and NDepend
  3. Avoiding the Ultimate Risk
     ● How do you validate that you're building the product your customers or users want and need?
     ● What untested assumptions and risks can lurk in requirements and design docs?
     ● What kinds of reviews can happen before or in parallel with coding to test assumptions and mitigate risks?
  4. Danger! Don't Go There! (Say What?)
  5. Royce Strawman Waterfall Model
  6. Royce's Observations
  7. Final Royce Model (Shame on our industry for not reading his whole paper)
  8. Generic Iterative and Incremental Model
  9. Boehm Spiral Model
  10. Generic Agile
  11. Extreme Programming (XP) Feedback Loops
  12. Scrum Agile Process Framework
  13. Whirlpool Model (A "violent water metaphor" we can live with and enjoy)
  14. Scrum Agile Process Framework
  15. Let's Review...
  16. Traditional: Known | Known
  17. Agile: Known | Unknown
  18. Lean Startup: Unknown | Unknown
  19. And Now: Code Review...
  20. Types of Code Review
     ● Formal code review: a careful and detailed process with multiple participants and multiple phases. Example: Fagan Inspection.
     ● Over-the-shoulder: one developer looks over the author's shoulder as the latter walks through the code.
     ● Email pass-around: the source code management system emails code to reviewers automatically after a check-in is made.
     ● Pair programming: two authors develop code together at the same workstation, as is common in Extreme Programming.
     ● Tool-assisted code review: authors and reviewers use specialized tools designed for peer code review.
  21. Economic Reasons: Defect Cost Increase
  22. Productivity Reasons: Faster Schedule (Sweet Spot!) Relationship between defect rate and development time. As a rule, the projects that achieve the lowest defect rates also achieve the shortest schedules. -- Capers Jones
  23. Cisco Case Study Data: Defect Counts
  24. Formal Code Review
  25. Hope This Guy Gets Lost in the Elevator
  26. Email Pass-Around Pre Check-In
  27. Email Pass-Around Post Check-In
  28. Email Pass-Around Code Review (Pray Uncle Bob Doesn't Get The Email, Unless You Crave Discipline!)
  29. Over-The-Shoulder Walkthrough
  30. Don't Be This Guy (Either of Them!)
  31. Pair Programming
     ● An agile software development technique wherein two programmers work together at one workstation
     ● One drives and writes code while the other observes (or navigates) and reviews each line of code
     ● The two programmers switch roles frequently
     ● While reviewing, the observer also considers the strategic direction of the work in order to:
       ○ Devise ideas for improvements and identify likely future problems to address
       ○ Free the driver to focus all of his or her attention on the "tactical" aspects of completing the current task, using the observer as a safety net and guide
  32. In Other Words...
  33. But, What Does the Science Say?
     ● Isolated studies of pair programming reveal results ranging all across the map
     ● Some meta-analyses also reveal wide-ranging results
     ● I suspect the answer to be "It depends", therefore proceed without dogma and use pragmatism
  34. Example Study
  35. Study Summary
     ● 48% increase in correctness for complex systems
       ○ No significant time difference
     ● Simple systems had a 20% time decrease
       ○ No significant correctness difference
     ● Overall, no general time reduction or correctness increase
       ○ But an overall 84% effort increase
     ● Limitations: this was a one-day experiment with 99 individuals and 98 pairs. How would working together longer affect results?
  36. Tool-Assisted Code Review!
  37. Demo: Visual Studio Code Analysis
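One common way to put Visual Studio Code Analysis under source control, so the whole team inspects against the same rules, is a shared .ruleset file. A minimal sketch (the two rule IDs shown are standard built-in CA rules, chosen here only as illustrations, not rules named in the talk):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Minimal example .ruleset: promote two built-in CA rules to build warnings. -->
<!-- Rule selection is illustrative; tune the set to your own codebase. -->
<RuleSet Name="Example Rules" Description="Sample code-analysis ruleset" ToolsVersion="10.0">
  <Rules AnalyzerId="Microsoft.Analyzers.ManagedCodeAnalysis" RuleNamespace="Microsoft.Rules.Managed">
    <!-- CA1062: Validate arguments of public methods -->
    <Rule Id="CA1062" Action="Warning" />
    <!-- CA2000: Dispose objects before losing scope -->
    <Rule Id="CA2000" Action="Warning" />
  </Rules>
</RuleSet>
```

Referencing the file from each project's Code Analysis settings makes the rules run on every build, which is what turns one-off analysis into continuous inspection.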
  38. Demo: NDepend Critical Warnings
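NDepend rules such as the critical warnings demoed here are written in CQLinq, a LINQ-style query language over the code model. A minimal sketch of a custom rule (the thresholds 20 and 40 are illustrative assumptions, not NDepend defaults):

```csharp
// CQLinq rule sketch: warn on methods that are too complex or too long.
// Thresholds below are illustrative assumptions, not NDepend defaults.
warnif count > 0
from m in JustMyCode.Methods
where m.CyclomaticComplexity > 20 || m.NbLinesOfCode > 40
orderby m.CyclomaticComplexity descending
select new { m, m.CyclomaticComplexity, m.NbLinesOfCode }
```

Because such rules run on every analysis, a violation surfaces as a warning the moment it is introduced rather than weeks later in a manual review.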
  39. 11 Lessons from the SmartBear Cisco Case Study
  40. 1. Review fewer than 200-400 lines of code at a time.
  41. 2. Aim for an inspection rate of less than 300-500 LOC/hour.
  42. 3. Take enough time for a proper, slow review, but not more than 60-90 minutes. (Key!)
  43. 4. Authors should annotate source code before the review.
  44. Additional Tactical Tips...
     ● 5. Establish quantifiable goals for code review and capture metrics so you can improve your processes
     ● 6. Checklists substantially improve results for both authors and reviewers
     ● 7. Verify that defects are actually fixed!
  45. And Managerial Tips...
     ● 8. Managers must foster a good code review culture in which finding defects is viewed positively
     ● 9. Beware the "Big Brother" effect
     ● 10. The Ego Effect: do at least some code review, even if you don't have time to review it all
  46. 11. Lightweight-style code reviews are efficient, practical, and effective at finding bugs.
  47. Many Thanks to SmartBear Software! (See the CodeCollaborator Free Trial and Jason Cohen's Free Book)
  48. Contact
     ● Meetup:
     ● Email:
     ● Web: