
Adopting code reviews for agile software development


Agile 2010 talk



  1. Mario Bernhart, Andreas Mauczka, Thomas Grechenig; Research Group for Industrial Software (INSO), Vienna University of Technology, Austria
  2. Introduction
     - Code reviews have many benefits, most importantly finding bugs early [1, 2, 3]
     - It is commonly accepted that code reviews are resource-intensive [4]
     - Due to a lack of time and resources, many (agile) software development teams do not perform traditional code reviews [5]
     - Code reviews support knowledge sharing [18] and collective code ownership [22]
  3. Code reviews in agile environments
     - Formal reviews such as IEEE 1028 [2] tend to be too heavyweight for agile contexts [4, 5]
     - There is growing interest in code reviews and review tools in the agile community [24]
     - An empirical study [25] proposes combining pair programming and code reviews
     - Q: How can traditional code reviews be adapted to support reviewing in an agile environment?
  4. Industrial case: context & problem
     - EHR (electronic health record) exchange service for hospitals
     - Agile software development project: 24 months, team of 7, sprint length 4 weeks
     - Code reviews planned for 4 days at the end of each sprint:
       - 1 day inspection, 2 days rework, 1 day re-inspection
     - Reviews were frequently curtailed or skipped due to a lack of time and resources, especially at the end of the sprint
  5. Adoption strategy
     - Make reviews a continuous task
     - Make each review task smaller
     - Create review tasks automatically, based on individual team rules
     - Make reviews asynchronous and support distributed reviewing
     - Provide first-class IDE integration for developers
     - => Continuous Changeset-Based Review (CCBR)
  6. CCBR description
     - Setup = list of (author, reviewer, filter) rules
       - E.g. (Peter, Mike, *) + (*, Chris, *test*.java) + ...
     - Continuous review workflow:
       - The developer commits to the SCM
       - If a rule matches, a review task is created automatically
       - The review is performed using the compare view against the previous version of the changeset
       - The review result is consumed by the author
       - A (negative) review result is followed up and may create a corrective task for the author
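The rule-matching step of this workflow can be sketched as follows. This is a minimal illustration, not ReviewClipse's actual API: `ReviewRule`, `review_tasks`, and the file names are hypothetical, and it assumes glob-style file filters in the spirit of the `*test*.java` example above.

```python
from fnmatch import fnmatchcase
from typing import NamedTuple

class ReviewRule(NamedTuple):
    """One (author, reviewer, filter) setup entry; '*' matches any author."""
    author: str
    reviewer: str
    file_filter: str  # glob pattern over changed file names

def review_tasks(rules, commit_author, changed_files):
    """On commit, create one review task per rule that matches the changeset."""
    tasks = []
    for rule in rules:
        # Rule applies only if its author field is '*' or the committer
        if rule.author not in ("*", commit_author):
            continue
        matched = [f for f in changed_files if fnmatchcase(f, rule.file_filter)]
        if matched:
            # Task = (who reviews, whose change, which files of the changeset)
            tasks.append((rule.reviewer, commit_author, matched))
    return tasks

# Setup from the slide: (Peter, Mike, *) + (*, Chris, *test*.java)
rules = [ReviewRule("Peter", "Mike", "*"),
         ReviewRule("*", "Chris", "*test*.java")]

# Peter commits a changeset of two files: Mike reviews the whole changeset,
# Chris additionally reviews the test file.
print(review_tasks(rules, "Peter", ["Service.java", "login_test.java"]))
```

A negative result from such a task would then, per the last workflow step, be followed up as a corrective task for the author.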
  7. CCBR attributes
     - Process coherence
       - The review is done shortly after the development
       - Rework is done shortly after the original work
       - "It's All about Feedback and Change" [23]
     - Information coherence
       - Changes within one timeframe often correlate [15]
     - Task size
       - Small, typically <200 LOC per review task
  8. CCBR vs. traditional reviews (TR)
     - [Timeline figure: under TR, the inspection, rework, and re-inspection effort is concentrated in a block after the development effort; under CCBR it is spread continuously alongside the development effort]
  9. Limitations of CCBR
     - TR sees the "whole", whereas CCBR only looks at a "small piece"
     - Focus on a stable state (TR) vs. the creation process (CCBR)
     - The total number of changed lines reviewed is smaller in TR than in CCBR
       - One line of code may be changed, and thus reviewed, more than once
     - Pre-commit reviews (TR) vs. post-commit reviews (CCBR)
  10. Review tool for CCBR
     - Eclipse-integrated, easy-to-use UI
     - Review scope = one changeset in the SVN repository
     - No additional server required (XML/SVN based)
     - Works offline
     - Works with Subversive and Subclipse
     - Flexible, filter-based review assignments
     - Creates Mylyn tasks out of (negative) reviews
     - Freeware, 10,000+ downloads
  11. ReviewClipse Screenshot
  12. ReviewClipse Screenshot
  13. Planned empirical evaluation
     - Research questions
       - Is CCBR as effective as TR?
       - Is CCBR as efficient as TR?
       - Is the rework effort of CCBR smaller than that of TR?
     - Plan
       - Pre-study with 3/3 and 6 subjects for designs A and B
       - A: parallel study with 30/30 subjects (targets questions 1 + 2)
       - B: serial study with 60 subjects (targets question 3)
  14. Future work: Mylyn Reviews
     - Mylyn Reviews is a task-based code review plugin that integrates with Mylyn
     - Main features:
       - Code reviews based on Mylyn tasks
       - Review tasks incorporated into the Mylyn task list
       - Inline commenting for source code reviews
       - Framework for integrating other review tools
  15. References
     [1] M. Fagan, "Design and code inspections to reduce errors in program development," IBM Systems Journal, vol. 15, no. 3, pp. 182–211, 1976.
     [2] IEEE Standard for Software Reviews and Audits, IEEE Std 1028-2008, pp. 1–52, Aug. 2008.
     [3] R. Baker, "Code reviews enhance software quality," in Proceedings of the 19th International Conference on Software Engineering (ICSE 1997), pp. 570–571, May 1997.
     [4] M. Ciolkowski, O. Laitenberger, and S. Biffl, "Software reviews, the state of the practice," IEEE Software, vol. 20, no. 6, pp. 46–51, Nov.–Dec. 2003.
     [5] L. Harjumaa, I. Tervonen, and A. Huttunen, "Peer reviews in real life - motivators and demotivators," in Fifth International Conference on Quality Software, pp. 29–36, Sept. 2005.
     [6] J. Remillard, "Source code review systems," IEEE Software, vol. 22, no. 1, pp. 74–77, Jan.–Feb. 2005.
     [7] B. Meyer, "Design and code reviews in the age of the internet," Communications of the ACM, vol. 51, no. 9, pp. 66–71, 2008.
     [8] M. Bernhart, C. Mayerhofer, and T. Grechenig, "ReviewClipse - Supporting Code-Reviews within the Eclipse IDE," talk at EclipseCon 2009.
     [9] M. Stein, J. Riedl, S. J. Harner, and V. Mashayekhi, "A case study of distributed, asynchronous software inspection," in ICSE '97: Proceedings of the 19th International Conference on Software Engineering, New York, NY, USA, pp. 107–117, ACM, 1997.
     [10] L.-T. Cheng, C. R. de Souza, S. Hupfer, J. Patterson, and S. Ross, "Building collaboration into IDEs," ACM Queue, vol. 1, no. 9, pp. 40–50, 2004.
     [11] C. R. Prause and S. Apelt, "An approach for continuous inspection of source code," in Proceedings of the 6th International Workshop on Software Quality, New York, NY, USA, pp. 17–22, ACM, 2008.
     [12] S. P. Berczuk, Software Configuration Management Patterns: Effective Teamwork, Practical Integration. Addison-Wesley, 2003.
     [13] M. Kersten and G. C. Murphy, "Using task context to improve programmer productivity," in Proceedings of the 14th ACM SIGSOFT International Symposium on Foundations of Software Engineering (SIGSOFT '06/FSE-14), Portland, Oregon, USA, Nov. 2006, pp. 1–11, ACM, New York, NY.
     [14] B. W. Boehm, Software Engineering Economics. Prentice-Hall, 1981.
     [15] T. Zimmermann, P. Weißgerber, S. Diehl, and A. Zeller, "Mining version histories to guide software changes," in Proceedings of the 26th International Conference on Software Engineering (ICSE 2004), Edinburgh, UK, May 2004.
     [16] A. Porter, H. Siy, A. Mockus, and L. Votta, "Understanding the sources of variation in software inspections," ACM Transactions on Software Engineering and Methodology, vol. 7, no. 1, pp. 41–79, 1998.
     [17] M. Stein, J. Riedl, S. J. Harner, and V. Mashayekhi, "A case study of distributed, asynchronous software inspection," in ICSE '97: Proceedings of the 19th International Conference on Software Engineering, New York, NY, USA, pp. 107–117, ACM, 1997.
     [18] T. Gilb, D. Graham, and S. Finzi, Software Inspection. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA, 1993.
     [19] C. Mayerhofer, "Design and Development of a Module for Change-Set Based Code Reviews in IDEs," master's thesis, Vienna University of Technology, 2009.
     [20] Y. Mizuno, "Software quality improvement," Computer, vol. 16, pp. 66–72, March 1983.
     [21] M. Bernhart, C. Mayerhofer, and T. Grechenig, "ReviewClipse@Class - Using a Lightweight Continuous Software Reviewing Tool in Undergraduate Software Development Education," to appear in Proceedings of the 23rd Annual IEEE-CS Conference on Software Engineering Education and Training.
     [22] M. E. Nordberg III, "Managing code ownership," IEEE Software, pp. 26–33, March/April 2003.
     [23] L. Williams and A. Cockburn, "Agile software development: It's all about feedback and change," IEEE Computer, vol. 36, no. 6, pp. 83–85, June 2003.
     [24] W. Seliga and S. Ginter, "Effective code reviews in agile teams," talk at Agile 2009.
     [25] D. Winkler and S. Biffl, "An empirical study on design quality improvement from best-practice inspection and pair programming," in Product-Focused Software Process Improvement, pp. 319–333, 2006.
     [26] P. Grünbacher, M. Halling, and S. Biffl, "An empirical study on groupware support for software inspection meetings," in Proceedings of the International Conference on Automated Software Engineering, 2003.