
Characterizing Verification of Bug Fixes in Two Open Source IDEs (MSR 2012)


Paper by Rodrigo Souza and Christina Chavez, presented at the 9th Working Conference on Mining Software Repositories (MSR 2012) -- http://2012.msrconf.org/program.php

Experimental package and presentation video available at https://sites.google.com/site/rodrigorgs2/msr2012


Characterizing Verification of Bug Fixes in Two Open Source IDEs (MSR 2012)

  1. Characterizing Verification of Bug Fixes in Two Open Source IDEs. Rodrigo Souza* and Christina Chavez, Software Engineering Labs, Department of Computer Science - IM, Universidade Federal da Bahia (UFBA), Brazil. {rodrigo, flach}@dcc.ufba.br (* speaker). MSR 2012, Zürich, June 2, 2012.
  2. (image slide)
  3. NEW
  4. NEW → FIXED
  5. NEW → FIXED → VERIFIED
  6. Characterize the verification process (VERIFIED) by mining bug repositories.
  7. Characterize the verification process by mining bug repositories: When? Who? How?
  8. Data: MSR 2011 Challenge data set (~10 years of bug reports). Projects: Platform and VersionControl (NetBeans); Platform and EMF (Eclipse Modeling Framework).
  9. When?
  10. Chart: accumulated number of verifications over ~2 years (~800 verifications).
  11. Chart: accumulated number of verifications over ~2 years (~800 verifications), with releases marked (*).
  12. Chart: accumulated number of verifications over ~1 year (~700 verifications), with releases marked (*).
  13. Same chart, highlighting verification phases: periods of steep growth just before each release (a sketch of this analysis appears after the slide list).
  14. Who?
  15. Telling QA members from developers: a contributor who performs at least 10 times more verifications than bug fixes (verifications ≥ 10 × bug fixes) is counted as QA; otherwise, as a developer (see the classification sketch after the slide list).
  16. QA team
  17. The QA team comprises about 20% of the developers.
  18. About 20% of the developers (the QA team) perform about 80% of the verifications.
  19. QA team
  20. How?
  21. (image slide)
  22. Most comments just state the obvious. Less than 4% refer to automated testing or code inspection. Further research is needed.
  23. Pitfalls
  24. Chart: accumulated number of verifications over ~10 years (~14k verifications).
  25. (same chart)
  26. (image slide)
  27. 2.6k bugs
  28. 2.6k bugs
  29. 2.6k bugs
  30. Mass verifications: represent repository cleanup, not software verification. They may account for a large part of the verifications and bias your analyses (see the filtering sketch after the slide list).
  31. Pseudo verifications: in some projects, marking a bug as VERIFIED means something else! (e.g., in Eclipse/EMF, since 2007, it means that the fix is available in a build)
  32. Future Work
  33. process → product (software)
  34. Process (verification process*) and product (software)*. (* e.g., QA team, reopening, verification phase)
  35. A causal (Bayesian) network relating the verification process* to the product (software)*. (* e.g., QA team, reopening, verification phase)
  36. Thanks! Summary: one project has a clear verification phase but no dedicated QA team; the other has a QA team but no clear verification phase (verification phase: ✓ / ✗; QA team: ✗ / ✓). Comments rarely state the verification technique. Beware of mass verifications and pseudo verifications.
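
The "When?" charts (slides 10-13) plot the accumulated number of verifications over time, with release dates marked, so that verification phases show up as steep climbs right before releases. Below is a minimal sketch of that kind of analysis in Python/pandas, not the authors' actual tooling: the file name status_changes.csv, the column names (new_status, when), and the release dates are assumptions for illustration.

    # Sketch: accumulated verifications over time, with releases marked.
    # Assumed input: one row per bug status change, with columns "new_status" and "when".
    import pandas as pd
    import matplotlib.pyplot as plt

    events = pd.read_csv("status_changes.csv", parse_dates=["when"])

    # Keep only transitions to VERIFIED, in chronological order.
    verifications = (events[events["new_status"] == "VERIFIED"]
                     .sort_values("when")
                     .reset_index(drop=True))
    verifications["accumulated"] = range(1, len(verifications) + 1)

    # Release dates for the project under study (made-up examples).
    releases = pd.to_datetime(["2009-06-24", "2010-06-23", "2011-06-22"])

    plt.plot(verifications["when"], verifications["accumulated"])
    for release in releases:
        plt.axvline(release, linestyle="--", color="gray")  # * = release
    plt.xlabel("time")
    plt.ylabel("accumulated # of verifications")
    plt.show()

Verification phases then appear as periods where the curve climbs steeply just before a dashed release line.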

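Slide 15 tells QA members from developers by comparing how many bugs a person verifies versus fixes (the "≥ 10x" rule). Here is a minimal sketch of such a classification, assuming per-contributor counts are already available; the function name, the toy data, and the handling of edge cases are illustrative assumptions, not the paper's implementation.

    # Sketch: classify contributors as QA vs. developer from their activity counts.
    import pandas as pd

    def classify_contributors(fixes: pd.Series, verifications: pd.Series,
                              ratio: int = 10) -> pd.DataFrame:
        """fixes / verifications: counts per contributor (index = contributor id)."""
        people = pd.DataFrame({"fixes": fixes, "verifications": verifications}).fillna(0)
        people["role"] = "developer"
        is_qa = ((people["verifications"] > 0)
                 & (people["verifications"] >= ratio * people["fixes"]))
        people.loc[is_qa, "role"] = "QA"
        return people

    # Toy usage: alice mostly fixes, bob mostly verifies.
    fixes = pd.Series({"alice": 40, "bob": 2})
    verifs = pd.Series({"alice": 5, "bob": 120})
    people = classify_contributors(fixes, verifs)
    qa_team = people[people["role"] == "QA"]
    print(qa_team)
    print("QA share of verifications:",
          qa_team["verifications"].sum() / people["verifications"].sum())

With such a table one can also check the observation of slides 17-18, i.e., whether roughly 20% of the contributors account for roughly 80% of the verifications.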
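
Slides 30-31 warn that mass verifications (bulk repository cleanups) and pseudo verifications can bias any analysis based on VERIFIED transitions. Below is a minimal sketch of one way to flag suspected mass verifications before running the "when"/"who" analyses; the per-person-per-day grouping and the threshold of 100 bugs are illustrative assumptions, not the authors' exact heuristic.

    # Sketch: flag bursts in which one person verifies very many bugs on the same day,
    # which likely indicate repository cleanup rather than real verification work.
    import pandas as pd

    def flag_mass_verifications(verifications: pd.DataFrame,
                                threshold: int = 100) -> pd.DataFrame:
        """verifications: one row per VERIFIED transition, with columns 'who' and 'when'."""
        flagged = verifications.copy()
        flagged["day"] = flagged["when"].dt.date
        burst_size = flagged.groupby(["who", "day"])["when"].transform("size")
        flagged["mass"] = burst_size >= threshold  # e.g., a 2.6k-bug cleanup is caught here
        return flagged

    # Usage: inspect or drop the flagged rows before any further analysis.
    # verifications = flag_mass_verifications(verifications)
    # verifications = verifications[~verifications["mass"]]

Pseudo verifications cannot be detected this way; they require checking what VERIFIED actually means in each project (as in the Eclipse/EMF example of slide 31).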