
ICSE 2012: Test Confessions - A study of testing practices for plug-in systems




  1. Test Confessions: A Study of Testing Practices for Plug-In Systems. Michaela Greiler, Arie van Deursen and Margaret-Anne Storey
  2. Testing Practices for Plug-In Systems?
  3. (image-only slide)
  4. Extending without the need of change; assemble new products; loaded at runtime
  5. (image-only slide)
  6. Complex compositions; integrating multiple plug-ins; different developers
  7. Incompatibility
  8. (image-only slide)
  9. “The number one reason people give us for not upgrading to the latest version of WordPress is fear that their plugins won’t be compatible.” [WordPress]
  10. “Thanks, but I think we have given up on Eclipse and Bugzilla integration.” [Eclipse – Bug ID: 268207]
  11. (image-only slide)
  12. Research Questions. Practices? Which test practices are prevalent in testing of plug-in-based systems? Plug-in Testing? How are plug-in-specific integration issues (i.e. versioning, configurations) tested? Challenges? What are challenges experienced when testing plug-in-based systems?
  13. Literature study? Survey of existing approaches: over 200 resources
  14. A Grounded Theory Study: a systematic procedure to discover a theory from (qualitative) data
  15. Objective: Increase understanding of what testers and developers think and do when it comes to testing plug-in-based systems.
  16. Study: Interviews with 25 experienced Eclipse developers or testers representing well-known open and closed source projects.
  17. Research Questions. Compensation? Are there compensation strategies to support the testing of plug-ins?
  18. Triangulation: (1) resonance @ EclipseCon; (2) survey among 151 developers
  19. Outcome: TESTING PRACTICES
  20. Focus on Unit Testing
  21. Focus on Unit Testing. “Unit testing is where you find the most bugs” (P18). “Ultimately, unit test are our best friends” (P14). “At least 70% of our test effort is spent on unit testing.”
  22. Integration Tests (chart)
  23. Other forms of testing are less popular. “We think that with a high test coverage through unit tests, integration tests are not necessary.” (P20). “QF-tests [were] too rigid when the system was evolving” (P14).
  24. Finding 1: (Automated) unit testing is widely adopted; integration, system, UI and acceptance testing are much less automated. [Survey]
  25. Outcome: PLUG-IN TESTING
  26. Cross-plug-in integration? Testing platform or plug-in versioning?
  27. Cross plug-in testing is optional. “We handle problems between several plug-ins in a bug-driven way” (P18). “We have no automated tests for cross plug-in testing, but we do manual testing.” (P19)
  28. Version testing is minimal. “A lot of people put version ranges […], and they say they can run with 3.3 up to version 4.0 […]. But I’m willing to bet that 99% of the people do not test that their stuff works.” (P13)
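To make P13's point concrete, a minimal sketch of what checking a declared version range against an actually installed version involves. The class, method names, and the simplified OSGi-style range syntax `[3.3.0,4.0.0)` are illustrative assumptions, not the real OSGi resolver:

```java
// Sketch: checking a platform version against a declared range like "[3.3.0,4.0.0)".
// "[" / "]" = inclusive bound, "(" / ")" = exclusive bound (OSGi-style convention).
public class VersionRange {

    // Compare two dotted version strings segment by segment, numerically.
    static int compare(String a, String b) {
        String[] as = a.split("\\."), bs = b.split("\\.");
        int n = Math.max(as.length, bs.length);
        for (int i = 0; i < n; i++) {
            int ai = i < as.length ? Integer.parseInt(as[i]) : 0;
            int bi = i < bs.length ? Integer.parseInt(bs[i]) : 0;
            if (ai != bi) return Integer.compare(ai, bi);
        }
        return 0;
    }

    // True if 'version' falls inside the declared range.
    static boolean inRange(String version, String range) {
        boolean minInclusive = range.startsWith("[");
        boolean maxInclusive = range.endsWith("]");
        String[] bounds = range.substring(1, range.length() - 1).split(",");
        int lo = compare(version, bounds[0]);
        int hi = compare(version, bounds[1]);
        return (minInclusive ? lo >= 0 : lo > 0)
            && (maxInclusive ? hi <= 0 : hi < 0);
    }

    public static void main(String[] args) {
        System.out.println(inRange("3.7.2", "[3.3.0,4.0.0)")); // inside the range
        System.out.println(inRange("4.0.0", "[3.3.0,4.0.0)")); // upper bound is exclusive
    }
}
```

Declaring such a range in a manifest is cheap; the quote's point is that actually running the plug-in's test suite against versions at and near both bounds is the part that is rarely done.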
  29. Testing combinations or versions? 43% don’t test integration of different products (only 3% test this thoroughly); 55% don’t test for platform versions (only 4% thoroughly); 63% don’t test for dependency versions (only 10% thoroughly)
  30. Testing not needed?
  31. Testing not needed? “Simply fetch the latest and you’ll end up in a mess!” [Ian Bull on updating problems with plug-in systems]
  32. Versioning, cross-plug-in testing not needed? “Sometimes we update, but there is always the risk that it will break something and then you have to do extensive [manual] testing.” (P9). “I do not even have a chance to test [all possible combinations]. There are too many operating systems, there are too many Eclipse versions.” (P12)
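P12's "too many combinations" can be made concrete: a full configuration test matrix is the cross product of every axis (operating system, platform version, dependency versions), so its size multiplies with each axis. The axes and values in this sketch are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: the full test matrix is the cross product of all configuration axes.
public class TestMatrix {

    // Build every combination, one value per axis.
    static List<List<String>> crossProduct(List<List<String>> axes) {
        List<List<String>> result = new ArrayList<>();
        result.add(new ArrayList<>());            // start with one empty configuration
        for (List<String> axis : axes) {
            List<List<String>> next = new ArrayList<>();
            for (List<String> partial : result)
                for (String value : axis) {
                    List<String> row = new ArrayList<>(partial);
                    row.add(value);
                    next.add(row);
                }
            result = next;
        }
        return result;
    }

    public static void main(String[] args) {
        // Illustrative axes; real projects face far more values per axis.
        List<List<String>> axes = List.of(
            List.of("linux", "macos", "windows"),          // operating system
            List.of("3.5", "3.6", "3.7", "4.2"),           // platform version
            List.of("depA-1.0", "depA-1.1", "depA-2.0"));  // dependency version
        // 3 * 4 * 3 = 36 configurations, before adding any further axis.
        System.out.println(crossProduct(axes).size() + " configurations");
    }
}
```

This multiplicative growth is why full combination testing is abandoned in practice; techniques such as pairwise selection reduce the matrix, but the interviewees above report mostly falling back on manual and bug-driven testing instead.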
  33. Outcome: BARRIERS
  34. Plug-In Integration Testing Barriers
  35. Findings: Barriers. Responsibility for integration unclear; lack of ownership; insufficient plug-in testing knowledge; test execution too long
  36. Outcome: COMPENSATION
  37. Self-Hosting: “Eating your own dog food”
  38. Manual Testing. “Tests that I do are very simple manual tests, the real tests are coming from the users, that are doing all kind of different things with [x].” (P9)
  39. “Testing is done by the user-community and they are rigorous about it. We have more than 10,000 installations per month. If there is a bug it gets reported immediately.” (P12)
  40. Developer Involvement. “We’re a framework. If the user downloads a new version and lets his application run with it, then this is already like a test.” (P20). “Perhaps it is not our own product, but our product relies on this other product. So it is normal to improve [it].” (P11)
  41. A Prerequisite: Openness. Communication, Release Management, Extensibility, Feedback, Manual Testing, Automated Testing, Requirements, Alpha & Beta Testers, Downstream Projects & Release Train
  42. Summary: Findings. (1) (Automated) unit testing is widely adopted; integration, system, UI and acceptance testing are much less automated. (2) The plug-in nature has little direct impact on test practices. (3) Barriers to adopting techniques include unclear ownership, responsibilities, and test effort & execution time. (4) Limited integration testing is compensated by the community.
  43. Plug-in-specific testing support: provide a test modus; rewarding community; centralized compatibility information
  44. More Details? Michaela Greiler, Arie van Deursen & Margaret-Anne Storey. “Test Confessions: A Study of Testing Practices for Plug-in Systems”