Ingegneria del Software II
Transcript

  • 1. Course: Ingegneria del Software II, academic year 2004-2005. Course web site: www.di.univaq.it/ingegneria2/. 12. Regression Testing. Lecturers: Henry Muccini and Vittorio Cortellessa, Computer Science Department, University of L'Aquila, Italy. muccini@di.univaq.it – cortelle@di.univaq.it, www.di.univaq.it/muccini – www.di.univaq.it/cortelle
  • 2. Copyright Notice » The material in these slides may be freely reproduced and distributed, partially or totally, as long as an explicit reference or acknowledgment of the material's author is preserved. Henry Muccini SEA Group © 2005 by H. Muccini and V. Cortellessa / Ingegneria del Software II
  • 3. Acknowledgment » This work is carried out jointly with the University of California, Irvine (Debra J. Richardson and M. Dias) » Thanks to Lihua Xu and to the ROSATEA group at the University of California, Irvine for their contributions to this research
  • 4. Agenda » Definition and Techniques » Regression Test Selection Techniques » Spec-based Regression Testing approaches » SA-based Regression Testing
  • 5. What is Regression Testing? » A maintenance activity » Performed on modified programs » Attempts to validate modified software and ensure that the modifications are correct
  • 6. Regression Testing » Test modified software and provide a certain confidence that no new errors have been introduced into previously tested code. » Given programs P and P', let T be a test suite for P. A regression testing technique: - provides a certain confidence that P' is still correct with respect to a set of tests T', which is a subset of T; - helps to identify new test cases for T'. » Different techniques: - Retest all - Selective retest (Regression Test Selection)
  • 7. Regression Test Selection technique » Select T', a subset of T relevant for P'; » Test P' with respect to T'; » If necessary, create T'' to test new functionality/structure in P'; » Test P' with respect to T''; » Create T''', a new test suite and test history
  • 8. Traditional (Selective) Regression Testing » Traditional regression: - We have a program P - We select a set of tests T to be applied to P - We test P with T - P is modified into P' - We select T' in T, a set of tests to execute on P' - We test P' with T' - If necessary, we create T'', a set of new tests for P', and test P' with T'' [Figure: P (XClient, ClientA, ClientB, ClientC) evolves into P' (XClient v2, ClientA v2, ClientB v2, ClientC v2); the set of tests T' is a subset of the set of tests T]
  • 9. Selective Retest 1. Identify the modifications made to P to obtain P'. 2. Select T', a subset of T, based on the result of step 1. 3. Test P' with T', establishing the correctness of P' with respect to T'. 4. If necessary, create a set of new tests T'' to test the new, modified and as-yet-untested portions of P'. 5. Test P' with T'', establishing the correctness of P' with respect to T''. 6. The test suite for P', thus, is T' ∪ T''.
  • 10. How regression testing is usually implemented » Given P, an instrumented version P1 is created to store information on P's execution » P is (usually) represented as a flow graph » The tests T are run over P1 » For each t in T, P1 stores information on the portion of code executed by running t (the test history) » P becomes P' » The flow graph of P' is built » A test t in T will be in T' if the modified code affects t - that is, if t, according to the test history of P, executes a portion of code modified in P', then t needs to be run again on P'
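The instrumentation step above can be illustrated with a hypothetical, much-simplified sketch: here a Python trace hook records the set of functions a test executes (its "test history"); real tools instrument the flow graph of P instead, and all names below are illustrative, not from any actual tool.

```python
import sys

def run_with_history(test_fn):
    """Run test_fn and return the set of function names it executed."""
    executed = set()
    def tracer(frame, event, arg):
        if event == "call":                       # record every function entry
            executed.add(frame.f_code.co_name)
        return tracer
    sys.settrace(tracer)                          # install the trace hook
    try:
        test_fn()
    finally:
        sys.settrace(None)                        # always remove the hook
    return executed

def m1(): return 1
def m2(): return 2

def t1():        # test case t1 exercises m1 only
    m1()

history = run_with_history(t1)   # contains "m1" (and the test itself), not "m2"
```

After P becomes P', histories like this one are intersected with the set of modified functions to decide which tests to re-run.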
  • 11. Regression Test Selection Techniques
  • 12. In other terms… Since test T1 visits method m1, and m1 is changed, P' is tested again with respect to T1; T2 does not need to be re-executed. The test history (obtained by code instrumentation) records the methods "reached" by executing the test cases: T1 = m1 + m3, T2 = m2 + m3 + m4. P code: Class x { m1, m2 }, Class y { m3 }, Class z { m4 }; P' code: Class x { m1', m2 }, Class y { m3 }, Class z { m4 }. P is usually represented by a graph.
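The slide's example can be sketched in a few lines, assuming the method-level test histories shown above (the function name is illustrative):

```python
# Test histories from the slide: the set of methods each test reaches.
test_history = {
    "T1": {"m1", "m3"},           # T1 = m1 + m3
    "T2": {"m2", "m3", "m4"},     # T2 = m2 + m3 + m4
}
changed = {"m1"}                  # m1 becomes m1' in P'

def select_regression_tests(history, changed_methods):
    """Return the tests whose trace intersects the changed methods."""
    return {t for t, trace in history.items() if trace & changed_methods}

print(select_regression_tests(test_history, changed))  # -> {'T1'}
```

Only T1 is selected, matching the slide: T2 never reaches m1, so it need not be re-executed.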
  • 13. Regression Test Selection Techniques
  • 14. Regression Test Selection Techniques » Control-Flow-Based Regression Test - Traverses corresponding paths in G and G'. - If the sink node of an edge has been changed, it selects the tests that executed that edge in G. » TestTube - Based on code entities - Maintains the function trace produced by each test case, and expands it into an entity trace list. - To detect changes to entities, an entity is defined as a sequence of tokens for each execution; an entity is considered changed if one of its tokens has changed. - Tends to choose more tests than control-flow-based techniques
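The edge-comparison idea can be sketched as follows. This is a minimal illustration, not Rothermel and Harrold's actual algorithm: here an edge is flagged as "dangerous" simply when the content of its sink node differs between G and G', and any test that executed a dangerous edge is re-selected.

```python
def dangerous_edges(g_old, g_new):
    """Graphs as dicts {'nodes': {name: statement text}, 'edges': set of (src, dst)}.
    Flag edges whose sink node's content changed between the two graphs."""
    return {(s, d) for (s, d) in g_old["edges"]
            if g_old["nodes"].get(d) != g_new["nodes"].get(d)}

def select_tests(edge_history, g_old, g_new):
    """Re-select every test whose executed edges intersect the dangerous set."""
    danger = dangerous_edges(g_old, g_new)
    return {t for t, edges in edge_history.items() if edges & danger}

g  = {"nodes": {"n1": "x = 0", "n2": "y = 1"}, "edges": {("n1", "n2")}}
g2 = {"nodes": {"n1": "x = 0", "n2": "y = 2"}, "edges": {("n1", "n2")}}
history = {"tA": {("n1", "n2")}, "tB": set()}
# select_tests(history, g, g2) -> {'tA'}
```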
  • 15. Regression Test Selection Techniques » Incremental Program Testing - Uses program slicing - If a statement in P' has no corresponding statement in P, and/or the behaviors of the same statement in P and P' are not equivalent, then this statement is considered affected » Domain-Based Regression Testing - A little different from the others - First the domain is updated by defining the modifications to the command languages; then the test suite is updated and the regression test suite is selected based on the subdomains
  • 16. Spec-based Regression Testing
  • 17. Overview (not required for the exam) » Dependence Relations A. Firewall: (static) control dependence / inheritance, aggregation, association… [kung95] B. FDG: function dependence (+ data members) [wu99][wu00] C. CSIG: control + data flow (spec. + impl.) [BeydedaGruhn] D. DejaVu: (CFG control flow) [harrold01nov]
  • 18. Background » A Safe, Efficient Regression Test Selection Technique (Gregg Rothermel, Mary Jean Harrold) - Source code → Control Flow Graph; node: statement; edge: flow of control between statements - Comparison is edge by edge
  • 19. Background » Testing of object-oriented programs based on finite state machines (H.S. Hong) - Source code → Class State Machine; transitions: (source, target, event, guard, action) » Techniques for Testing Component-Based Software (Ye Wu, Dai Pan, Mei-Hwa Chen) - Component Interaction Graph (CIG): interactions and dependence relationships among components > Interface nodes: basic access points, via which components are activated > Event nodes: the event from the calling interface to the called interface > Context dependence edges: similar to control-flow dependence relationships > Content dependence edges: data dependence relationships (regression testing for OOP)
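The four ingredients of the CIG listed above can be captured in a small data structure. This is a hedged sketch: the class and field names are illustrative, not Wu, Pan and Chen's API.

```python
from dataclasses import dataclass, field

@dataclass
class CIG:
    """Component Interaction Graph: interface/event nodes plus two edge kinds."""
    interfaces: set = field(default_factory=set)     # basic access points
    events: set = field(default_factory=set)         # (caller iface, called iface)
    context_edges: set = field(default_factory=set)  # control-flow-like dependences
    content_edges: set = field(default_factory=set)  # data dependences

cig = CIG()
cig.interfaces |= {"A.open", "B.store"}
cig.events.add(("A.open", "B.store"))          # A's call activates B
cig.context_edges.add(("A.open", "B.store"))   # B.store is control-dependent on A.open
```

Regression selection over such a graph would then re-run the tests whose traces cross events or edges touching a modified component.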
  • 20. A. Developing an Object-Oriented Software Testing and Maintenance Environment (David Kung, Jerry Gao) [kung95] • ORD (Object Relation Diagram) – inheritance, aggregation, association… – test order • BBD (Block Branch Diagram) – function invocation – test path • OSD (Object State Diagram) – state behavior for an object class
  • 21. B. Regression Testing on Object-Oriented Programs (Ye Wu, Mei-Hwa Chen) [wu99] • AFDG – Affected Function Dependence Graph – affected variables/functions, function dependence relations • FCG – Function Calling Graph – execution history of each test case [Figure: from the CFG of P', an AFDG and an FCG are derived, which together yield the regression tests RTs]
  • 22. C. Integrating White- and Black-Box Techniques for Class-Level Regression Testing (Sami Beydeda) [BeydedaGruhn] • CSM: Class State Machine • CSIG: Class Specification Implementation Graph – Prototype: each event within transitions of the CSM – CCFG: Class Control Flow Graph (one per prototype/method) • Method implementation graph • Method specification graph – Data-flow (def-use): black-box testing • Regression testing – White-box: compare by node – Black-box: compare by edge [Figure: the CSM yields prototypes, CCFGs and data-flow information; comparing CSIG and CSIG' against the test cases yields the regression test cases]
  • 23. D. Using Component Metadata to Support the Regression Testing of Component-based Software (Mary Jean Harrold) [harrold01nov] » Code-based - Metadata → branch coverage for T. - DejaVu (CFG) » Spec-based - Test frame: test specification for a functional unit - Metadata → test-frame coverage for T. - DejaVu (a test is selected if at least one of the paths in its test frame is associated with a changed statement)
  • 24. SA-based Regression Testing
  • 25. Why? » Motivations and Goals: - To reduce the testing effort during architecture or code maintenance - To handle the "architectural drift" - To maximize the benefits of using a Software Architecture
  • 26. Why? » Because changes in P may affect only a small set of architectural interactions, and we do not want to retest everything » "… the use of software architecture for regression testing activities has the potential for a bigger impact on the cost of software than those techniques that focus only on development testing." [Harrold_Rosatea98] » "We need to develop techniques that can be applied to … the … architecture, to assist in selective retest of the software. These techniques will let us identify existing test cases that can be used to retest the software. These techniques will also let us identify those parts of the modified software for which new test cases are required." [Harrold_FOSE2000]
  • 27. SA-based Regression Testing (1/2) » Assumption: - we have the Software Architecture, - some architectural tests SAT, - the code P, and - a mapping function map(SAT, P) → CTp that produces the code-level tests - we run the CTp tests over P and evaluate P's conformance to the SA » We may change SA into SA' or P into P'
  • 28. SA-based Regression Testing [SARTE Project] [Figure: process overview, with test reuse across the SA-level and code-level testing phases]
  • 29. Goal 1 » Goal 1: Test Conformance of a Modified Implementation P' to the initial SA: - Context: > Given a software architecture specification for S, and an implementation P, we first gain confidence that P correctly implements S. > P' modifies P: some components are modified, and/or some new components are introduced. - Goal: Test the conformance of P' with respect to S, while reusing previous test information for selective regression testing, thereby reducing the test cases that must be retested.
  • 30. Goal 2 » Test Conformance of an Evolved Software Architecture - Context: > P correctly implements the SA S > S is modified into S', adding or removing components > A modified implementation P' may also have been developed. - Goal: Test the conformance of P (or P') with respect to S', while reusing previous test information for selective RT, thereby reducing the test cases that must be retested.
  • 31. Activity Diagram of our SA-based Regression Testing approach
  • 32. Goal 1
  • 33. Considerations » Differences with respect to traditional code-based RT techniques: - the oracle in SA-based RT is the software architecture specification itself. > In fact, when t1 is run on P', the test fails if its execution does not allow the expected behavior to be reproduced - Moreover, code-level test cases are always driven by well-formalized functional and structural architectural requirements.
  • 34. Advantages » i) as in traditional RT, we reduce the size of the test suite for P', eliminating all those tests which do not need to be reapplied to P', and » ii) when conformance faults are detected, we can gather information on how to adjust the initial architecture.
  • 35. Goal 2 Idea » Compare the two architectural specifications to identify the changed/unchanged portions of the SA. » Both structural and behavioral changes are taken into account. - In particular, the LTSs for S and S' are compared, and differences between the two graphs are identified (using a sort of "Diff" algorithm). » In a fashion similar to traditional code-based RT, whenever an ATC traverses a path in the LTS of S which has been modified in the LTS of S', it needs to be retested on S'.
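The "Diff"-style comparison above can be sketched as follows, under some simplifying assumptions: each LTS is modeled as a set of (state, event, state) transitions, an ATC as a path over transitions, and an ATC is retested whenever it crosses the symmetric difference of the two transition sets. All names are illustrative, not the SARTE tooling.

```python
def atcs_to_retest(atcs, lts_s, lts_s2):
    """Return the ATCs whose path touches a transition added or removed
    between the LTS of S and that of the evolved architecture."""
    diff = lts_s ^ lts_s2          # transitions present in only one LTS
    return sorted(name for name, path in atcs.items()
                  if any(t in diff for t in path))

lts_s  = {("s0", "call", "s1"), ("s1", "attend", "s2")}
lts_s2 = {("s0", "call", "s1"),            # evolved SA: a scheduler step
          ("s1", "schedule", "s3"),        # is interposed before attending
          ("s3", "attend", "s2")}
atcs = {
    "ATC1": [("s0", "call", "s1"), ("s1", "attend", "s2")],  # path modified
    "ATC2": [("s0", "call", "s1")],                          # path untouched
}
# atcs_to_retest(atcs, lts_s, lts_s2) -> ['ATC1']
```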
  • 36. Goal 2
  • 37. Experiment 1: Elevator System » The approach has been applied to the Elevator case study - Architecture in the C2 style - Implemented using the C2 framework » Tool support: - The SA is formally specified using the C2 specification language and FSP. - A behavioral model of the system is automatically produced from the FSP specification, using the LTSA tool.
  • 38. Elevator SA [Figure: Elevator System Version 1 — ElevatorADT1, ElevatorPanel1, BuildingPanel; Elevator System Version 2 — ElevatorADT1, ElevatorADT2, ElevatorPanel1, ElevatorPanel2, Scheduler, BuildingPanel]
  • 39. Experiment 1 - The abstraction over the LTS behavioral model is again realized in FSP. - The mapping between architectural tests and code-level tests is provided by the C2 Framework. - Test cases are run over the code using the Argus-I environment. - The selective regression testing step is supported by the DejaVOO tool.
  • 40. Elevator System Version 1 — Architectural "Paths" (Scenarios): 2
    Path #1: BP sends Call; ADT1 attends Call (BuildingPanel, ElevatorADT1; messages: addCall(x,dir), callAttended(x))
    Path #1 – Test Cases: TC1: BP.addCall(1,up); ADT1(1,up) · TC2: BP.addCall(1,up); ADT1(2,up) · TC3: BP.addCall(1,up); ADT1(2,down) · TC4: BP.addCall(2,up); ADT1(1,up) · TC5: BP.addCall(2,up); ADT1(2,down) · TC6: BP.addCall(2,up); ADT1(3,down) · TC7: BP.addCall(2,down); ADT1(1,up) · TC8: BP.addCall(2,down); ADT1(2,up) · TC9: BP.addCall(2,down); ADT1(3,up) · TC10: BP.addCall(3,down); ADT1(4,down) · TC11: BP.addCall(3,down); ADT1(3,down) · TC12: BP.addCall(3,down); ADT1(2,down)
    Path #2: EP1 sends Call; ADT1 attends Call (ElevatorPanel1, ElevatorADT1; messages: addCall(x), callAttended(x))
    Path #2 – Test Cases: TC1: EP1.addCall(1); ADT1(1,up) · TC2: EP1.addCall(1); ADT1(2,up) · TC3: EP1.addCall(1); ADT1(2,down) · TC4: EP1.addCall(2); ADT1(1,up) · TC5: EP1.addCall(2); ADT1(2,down) · TC6: EP1.addCall(3); ADT1(2,down)
  • 41. Elevator System Version 2 (ElevatorADT1, ElevatorADT2, ElevatorPanel1, ElevatorPanel2, Scheduler, BuildingPanel) — Architectural "Paths" (Scenarios): 4
    Path #1: BP sends Call; ADT1 attends Call (BuildingPanel, Scheduler, ElevatorADT1, ElevatorADT2; messages: addCall(x,dir); getDistanceToCall(x,dir) to both ADTs; distanceToCall(x,1) from both; addCall(x,dir); callAttended(x))
  • 42. Elevator System v2 test cases
    Path #2: BP sends Call; ADT2 attends Call
    Path #3: EP1 sends Call; ADT1 attends Call (ElevatorPanel1, ElevatorADT1; messages: addCall(x), callAttended(x))
    Path #4: EP2 sends Call; ADT2 attends Call (analogous to Path #3)
  • 43. Architecture-Based Regression Testing
    Elevator System Version 1: Architectural Paths = 2; Elevator System Version 2: Architectural Paths = 4 (or 3?)
    Version 1 Path #1 (BP sends Call; ADT1 attends Call) is modified in Version 2, where the Scheduler mediates the call (getDistanceToCall(x,dir) / distanceToCall(x,1) are exchanged with both ADTs before addCall(x,dir) and callAttended(x)): need to retest! Its test cases TC1–TC12 must be re-executed.
    Version 1 Path #2 (EP1 sends Call; ADT1 attends Call) is unchanged in Version 2, where it appears as Path #3 (addCall(x), callAttended(x)): don't need to retest! Its test cases TC1–TC6 are not re-executed.
    Version 2 Path #2 (BP sends Call; ADT2 attends Call) and Path #4 (EP2 sends Call; ADT2 attends Call) are new.
  • 44. Experiment 2: the Cargo Router example
  • 45. Architecture(s) of the Cargo Router
    [Figure: a) Cargo Router, Version 1 — Port (P), Vehicle (V), Warehouse (W) and Clock (C) on Bus1; Planner (P) and NextShipment (NS) on Bus2; Port Artist (PA), Vehicle Artist (VA) and Warehouse Artist (WA) on Bus3; CargoRouter (CR) and Planner Artist (PlA) on Bus4, with Graphics Binding (GB). b) Cargo Router, Version 2 — the connection (*) is replaced with the connections (**): Bus2 is split into Bus2A and Bus2B, the artists become Artist2 components (PA2, VA2, WA2) on Bus3A/Bus3C, and CargoRouter2 (CR2) with a Translator (T) is introduced.]
  • 46. Architectural Test Case
  • 47. Related Topics » Regression Testing - Many papers on code-based regression testing - Very few (only two) papers on specification-based regression testing » SA-level Dependence Analysis » Code-level Dependence Analysis » SA-based Testing
  • 48. References » Mary Jean Harrold, Testing Evolving Software. Journal of Systems and Software, vol. 47, no. 2-3, pp. 173-181, July 1999. » Mary Jean Harrold, Testing: A Roadmap. In The Future of Software Engineering, 22nd International Conference on Software Engineering, June 2000. » Anneliese von Mayrhauser, Richard T. Mraz, Jeff Walls, Domain Based Regression Testing. Dept. of Computer Science, Colorado State University. » Gregg Rothermel and Mary Jean Harrold, A Safe, Efficient Algorithm for Regression Test Selection. In Proc. of the Conf. on Software Maintenance, Montreal, Canada, September 1993, pp. 358-367. » Yih-Farn Chen, David S. Rosenblum and Kiem-Phong Vo, TestTube: A System for Selective Regression Testing. Software Engineering Research Department, AT&T Bell Laboratories, NJ. » Samuel Bates and Susan Horwitz, Incremental Program Testing Using Program Dependence Graphs. University of Wisconsin-Madison.
  • 49. References (Cont'd) » Todd L. Graves, Mary Jean Harrold, Jung-Min Kim, Adam Porter and Gregg Rothermel, An Empirical Study of Regression Test Selection Techniques. ACM Transactions on Software Engineering and Methodology (to appear). » David Binkley, The Application of Program Slicing to Regression Testing. Loyola College, Maryland. » Mary Jean Harrold, Architecture-Based Regression Testing of Evolving Systems. The Ohio State University.