Scenario-Based Functional Regression Testing
 


Wei-Tek Tsai*, Xiaoying Bai*, Ray Paul**, Lian Yu+

*Department of Computer Science and Engineering, Arizona State University, Tempe, AZ 85259
**OASD C3I Investment and Acquisition, Washington, D.C.
+Department of Industrial Engineering, Arizona State University, Tempe, AZ 85259

Abstract

Regression testing is the technique used to verify the integrity and correctness of a modified system. There are two strategies for selecting regression test cases: retest-all and selective-retest. In most situations it is impossible to retest all the test cases of the system under test, so selective testing is necessary.

This paper proposes a scenario-based functional regression testing approach based on end-to-end (E2E) integration test scenarios. The test scenarios are first represented in a template model that embodies both test dependency and traceability. By using test dependency information, one can obtain a test slicing algorithm that detects the scenarios affected by a change, which thus become candidates for regression testing. By using traceability information, one can find affected components and their associated test scenarios and test cases for regression testing. With dependency and traceability information together, ripple effect analysis can identify all affected scenarios, whether directly or indirectly affected, and thus select the set of test cases for regression testing. This paper also provides several alternative test-case selection approaches and a hybrid approach to meet various requirements and restrictions. A web-based tool has been developed to support these regression tasks.

1 Introduction

A software system is subject to changes throughout development, maintenance, and evolution, due to a variety of reasons such as changed requirements (both functional and nonfunctional), updated technology, and upgraded hardware or software platforms [15,19,30,40,44,46]. These changes bring much risk to the software system, because change propagation may introduce new bugs, sometimes even fatal errors [28].

Regression testing has been a popular quality assurance technique, and most regression testing techniques are based on code or software design. Many techniques have been proposed for selective regression testing. Yau and Kishimoto presented a technique based on input partitioning and cause-effect graphing [52]. Fischer and his colleagues established a linear programming model based on four matrices: connectivity, reachability, test-case dependency (or cross-reference), and variable set/use, reflecting various views of the program such as control flow and test coverage [16]. This model was further extended to include dataflow information [20]. Gupta, Harrold and Soffa [17] examined traditional program slicing [41], while Bates and Horwitz [3] exercised program dependency graphs, to select regression tests based on program dataflow and control flow analysis. Agrawal, Horgan, Krauser and London presented a set of slicing techniques whose purpose is to determine the test cases on which the new and the old program may produce different output [2]. Chen and Tsai proposed class, message, constrained and recursive slicing techniques for maintaining and understanding C++ programs, based on message, class and declaration dependencies [11].

Most of the existing regression test selection techniques are code-based, using program slicing
[2,17,44], program dependence graphs [3,7], or data flow and control flow analysis [21]. This paper presents a test scenario-based methodology for functional regression testing in the context of End-to-End (E2E) integration testing, which focuses on the functional correctness of a large integrated system from the end users' point of view [45,55].

Section 2 introduces E2E testing and presents the test scenario model, including the template definition, dependencies, and traceability. Section 3 describes the model-based regression test selection technique, supported by test scenario slicing and a characterized ripple effect analysis. Section 4 briefly outlines the regression testing support tool. Section 5 draws the conclusion of the paper.

2 Test Scenario Model

This section presents the E2E testing specification and describes the test scenario model from three perspectives: test scenario specification, the dependencies between test scenarios, and the traceability to other software artifacts.

2.1 End-to-End Testing

E2E testing is focused on the functionality of the integrated system from the end user's viewpoint. It is different from integration testing, which can be performed on any subset of subsystems. It assumes that both module testing and integration testing have been performed and approved, possibly including integration testing at multiple levels. Its goal is to verify that the integrated software, including internal subsystems and external supporting systems, will collectively support the organizational core business functions [45].

E2E testing is a life cycle process that is made up of the following procedures [55]:
- Test planning: specifies the key tasks as well as the associated schedule and resources;
- Test design: designs an E2E test, including test specifications, test case generation, risk analysis, usage analysis, and scheduling of tests;
- Test execution: executes the test cases and documents the testing results; and
- Result analysis: analyzes the testing results, evaluates the testing, and performs additional testing if necessary.

E2E testing specifications, defined in thin-threads and conditions, are semi-formal representations of restructured and refined software requirements. A thin thread is a minimum usage scenario of the integrated system from the end user's point of view. Conditions identify the factors that will affect the execution of the functionality of a thin thread. Both thin threads and conditions are identified based on a thorough understanding of the system under test, including the system functionality (requirements) and the connections (static topology) and interactions (dynamic communication structure) between subsystems in the integrated system. Therefore, in addition to functional requirements, thin threads and conditions also carry software configuration information.

A thin thread serves as a basic test scenario, with input, output and actions specified. By applying sequencing and structural computations on basic and/or existing scenarios, one can also generate complex test scenarios. Test cases are generated from test scenarios by associating the input(s) of a test scenario with actual data based on various test criteria.

2.2 Test Scenario Specification

Test scenarios are semi-formal specifications that sit between descriptive system requirements and executable test cases. The test scenarios used in this paper differ from common-sense scenarios in two ways: (1) they describe detailed test requirements in terms of data, conditions and constraints; and (2) in addition to normal inputs, they also capture abnormal inputs and exceptions that the system should handle reliably.

A test scenario is described by a template, as shown in Table 1.

Section       | Attributes           | Representation
--------------+----------------------+--------------------------------
General       | ID                   | String Id;
              | Name                 | String Na;
              | Description          | String Des;
Policy        | Test Strategy        | String TSt;
              | Test Criteria        | String TCr;
Input/Output  | Input                | {{idat},act};  //multi-entry
              | Expected Output      | {{odat},deli}; //multi-entry
Execution     | Modules Involved     | {md};  //multi-entry
              | Interfaces Involved  | {inf}; //multi-entry
              | Persistent Data      | {pd};  //multi-entry
              | Execution Path       | {pa};  //ordered set
Conditions    | Pre-conditions       | {pr};  //multi-entry
              | Post-conditions      | {po};  //multi-entry
Linkage       | Requirements         | {req}; //multi-entry
              | Test Scripts         | {ts};  //multi-entry
Others        | Status               | String St;
              | Agents               | {ag};  //multi-entry
              | Schedules            | {sc};  //multi-entry
              | Risks                | float Ri;

Table 1. Template for test scenario definition
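As an illustration only (the paper's own tool is Java/EJB-based, and this sketch is not taken from it), the Table 1 template maps naturally onto a record type. The Python field names below paraphrase the table's attributes; the class name and defaults are our assumptions:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TestScenario:
    # General
    id: str
    name: str
    description: str = ""
    # Policy
    test_strategy: str = ""
    test_criteria: str = ""
    # Input/Output: (data items, triggering action) / (data items, delivery)
    inputs: List[Tuple[tuple, str]] = field(default_factory=list)
    expected_outputs: List[Tuple[tuple, str]] = field(default_factory=list)
    # Execution: components verified by the scenario
    modules: List[str] = field(default_factory=list)
    interfaces: List[str] = field(default_factory=list)
    persistent_data: List[str] = field(default_factory=list)
    execution_path: List[str] = field(default_factory=list)  # ordered set
    # Conditions
    pre_conditions: List[str] = field(default_factory=list)
    post_conditions: List[str] = field(default_factory=list)
    # Linkage: traceability to requirements and test scripts
    requirements: List[str] = field(default_factory=list)
    test_scripts: List[str] = field(default_factory=list)
    # Others: test management attributes
    status: str = "new"
    agents: List[str] = field(default_factory=list)
    schedules: List[str] = field(default_factory=list)
    risk: float = 0.0

# Example instantiation (all names hypothetical):
s1 = TestScenario(id="s1", name="Valid user login",
                  inputs=[(("user_id", "password"), "submit")],
                  modules=["auth_server"], requirements=["REQ-017"])
print(s1.name)  # Valid user login
```

The Linkage and Execution fields carry exactly the traceability links (to requirements, test scripts, and components) that the selection technique of Section 3 relies on.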
- General describes the identification of a test scenario, with its ID and Name, as well as a brief summary of the context and objective of the test scenario.
- In Policy, the user defines the testing policy in terms of test strategy and test criteria.
- Attribute Input identifies the set of data to be exercised and the action that triggers the execution. Attribute Expected Output identifies the expected output data and the execution delivery.
- Execution defines the set of software components that are verified by the test scenario, composed of the Modules, Interfaces, Persistent data storage, and the Execution Path through these components.
- Conditions defines the pre- and post-conditions for a test scenario. Pre-conditions are the factors that must be met before the execution, and post-conditions are the expected system status after the execution.
- Linkage contains the information for tracing a test scenario to Requirements and Test Scripts.
- The template also provides other attributes that are important for test management: Status, Agents, Schedules and Risks.

The number of test scenarios is large. For better management and understanding, test scenarios are arranged hierarchically into a tree structure [45]. Test scenarios that share certain commonalities can be grouped together. This grouping can be recursive; that is, a collection of lower-level scenario groups can be further grouped into a higher-level group.

2.3 Test Scenario Dependencies and Traceability

Dependence analysis provides the foundation for regression testing and ripple effect analysis. The kinds of dependence are as follows:

- Functional Dependence: A functional requirement may be tested from various perspectives, such as normal input, abnormal input, and exception handling, each represented by a set of test scenarios. Functional dependence thus identifies how a set of test scenarios is related with respect to a functional requirement.
- Input Dependence: Input dependence identifies the common inputs shared by a set of test scenarios, including input data, actions and triggering events. If any change, such as a change to a data definition or validation rule, is made to these inputs, all of these test scenarios need to be examined and possibly re-run.
- Output Dependence: Similar to input dependence, test scenarios may also share common output, either data or messages. If a change is made to an output, all the test scenarios that take it as their expected output are subject to change.
- Input/Output Dependence: An output of a test scenario may be an input of another test scenario, and any change to one of the pair of test scenarios may affect the other. Input/output dependence is used to capture the data produce-usage relationships between test scenarios. It is important when a sequence of test scenarios is executed in one session to test a complex usage of the system.
- Persistent Data Dependence: In addition to user input data, test scenarios may be input dependent, output dependent, or input/output dependent through persistent data from files or databases. Any change to the persistent data, such as a definition change, configuration change, or content change, will affect the test scenarios that use the data. Persistent data dependence is used to analyze the use relationships between test scenarios and persistent data.
- Execution Dependence: Execution dependence captures the component and interaction relationships among the individual execution paths of test scenarios. Test scenarios may be execution dependent in three ways: identical paths, covered paths and crossing paths.
- Condition Dependence: Test scenarios may share pre-conditions, post-conditions, or both. Test scenarios that are pre-condition dependent may start from the same system status, triggering event, and environment setting, while the executions of test scenarios that are post-condition dependent may result in the same system status.

Traceability enables global change impact analysis among software artifacts. In the Linkage section of the template, the attribute Requirements records all the requirements with which the test scenario is associated, and the attribute Test Scripts stores all the test cases that are generated from the test scenario. Test scenarios are also traced to software elements, including subsystem components, interfaces and data, specified in the Execution section of the template.

In general, a requirement can be traced to multiple test scenarios, and a test scenario to multiple test cases (test scripts). Each test scenario may cover multiple software components along its execution path, and each software component may be tested by multiple test scenarios from various aspects.
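Several of the dependence kinds above reduce to set intersections over scenario attributes. The following is a hypothetical illustration, not the paper's implementation: scenarios are reduced to dicts of attribute sets, and the predicate names are ours:

```python
# Each scenario is abstracted to a dict of attribute sets (illustrative only).
def input_dependent(s1, s2):
    """Scenarios share at least one input (data item, action, or trigger)."""
    return bool(s1["inputs"] & s2["inputs"])

def output_dependent(s1, s2):
    """Scenarios share at least one expected output (data or message)."""
    return bool(s1["outputs"] & s2["outputs"])

def io_dependent(s1, s2):
    """An output of one scenario feeds an input of the other (produce-usage)."""
    return bool(s1["outputs"] & s2["inputs"]) or bool(s2["outputs"] & s1["inputs"])

def persistent_data_dependent(s1, s2):
    """Scenarios touch the same persistent file or database."""
    return bool(s1["persistent_data"] & s2["persistent_data"])

s1 = {"inputs": {"user_id"}, "outputs": {"session_token"},
      "persistent_data": {"accounts_db"}}
s2 = {"inputs": {"session_token"}, "outputs": {"report"},
      "persistent_data": {"accounts_db"}}

print(io_dependent(s1, s2))               # True: s1's output is s2's input
print(persistent_data_dependent(s1, s2))  # True: both touch accounts_db
print(input_dependent(s1, s2))            # False: no shared inputs
```

Functional, execution, and condition dependence would need richer structures (requirement links, execution paths, pre/post-conditions) but follow the same pairwise-comparison pattern.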
3 Scenario-Based Regression Testing

Unlike program slicing, which is performed on source code, the test scenario slicing in this paper is performed on test scenarios, and it enables global change impact analysis based on dependency and traceability analysis. Using test scenario slicing level by level, the Ripple Effect Analysis technique directs the iterative process of regression test selection and revalidation.

3.1 Test Scenario Slicing

Definition 1. Slicing Criterion: A slicing criterion is a 2-tuple, denoted as |α|β|, meaning that upon the given attribute α, slice the corresponding attribute β. The attributes in the two fields are identical, except when a tester analyzes input/output dependence, in which case the input attribute and the output attribute are flipped over the two fields.

Definition 2. Slice: Given a test scenario s and a slicing criterion |s.α|si.α| (s ≠ si), search for the scenario(s) si that match the given scenario's attribute s.α. The result of a scenario slice is a set of scenario(s), which may full-match or semi-match the slicing criterion, depending on whether the slicing criterion is a simple attribute or a complex attribute.

3.2 Ripple Effect Analysis

Ripple Effect Analysis (REA) is used to analyze and eliminate side effects due to changes, and to ensure consistency and integrity after changes are made to software [27,43,51]. It is an iterative process of change requests, software modification, impact identification and validation. It ends when there are no more ripples.

The notion of ripple effect analysis, originally studied in the context of software programs, can be extended to all software artifacts, including specifications, design, and test cases. It can also be extended to different artifacts across different phases of the software life cycle. In particular, the REA process is not specific to any particular programming language or design paradigm.

In this research, two perspectives characterize the REA process:
- Keep consistency among scenarios, software components, and requirements, no matter where the changes are initiated. Whenever there is an inconsistency, modifications and impact analysis are required.
- At each iteration of the process, impacts are identified and validated using test scenario slicing with various slicing criteria.

The following steps illustrate the characterized REA process for regression test selection.

1. Associate a change request, wherever it originated, with a set of test scenarios. If changes come from requirements or implementations, a tester can identify the test scenarios through traceability analysis.
2. Trace each of the test scenarios identified in step 1 to software components through traceability analysis.
3. Identify and revalidate the affected software components correspondingly.
4. Revalidate the identified test scenarios with respect to inputs, outputs, conditions, test data and configurations. For each test scenario that needs to be changed, check out the corresponding test cases.
5. For each of the test scenarios, determine the slicing criteria based on its changes, and perform slicing to identify the potentially affected test scenarios with those criteria. The following are some policies for determining slicing criteria:
   - Slicing with all of the defined slicing criteria. This identifies all the dependent test scenarios.
   - Slicing with only some slicing criteria. If an experienced test engineer is sure that changes to some parts of the test scenario will not affect the other parts and will not propagate along certain directions, he can perform impact analysis with a subset of the slicing criteria. For example, if only the inputs of a test scenario need to be changed, such as the layout of an interface, the tester may slice with only the input slicing criterion and the backward input/output slicing criterion.
   - Slicing by the whole set of attribute values. This selects all the test scenarios that are dependent with respect to the attribute.
   - Slicing by a subset of values. This selects all the test scenarios that are dependent with respect to some specific aspects of the attribute.
   - Hybrid approach. For some test criteria, the tester may perform slicing by the whole set of attribute values, while for the others, slicing by a selected subset of attribute values.
6. If any scenario is identified in step 5, go to step 2; otherwise prepare for the following regression testing.

The results of this process are:
a. a set of modified software components;
b. a set of affected test scenarios; and
c. a set of test cases corresponding to the affected test scenarios.

A partial example in Figure 1 shows how a change request is propagated among test scenarios and traced to software components and test cases. The change request is first mapped to two scenarios, s1 and s2. s1 is traced to software components c1 and c3, and s2 is traced to c4; c1 and c4 are identified for changes. Performing scenario slicing with certain slicing criteria, s3, s4, s5, and s6 are identified as potentially affected scenarios, and s3, s5 and s6 are validated as candidates for changes, where s3 tests c1 and s6 tests c4. Because there is no change to s4, impact analysis stops on s4, while s3, s5, and s6 need to be further propagated using the scenario slicing technique. Test cases are also identified by traceability analysis from test scenarios to test cases, such as from s2 to t1 and from s6 to t2.

[Figure 1. Change impact analysis: a change request maps to test scenarios s1 and s2; scenarios trace to software components c1-c8 and to test cases t1-t4; affected and unaffected components, scenarios, and test cases are distinguished, with arrows showing change propagation between test scenarios and the traces from scenarios to components and test cases.]

3.3 Hybrid Regression Testing

Incorporating other testing criteria, this subsection also provides the following hybrid strategies for selecting and scheduling regression tests to balance cost and reliability.
- Risk-Based Strategy: selects high-risk test scenarios, which have high probabilities of failing and/or causing serious consequences in case of failure, and exercises them with high priority.
- Usage-Based Strategy: selects highly used test scenarios as candidates for regression testing. This strategy has been advocated in the Cleanroom methodology [14].
- Random Testing: selects a sample set of test scenarios. If the sample size is large enough, it is possible to achieve reasonable statistical confidence. This is the basis for reliability testing and assurance-based testing [44].
- Time-Wise Strategy: regression testing may be performed at different stages of software development, such as daily builds, weekly builds, milestones, and major releases.

4 Tool Support

A web-based tool has been developed to support the regression testing. The tool has a three-tier architecture and runs on the J2EE platform using Enterprise JavaBeans (EJB), as shown in Figure 2.
- At the back end lie the file/database servers: the storage for the test scenario model, change requests and regression reports.
- The middle layer performs the core functions of test data management, analysis, and query, such as test scenario definition management, dependence management, traceability management, impact analysis, and report generation.
- The front tier is the presentation layer. It provides users with various graphical views of the test data and analysis results.
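The core of the REA selection loop in Section 3.2, slicing from the change-mapped scenarios until no more ripples appear (steps 1-6), amounts to a worklist fixpoint over the dependence relation. The following is a minimal illustrative sketch, not the tool's implementation; the choice of slicing criteria is abstracted into a single hypothetical `dependent` predicate:

```python
def select_regression_scenarios(changed, all_scenarios, dependent):
    """Reduce REA steps 1-6 to a worklist fixpoint: starting from the
    scenarios mapped to the change request, repeatedly slice (here, the
    `dependent` predicate) until no new affected scenario is found."""
    affected = set(changed)
    worklist = list(changed)
    while worklist:                      # step 6: iterate until no more ripples
        s = worklist.pop()
        for t in all_scenarios:          # step 5: slice with the chosen criteria
            if t not in affected and dependent(s, t):
                affected.add(t)
                worklist.append(t)       # newly affected scenarios ripple onward
    return affected

# Toy dependence relation: s1 ripples to s3, s3 ripples to s5; s2 and s4 do not.
edges = {("s1", "s3"), ("s3", "s5")}
dep = lambda a, b: (a, b) in edges

result = select_regression_scenarios({"s1"}, ["s1", "s2", "s3", "s4", "s5"], dep)
print(sorted(result))  # ['s1', 's3', 's5']
```

Tracing the affected scenarios to components and test cases (steps 2-4) would then follow the Linkage and Execution attributes of each selected scenario.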
[Figure 2. Tool architecture: user interface components (web browser, GUI application) in the presentation tier; thin-thread definition management, condition definition management, test schedule management, risk management, and test results management components in the middle tier, communicating via HTTP, RMI, IIOP and DCOM; test databases in the data tier, accessed via ODBC/JDBC.]

5 Conclusion

Regression testing on a large integrated software system is hard because the number of test cases is huge and the change impacts may be widely spread. For high-level integration regression testing, the retest-all approach is usually time and resource consuming.

To counter this challenge, this paper proposes a test scenario-based approach for functional regression testing. Test scenarios are semi-formal representations of detailed system requirements, with test inputs, outputs, and conditions defined. By using test dependency information, one can obtain a test slicing algorithm that detects the scenarios affected by a change, which thus become candidates for regression testing. By using traceability information, one can find affected components and their associated test scenarios and test cases for regression testing. With dependency and traceability information, one can use ripple effect analysis to identify all affected scenarios, directly or indirectly, and thus select the set of test cases for regression testing.

References

[1] H. Agrawal and J. Horgan, "Dynamic Program Slicing", Proceedings of the ACM SIGPLAN'90 Conference on Programming Language Design and Implementation, 1990, pp. 246-256.
[2] H. Agrawal, J.R. Horgan, E.W. Krauser and S.A. London, "Incremental Regression Testing", Proceedings of the Conference on Software Maintenance, 1993, pp. 348-357.
[3] S. Bates and S. Horwitz, "Incremental Program Testing Using Program Dependence Graphs", Proc. 20th ACM Symposium on Principles of Programming Languages, Jan. 1993, pp. 384-396.
[4] G. Baradhi and N. Mansour, "A Comparative Study of Five Regression Testing Algorithms", Proceedings of the Australian Software Engineering Conference, 1997, pp. 174-182.
[5] B. Beizer, Black-Box Testing: Techniques for Functional Testing of Software and Systems, John Wiley & Sons, Inc., 1995.
[6] R.V. Binder, Testing Object-Oriented Systems: Models, Patterns, and Tools, Addison-Wesley, 1999.
[7] D. Binkley, "Semantics Guided Regression Test Cost Reduction", IEEE Transactions on Software Engineering, Vol. 23, No. 8, Aug. 1997, pp. 498-516.
[8] S.A. Bohner and R.S. Arnold, Software Change Impact Analysis, IEEE Computer Society Press, Los Alamitos, 1996.
[9] G. Booch and T. Quatrani, Visual Modeling With Rational Rose and UML, Addison-Wesley, 1998.
[10] J.M. Carroll, Scenario-Based Design: Envisioning Work and Technology in System Development, John Wiley & Sons, Inc., New York, 1995.
[11] X. Chen, W.T. Tsai, H. Huang, M. Poonawala, S. Rayadurgam, and Y. Wang, "Omega: An Integrated Environment for C++ Program Maintenance", Proc. Int'l Conf. on Software Maintenance, 1996, pp. 114-123.
[12] Y.F. Chen, D.S. Rosenblum, and K.P. Vo, "TestTube: A System for Selective Regression Testing", Proc. 16th Int'l Conf. on Software Engineering, May 1994, pp. 211-222.
[13] D. Harel, "From Play-In Scenarios to Code: An Achievable Dream", IEEE Computer, Vol. 34, No. 1, Jan. 2001, pp. 53-60.
[14] M. Dyer, The Cleanroom Approach to Quality Software Development, Wiley, New York, 1992.
[15] S.G. Eick, T.L. Graves, A.F. Karr, J.S. Marron, and A. Mockus, "Does Code Decay? Assessing the Evidence from Change Management Data", IEEE Transactions on Software Engineering, Vol. 27, No. 1, Jan. 2001, pp. 1-12.
[16] K. Fischer, F. Raji and A. Chruscicki, "A Methodology for Re-testing Modified Software", Proc. National Telecommunications Conf., 1981, pp. B6.3.1-B6.3.6.
[17] R. Gupta, M.J. Harrold and M.L. Soffa, "An Approach to Regression Testing Using Slicing", Proceedings of the Conference on Software Maintenance, 1992, pp. 299-308.
[18] J. Han, "Supporting Impact Analysis and Change Propagation in Software Engineering Environments", Proceedings of the 8th IEEE International Workshop on Software Technology and Engineering Practice, July 1997, pp. 32-41.
[19] J. Hartmann and D.J. Robson, "Approaches to Regression Testing", Proceedings of the Conference on Software Maintenance, 1988, pp. 368-372.
[20] J. Hartmann and D.J. Robson, "Techniques for Selective Revalidation", IEEE Software, Vol. 7, No. 1, Jan. 1990, pp. 31-36.
[21] M.J. Harrold and M.L. Soffa, "An Incremental Approach to Unit Testing During Maintenance", Proceedings of the Conference on Software Maintenance, 1988, pp. 362-367.
[22] S. Horwitz, T. Reps and D. Binkley, "Interprocedural Slicing Using Dependence Graphs", ACM Transactions on Programming Languages and Systems, Vol. 2, 1990, pp. 29-60.
[23] H. Huang, W.T. Tsai and S. Subramanian, "Generalized Program Slicing for Software Maintenance", Proceedings of the International Conference on Software Engineering and Knowledge Engineering, 1996, pp. 261-268.
[24] I. Jacobson, M. Christerson, P. Jonsson, and G. Overgaard, Object-Oriented Software Engineering: A Use Case Driven Approach, Addison-Wesley, Reading, MA, 1992.
[25] I. Jacobson, G. Booch, and J. Rumbaugh, The Unified Software Development Process, Addison-Wesley, Reading, MA, 1999.
[26] P.C. Jorgensen, Software Testing: A Craftsman's Approach, CRC Press, 1995.
[27] J.K. Joiner and W.T. Tsai, "Ripple Effect Analysis, Program Slicing and Dependency Analysis", TR-93-84, Computer Science Department, University of Minnesota, 1993.
[28] C. Jones, "Software Change Management", Computer, Vol. 29, No. 2, Feb. 1996, pp. 80-82.
[29] D. Kulak and E. Guiney, Use Cases: Requirements in Context, Addison-Wesley, Reading, MA, 2000.
[30] M.M. Lehman and L.A. Belady, Program Evolution: Processes of Software Change, Academic Press, 1985.
[31] H.K.N. Leung and L. White, "Insights into Regression Testing", Proceedings of the Conference on Software Maintenance, 1989, pp. 60-69.
[32] H.K.N. Leung and L. White, "A Study of Integration Testing and Software Regression at the Integration Level", Proc. Conf. on Software Maintenance, Nov. 1990, pp. 290-301.
[33] V. Matena and B. Stearns, Applying Enterprise JavaBeans: Component-Based Development for the J2EE Platform, Addison-Wesley, Boston, 2000.
[34] A.K. Onoma, W.T. Tsai, M. Poonawala, and H. Suganuma, "Regression Testing in an Industrial Environment", Communications of the ACM, Vol. 41, No. 5, May 1998, pp. 81-86.
[35] M. Poonawala, S. Subramanian, R. Vishnuvajjala, W.T. Tsai, R. Mojdehbakhsh, and L. Elliot, "Testing Safety-Critical Systems: A Reuse-Oriented Approach", Proc. of the 9th International Conference on SEKE, 1997, pp. 271-278.
[36] R.S. Pressman, Software Engineering: A Practitioner's Approach, 5th edition, McGraw-Hill, 2000.
[37] E. Roman, Mastering Enterprise JavaBeans and the Java 2 Platform, Enterprise Edition, John Wiley & Sons, Inc., New York, 1999.
[38] G. Rothermel and M.J. Harrold, "Selecting Regression Tests for Object-Oriented Software", Proc. Conf. on Software Maintenance, Sept. 1994, pp. 14-25.
[39] G. Rothermel and M.J. Harrold, "Analyzing Regression Test Selection Techniques", IEEE Transactions on Software Engineering, Vol. 22, No. 8, Aug. 1996, pp. 529-551.
[40] R.C. Sugden and M.R. Stens, "Strategies, Tactics and Methods for Handling Change", Proceedings of the 1996 IEEE Symposium and Workshop on Engineering of Computer-Based Systems, 1996, pp. 457-463.
[41] F. Tip, "A Survey of Program Slicing Techniques", Journal of Programming Languages, Vol. 3, 1995, pp. 121-189.
[42] W.T. Tsai, W. Xie, I.A. Zualkernan, and S.K. Musukula, "A Framework for Systematic Testing of Software Specifications", Proc. of the International Conference on Software Engineering and Knowledge Engineering, 1993, pp. 380-387.
[43] W.T. Tsai, R. Mojdehbakhsh and F. Zhu, "Ensuring System and Software Reliability in Safety-Critical Systems", Proceedings of the 1998 IEEE Workshop on Application-Specific Software Engineering Technology (ASSET-98), 1998, pp. 48-53.
[44] W.T. Tsai, X. Bai, R. Paul, G. Devaraj and V. Agarwal, "An Approach to Modify and Test Expired Window Logic", First Asia-Pacific Conference on Quality Software, 2000, pp. 99-108.
[45] W.T. Tsai, X. Bai, R. Paul, W. Shao, V. Agarwal, T. Sheng, B. Li and J. Honnavalli, "End-to-End Integration Testing Design", Department of Computer Science, Arizona State University, 2001.
[46] Y. Wang, W.T. Tsai, X.P. Chen and S. Rayadurgam, "The Role of Program Slicing in Ripple Effect Analysis", Proc. of Software Engineering and Knowledge Engineering, 1996, pp. 369-376.
[47] Y. Wang, R. Vishnuvajjala and W.T. Tsai, "Sequence Specification for Concurrent Object-Oriented Applications", Proc. of IEEE WORDS, 1997, pp. 163-170.
[48] M. Weiser, "Program Slicing", IEEE Transactions on Software Engineering, Vol. 10, No. 4, 1984, pp. 352-357.
[49] L. White and H.K.N. Leung, "On the Edge: Regression Testing", IEEE Micro, Vol. 12, No. 2, April 1992, pp. 81-84.
[50] L.J. White, V. Narayanswamy, T. Friedman, M. Kirschenbaum, P. Piwowarski, and M. Oha, "Test Manager: A Regression Testing Tool", Proc. Conf. on Software Maintenance, Sept. 1993, pp. 338-347.
[51] S.S. Yau, J.S. Collofello and T. MacGregor, "Ripple Effect Analysis of Software Maintenance", Proc. IEEE COMPSAC'78, 1978, pp. 60-65.
[52] S.S. Yau and Z. Kishimoto, "A Method for Revalidating Modified Programs in the Maintenance Phase", Proc. IEEE COMPSAC'87, 1987, pp. 272-277.
[53] DoD OASD C3I Investment and Acquisition, "Year 2000 Management Plan", 1999.
[54] DoD OASD C3I Investment and Acquisition, "Repairing Latent Year 2000 Defects Caused by Date Windowing", 2000.
[55] DoD OASD C3I Investment and Acquisition, "End-to-End Integration Testing Guidebook", 2001.
[56] IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990, Dec. 1990.
[57] ITU-T Recommendation Z.120: Message Sequence Chart (MSC), ITU-T, Geneva, 1996.