EDS AGREEMENT NUMBER [XX XX]
CALIFORNIA DEPARTMENT OF CORRECTIONS AND REHABILITATION (CDCR)
EXHIBIT I MANAGEMENT REQUIREMENTS RESPONSE
INITIAL FINAL PROPOSAL (CDCR-5225-113)
EDS SOMS STATEMENT OF WORK

6.5 Testing (VI.D.8. Testing)

This subsection must include a narrative describing the QBP's approach to testing and a sample Test Plan used by the QBP on another project as described in Section VI.D.8 – Testing.

6.5.A Testing (VI.D.8. Testing)

Req. Num.: M-43
Requirement: Contractor shall describe the overall testing approach and methodology used for the SOMS Project. The Bidder must submit a narrative describing the testing approach and methodology with their proposal response as identified in Section VIII – Proposal and Bid Form.
Response / Reference: Section 6.4

Bidder Response Detail:

As a leader in information technology services, EDS has invested heavily in establishing application and solution testing as a core competency. We use disciplined testing practices to ensure the functional quality, performance, and security of applications. Our services provide a control point for our clients to mitigate their risk as they seek to develop and modernize applications that utilize modern infrastructure and SOA architectures. Our recent acquisition of RelQ Software, a pioneer in the independent testing and V&V services market, positions the EDS Global Testing Practice as a leader in the global testing market. Our global knowledge can be applied locally, helping CDCR to improve the quality of the software delivered – measurably reducing defects and rework while improving application coverage and speed to delivery.

Approach to SOMS Testing

EDS uses a unique, comprehensive approach to risk-based testing – one driven by business requirements. Our Global Testing Practice identifies high-risk, high-value business requirements and concentrates testing activity on those requirements to minimize risk and deliver the most value to CDCR.
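As an illustration only (the requirement names and the 1-to-5 rating scales below are invented, not drawn from the SOMS requirements), risk-based prioritization of this kind can be sketched as a score that combines business value with failure risk:

```python
# Illustrative sketch of risk-based test prioritization: rate each
# requirement for business value and failure risk, then concentrate
# testing on the highest-scoring requirements first.
# All names and ratings below are hypothetical.

def priority(business_value: int, failure_risk: int) -> int:
    """Combine 1-5 value and risk ratings into one priority score."""
    return business_value * failure_risk

requirements = [
    {"id": "REQ-001", "name": "offender booking", "value": 5, "risk": 4},
    {"id": "REQ-002", "name": "report formatting", "value": 2, "risk": 2},
    {"id": "REQ-003", "name": "sentence calculation", "value": 5, "risk": 5},
]

# Highest-risk, highest-value requirements come first in the test order.
ordered = sorted(requirements,
                 key=lambda r: priority(r["value"], r["risk"]),
                 reverse=True)

for r in ordered:
    print(r["id"], priority(r["value"], r["risk"]))
```

In practice a real scoring model would weigh more factors (regulatory exposure, usage frequency, change history), but the ordering idea is the same: testing effort follows the score, not the requirement count.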
Team EDS' approach to testing is based on our commitment to delivering a product of very high quality for acceptance testing. The objective of our rigorous testing approach is to validate that all requirements have been completely tested and all issues resolved before the start of user acceptance testing. EDS' philosophy is that testing is an integral part of virtually every phase of the system development life cycle. We begin our test planning at the earliest stages of the project, and it is tailored to the development methodology for the SOMS project. EDS' approach to testing is governed by specific workflows, roles, and artifacts prescribed by the quality assurance (QA) process defined in our project management approach.

The primary goal of the testing effort is to increase the probability that when the system is launched, it behaves as expected under all circumstances and meets the defined requirements as set forth in the RFP. In support of this goal, the following testing objectives are defined:

- Verify that all end-user paths, inclusive of all interfaces to and through the system, perform correctly (end-to-end testing)
- Verify that the user interface screens perform correctly
- Verify that the system complies with defined response times
- Verify that all devices, such as workstations, printers, and scanners, perform correctly
- Verify that the environments are operationally ready and production worthy
- Verify that all sites are operationally ready and production worthy
- Verify that all system utility functions (backup, recovery, failsafe) perform correctly
- Incorporate a test design that not only provides knowledge transfer to the CDCR, but also minimizes test rework associated with modifications to the system after production deployment
- Perform test activities that support both defect prevention and defect detection
- Incorporate automated test design to create reusable and maintainable scripts, not only to expedite the testing process during initial development, but also to provide these testing assets to the client at project end
- Verify that requirements are testable
- Verify traceability to the requirements to validate test coverage

The most important step in the successful testing of a component or a complete software system is capturing a clear and deep understanding of the functional requirements. Our approach to requirements gathering makes sure that the requirements are testable.
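The traceability objective above – verifying that every requirement maps to at least one test case – can be sketched as a simple coverage check; the requirement and test-case identifiers here are hypothetical:

```python
# Illustrative requirements-to-test-case traceability check.
# A traceability matrix maps each test case to the requirements it
# exercises; any requirement left unmapped is a coverage gap.
# Identifiers are hypothetical examples.

traceability = {
    "TC-101": ["REQ-001", "REQ-002"],
    "TC-102": ["REQ-003"],
}
requirements = {"REQ-001", "REQ-002", "REQ-003", "REQ-004"}

# Flatten the matrix into the set of requirements with test coverage.
covered = {req for tests in traceability.values() for req in tests}
uncovered = requirements - covered  # requirements with no test case

print("coverage:", len(covered), "of", len(requirements))
print("untested:", sorted(uncovered))
```

Tools such as HP Quality Center maintain this mapping for you; the point of the sketch is only that traceability reduces to a set-difference check that can be run before any test execution begins.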
Testability in a requirement is determined by asking questions about the requirement to determine whether it is complete, unambiguous, consistent with other system requirements, correct, and described at an elemental level for testing purposes (not as a compound set of conditions). Our approach is adaptable to any software development methodology – whether waterfall, iterative, spiral, or agile. We align with the selected methodology through our Testing V Model, depicted in Figure 6.5-1, EDS' Testing V Model, verifying compliance with the development process and validating the quality of application requirements, design, and code. Our Enterprise Testing Method (ETM), of which the Testing V Model is a part, is a comprehensive testing framework that provides the structure, processes, tools, and templates to make sure that CDCR receives consistent, high-quality testing services.
The purpose of the ETM is to increase productivity and the quality of EDS' testing practices, resulting in reduced risk of failed or faulty implementations by improving the comprehensiveness and focus of our testing activities.

[Figure 6.5-1, EDS' Testing V Model: the Enterprise Testing Method (ETM) spans strategy, design techniques, management, measurement, and production support, with risk-based, requirements-driven, end-to-end testing activities (requirements analysis, ambiguity and risk analysis, systematic test design, traceability, test preparation, execution and monitoring, metrics collection and reporting, and close-down) across the design, construction, and testing stages.]

Testing Methodology – Overview of Enterprise Testing Method (ETM)

As a comprehensive testing framework, the ETM provides test coverage for all hardware and software that make up a delivered system, providing direction and guidance in developing a testing strategy; planning, designing, and executing the required testing; managing and measuring all testing activities; and providing post-deployment testing support to an application or system in production. EDS bases its philosophy, process, and methodology on the following seven principles.

Meeting Business Needs – The overriding objective of testing at all stages is to confirm that a system is fit for its business purpose. The correctness of test results is judged at all stages of development in terms of whether the system performs in a way that meets the business needs.

Defect-Centric Testing – The objective of designing and running a test is to find a defect. Testing can never prove that a computer system works; it can only build up a level of confidence. Confidence is derived from finding errors and fixing them.
Testing Throughout the Life Cycle – Testing must be performed on all products at all stages of the implementation process. Software products and associated design and user documents emerge throughout the development process. Testing must therefore be planned as an integral part of the iterative life cycle.
Comprehensive Requirements Traceability – EDS uses an effective combination of Borland StarTeam and HP Quality Center for comprehensive requirements traceability. We have used these tools successfully on many large projects.

Independent Testing – A deliverable should be tested by someone other than its creator. Independent testing is more effective than testing performed by the author of a deliverable. The active and constant involvement of users in the project makes certain that an independent perspective can be applied.

Repeatable Testing – Tests must be repeatable. To make a test repeatable, it must be documented.

Well-Managed Test Data – Given the complexity and interdependence among internal and external systems with regard to test data, a formal test data management strategy for each SOMS phase and area (such as eOMIS, interfaces, data conversion) will be developed once requirements and design documentation are sufficiently complete and available for review. At a high level, the types of data required for each level of testing include:

– Application data, basic configuration
– Application data, transactional
– Input data from external source systems
– Output data for external destination systems
– List of databases and files
– Summary of record and transaction types and volumes
– Data converted from legacy systems
– Transient data requirements
– Live data from other SOMS projects

The ETM delivers value by enabling projects, programs, and organizations to define the most appropriate testing approach for their needs, to formalize testing requirements with clients, and to execute and manage testing across an effort of any size. The ETM defines testing activities across deliverable components throughout the entire deployment life cycle.
It helps testing resources clearly define, execute, and manage testing activities – lowering risk and increasing solution quality by providing a testing framework that is comprehensive enough to guarantee consistency, yet flexible enough to allow tailoring that adapts the ETM to the specific testing needs of any client. The ETM has associations with other EDS methods, processes, and strategies. The ETM links, by direct reference, to other methods for cross-discipline functions such as project management and requirements determination. Specifically, the ETM fully aligns with System Lifecycle 3 (SLC3), the software development life cycle that has been selected for the SOMS project. It also fully supports the unique needs of SOA testing. Please refer to proposal Section 6.04 – System Design and Development for a detailed description of SLC3.

The EDS ETM includes several levels of testing that are essential to the successful implementation of enterprise solution engagements. The following section provides a high-level overview of the focus of each of the levels with specific regard to enterprise implementations. ETM testing focuses on the following:

Application components
– Unit testing
– Component integration testing

Services
– Service testing

Integrated systems
– System testing
– System integration testing

System performance
– Performance testing

End-to-end intersystem testing
– Consolidated integration testing

Formal acceptance
– User acceptance testing
– Operational acceptance testing

EDS will develop a Master Test Plan that provides a framework for all testing to be performed for the SOMS project. The SOMS solution is thoroughly tested as specified in the Master Test Plan. We provide "out of the box" testing to verify that the COTS infrastructure products are functioning properly. Negative testing scenarios are included. The Master Test Plan includes testing for the configured and programmed items, the programs and reports, and a complete end-to-end test, including testing of interfaces to external systems. For each SOMS release, EDS does the following:

- Develop a Release Test Plan and testing schedule
- Conduct all test levels specified in the Release Test Plan
- Develop a focused Test Plan for each test level
- Facilitate and coordinate the staff testing effort
- Provide support and software corrections during user acceptance testing
- Submit a written document summarizing test results

There are many security concerns related to the use of real or live production data throughout the project life cycle. The SOMS project uses the services of many people who are not employed by CDCR and may not have the proper security clearances to view and use personal information. Consequently, there are many security, privacy, and legal issues that must be considered when using production data. Team EDS, in collaboration with CDCR, reviews these issues through CDCR's security and legal departments and determines the necessary steps to make sure that production data is used properly. Testing of the system follows recommendations and requirements determined by CDCR to make sure that test data is handled properly in testing environments.

Managing the Testing Effort

Team EDS has extensive experience in managing large testing efforts in public sector and commercial projects. We believe in leveraging our past experience and established methodologies to define a clear and effective test strategy. Testing is integrated into every facet of our development cycle. The primary goal of the testing effort is to make sure that when the system is placed in the production environment, it behaves as expected under all normal circumstances and meets the requirements as defined in the RFP.

Test Planning

The SOMS Test Plans begin during the requirements validation stage and are presented as deliverables. Each section of the Test Plans also specifies the components of the SOMS Test Reports where appropriate. The Test Plans include a description of the tests being conducted, any special requirements for data, and the goal for each test. The Test Report records the results of the tests, presents the capabilities and deficiencies for review, and provides a means of assessing progress toward the next stage of development or testing. When defects are found, the Test Report provides the error description and the action taken to correct the affected component, subsystem, or system. It also provides appropriate statistics for each component. The SOMS Test Plans are living documents and must be reviewed and updated as needed.
The initial test descriptions are defined based on the baseline requirements and are elaborated on during the Design Phase. During the system test phases, software engineers, testers, and the Quality Assurance team may develop additional or more detailed descriptions and test cases. All test artifacts are maintained in the central Borland StarTeam repository. Within each test phase, various test conditions are tested and validated, as shown in Table 6.5-1, Test Phases and Conditions Tested.

Table 6.5-1, Test Phases and Conditions Tested

Test Condition                   System    Performance
Fault Insertion                    ●
String Testing (end to end)        ●
Error Handling                     ●          ●
Stress                                        ●
Performance                        ●          ●
Portal Functionality               ●          ●
Functional                         ●
Regression                         ●
Interface Testing                  ●          ●
Security                           ●          ●
Operational Readiness              ●
User Interface                     ●
Audit Trails                       ●
Backup/Recovery/Failsafe           ●
Usability                          ●
Conversion                         ●          ●

As a precursor to testing, Team EDS develops a comprehensive plan to guide the activities of the levels and stages of testing. The Test Plans detail the complete end-to-end testing of the system and all of its components. To guarantee the completeness of the SOMS Test Plans, formal system requirements identified during the Requirements Validation Phase of the release are used to drive the design and execution of the Test Plans. At the inception of the project, a Test team is formed that is responsible for formulating the Test Plans. The team defines the criteria for evaluating the testing effort, designs test scenarios for both normal and exceptional situations, and evaluates the results of system-level tests. As part of the creation of the Test Plans, Team EDS develops a comprehensive test database to serve as the baseline information required to test the system fully. As the various levels of testing are executed, the data used to verify the viability of each aspect of the system is used to augment the test database. The system Test Plans are developed for SOMS during the application software system test phase. Part of the plan is to break the system test into cycles and test variants within cycles. Expected results are prepared for each test variant. Test variants are grouped into test scripts for execution.

Test Data Creation

Team EDS plans to use "real world" data for testing to be able to age the data in the same way a production case might change characteristics. To maintain confidentiality, we
  8. 8. INITIAL FINAL PROPOS AL (CDCR -5225-113) scramble critical identifiers such as social security numbers and dates of birth. We also enforce multiple layers of physical data safeguards, including confidentiality agreements, staff training, secure work sites, and use of shredders. This means that we must start data conversion early enough to have a usable test database when the developers are ready to begin unit testing. The Data Conversion team feeds data to the test environment. As the team executes iterations of cleansing and sanitizing the converted legacy data and loading it into the electronic Offender Management Information System (eOMIS) data structure, the data become more reliable. The SMEs, who have been working on offender management for years, guide the creation of test data for the test scenarios established for SOMS. The test data created from legacy data conversion processes is used to develop and test all aspects of SOMS including interfaces and business processes. For performance testing, we recognize that it is difficult to generate sufficient data artificially before the real data is created, and the real data only builds up gradually during back- scanning and data conversion. The test data for performance testing is generated from the storage area network (SAN) backup database in the production environment (Release 1 onwards) and the data conversion database in Release 1, repeating load tests as the back- scanning and converted legacy data builds up, for example, at 25 percent, 50 percent, and 100 percent load. Defect Tracking and Resolution It is not sufficient to detect issues and defects. It is critical that they be recorded, assigned for resolution, and retested after they are fixed. Borland StarTeam supports very efficient generation, assignment, and tracking of issues and defects. Issues and defects can also be linked using Realization connectors to model elements that are responsible for them. 
Issue tracking is important on complex projects, especially those with multiple state and federal requirements that must be met. This software test and defect tracking activity begins the moment a software discrepancy is detected and continues throughout all iterations.

Test Environment

Unit and string testing is performed by the application developers in the development test environments. All system testing, acceptance testing, regression testing, and performance testing is performed in dedicated and controlled test environments. EDS' Infrastructure team works in coordination with the EDS Test team to confirm that all components necessary for testing are part of the environment. The EDS Test team does the following:

- Works closely with the configuration manager to migrate software ready for testing to the test environment
- Loads all data needed to perform testing, including new case scenarios and other dependencies
- Schedules batch processes
- Identifies performance issues and defects in the test environment that may not be related to software
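The environment-preparation tasks above amount to a readiness gate that must pass before test execution begins. A sketch of such a gate, with component names invented for illustration:

```python
# Illustrative test-environment readiness gate: execution does not
# begin until every required component checks out. The component
# names and their readiness states are hypothetical.

def check_environment(components: dict) -> list:
    """Return the names of components that are not ready."""
    return [name for name, ready in components.items() if not ready]

environment = {
    "application build migrated": True,
    "test data loaded": True,
    "batch schedule configured": False,
    "interfaces stubbed or connected": True,
}

blockers = check_environment(environment)
if blockers:
    print("testing blocked by:", blockers)
else:
    print("environment ready for test execution")
```

In a real project each entry would be backed by an automated smoke test rather than a hand-set flag, but the decision rule is the same: any blocker stops the start of the test cycle.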
The migration of software to the test environment is managed through a tightly controlled configuration management process, using the configuration management tool CA Cohesion.

Communication Approach

Written and oral communication is critical to successful management, and the Testing Phase is no exception. EDS uses a standard set of processes and tools to make sure that our Communication Plan is defined and followed throughout the Testing Phase. Tools include regular status meetings and reports. The status of system tests is part of the regular Weekly Project Status Report delivered to CDCR SOMS Project Management. Reporting is not handled strictly through producing and delivering electronic and hard-copy documents to the State. Testing Phase plans and results are also shared by conducting walkthroughs and system demonstrations. These interactive sessions are conducted by the EDS Test team with key stakeholders and are supported by documentation of system outputs before and after changes.

Test Team Reports, Deliverables, and Performance Metrics

The Test team performs an essential quality assurance (QA) role by subjecting planned system changes to a series of inspections and process executions to determine the behavior of the system. The testing activities are like scientific experiments: they include detailed specifications of the planned change, criteria for the conditions of the tests, multiple executions of software code under different scenarios, and rigorous reporting and analysis of outcomes.

[Figure 6.5-2, Sample SOMS System Test Status Report (Period to Period)]

An excerpt from a sample system test status report for the SOMS application is presented graphically in Figure 6.5-2, Sample SOMS System Test Status Report (Period to Period), and includes the status of test variant execution.
This example shows that of the 2,443 total test variants for SOMS, 50 percent have been executed and 40 percent have passed. In this example, "passed" means executed successfully without generating any Level 1 or Level 2 incidents.

Reports

The topics shown in Table 6.5-2, Communication Topics, are covered on a scheduled basis to communicate the progress and status of testing. They are included in the weekly project status report that provides details on the oversight of the project to the EDS and CDCR Project Management teams.

Table 6.5-2, Communication Topics

Management Summary: Report on the activities of the Test team for the previous period and activities planned for the period ahead.

Migration Schedule: Report of planned release content, such as work products to be included. It is used by Team EDS to schedule fixes, and by project management to manage releases.

Testing Completed: Report of testing completed in the reporting period. Indicates issues and unresolved cases of testing not meeting expectations.

Incidents in Rework: This report, a logical adjunct to the Testing Completed report, identifies all rework items, who is doing the rework, and the nature of the rework, such as recoding or revisiting the interpretation of requirements.

Application Defects: Summary and detailed report of application defects found during testing.

Approach to Testability

The term "testability" refers to two distinct design practices, Design for Test and Design for Diagnosis, and it is important to distinguish between them. Design for Test refers to the use of good design practices that facilitate testing. In this sense of the word, design engineers typically verify that good testability practices are followed during the traditional design process. Design for Diagnosis refers to the optimization of a design and accompanying test procedures to facilitate good diagnostics.
In Design for Diagnosis, "testability" involves the assessment of the fault detection and fault isolation capability of a system or device, as well as the optimization of test point placement, functional partitioning, and diagnostic strategies needed to meet a system's testability requirements. At EDS we use both of these methods to enhance the testability of our systems, and we apply these concepts to make sure that SOMS is built to exceed the State's expectations. EDS approaches testability as a design characteristic that allows the status of an item (operable, inoperable, or degraded) to be determined, and faults within it to be isolated, in a timely manner. We have incorporated several best practices into our standard development methodology to enhance testability through the Design, Integration, and Testing phases.
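The operable/inoperable/degraded status determination described above can be sketched as a simple rule over component health checks; the component list and criticality flags are hypothetical:

```python
# Illustrative status determination: an item is operable when every
# component passes its health check, inoperable when a critical
# component fails, and degraded when only non-critical components
# fail. The components below are invented for illustration.

def system_status(components: list) -> str:
    failed = [c for c in components if not c["healthy"]]
    if not failed:
        return "operable"
    if any(c["critical"] for c in failed):
        return "inoperable"
    return "degraded"

components = [
    {"name": "database", "critical": True, "healthy": True},
    {"name": "report printer", "critical": False, "healthy": False},
    {"name": "application server", "critical": True, "healthy": True},
]

print(system_status(components))  # only a non-critical fault: degraded
```

The fault-isolation half of the practice corresponds to the `failed` list itself: knowing which checks failed, and where the test points sit, is what lets faults be isolated in a timely manner.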
The implementation of eOMIS accelerates the testability of SOMS and reduces total testing time. The eOMIS version proposed for SOMS is a 91 percent fit to the SOMS requirements. The modules of the proposed eOMIS are proven, tested, and established at other client locations, reducing both risk and the time needed to establish SOMS.

System Testability Best Practices

Build a cleaner system in the first place, so that fewer test-debug-fix cycles of rework are needed. As part of this best practice, the following quality practices are incorporated into the SOMS project:

- Use requirements validation
- Design to facilitate testing
- Build from precise specifications (designs)
- Perform consistent code inspections
- Evaluate and validate unit testing coverage
- Automate execution of unit tests to enable frequent testing
- Test early; test small pieces first; developers perform unit testing
- Use generational capabilities of tools to reduce human error
- Use source code control

Establish test entry criteria. We obtain consensus with the project manager and developers on the criteria for determining whether the system is ready to test, and whether the test environment is ready.

Increase the early involvement of testers in the project. The SOMS project Test team is engaged early in the project and has enough time to master the system, determine how to test it, prepare the test cases, and fully validate the test environment. The testers actively participate in developing the overall project Work Plan. In fact, we plan to move the Functional Manager and part of the Functional team that validates and defines the requirements immediately to the system test planning task.

Verify that the testers have a thorough understanding.
Team EDS system testers understand the following:

- The project goals and success factors
- The project context
- The system's functionality
- The system's risks and vulnerabilities
- The test equipment, procedures, and tools
The testers use Design for Test reviews to instrument and place probes into the system being tested. Design for Test is intended to give black-box testers access to the hidden internal behavior of the system.

Encourage a common system architecture and component reuse. Although these areas are not generally the primary concern of testers and QA analysts, a common architecture across a family of systems and planned component reuse can drastically shorten the test time and the overall development cycle time.

Stabilize the system being tested as early as possible. The SOMS applications are placed under change control and version control, and a cutoff date is established beyond which no changes are allowed except emergency show-stopper fixes (the code freeze date) before moving into the System Test Phase. In some cases, EDS sets the cutoff early, even if this means reducing functionality, to allow sufficient time after the freeze for final testing and fixing of the stabilized version (at least 2 weeks for small systems and at least 1 month for large, complex ones).

Stabilize and control the test environment. An argument can be made that more stop-and-go interruptions of test execution are caused by gremlins in the test environment than by any other cause. Team EDS establishes and tests the testing environment well in advance of the scheduled start of testing. The test environment includes tools to manage the test configuration, diagnose problems in the environment, and easily reconfigure the environment from test to test.

Benefits of EDS' Testing Approach

EDS believes our proposed testing approach is consistent with the CDCR requirements and provides enhanced benefits. Table 6.5-3, Features and Benefits of Our Testing Approach, highlights the features and benefits of our approach.
Table 6.5-3, Features and Benefits of Our Testing Approach

Feature: Early, up-front test planning, where test conditions and cycles are developed as part of the specification development.
Benefit: Verifies that SOMS directly supports the CDCR business model and meets the needs of each stakeholder.

Feature: Provide CDCR SMEs and stakeholders the opportunity to review and approve the Test Model and Plan. Build milestones into the Test Plan that the CDCR can use when assessing the quality and thoroughness of the test process.
Benefit: Verifies that the SOMS application directly supports the CDCR business model and meets the needs of each stakeholder.

Feature: Execute integrated test packages based on realistic business cases, with substantial input from the CDCR SMEs.
Benefit: Verifies that the application supports the business functions being tested.

Feature: Develop well-documented, repeatable test models to facilitate analysis and regression testing. Automated testing tools are used, where possible, to help automate the regression testing process.
Benefit: Reduces risk, cost, and the testing time frame.

Feature: Follow the EDS Enterprise Testing Method, to make sure that the Test Plan is complete and aligned with the design process.
Benefit: Verifies stage containment and minimizes the duration and cost of testing.

Feature: Use a tightly controlled test environment that is separate from the development environment, so that fixes can be made to the application without affecting user testing.
Benefit: Verifies that application components are tested in a production-like environment and that code versions are managed accurately.

Feature: Provide comprehensive written and oral communication of testing efforts at all levels, at scheduled periods.
Benefit: Enhances communication with CDCR, verifies that Team EDS meets business needs and stakeholder expectations, and avoids rework.

Feature: Use third-party tools to manage test execution, including defect tracking, version control, source code management, automated test scripts, multiple simultaneous environments, database management, and status reporting.
Benefit: Provides the following benefits to CDCR: verification of completed requirements; reduced risk, cost, and time frame for testing; and close interaction with CDCR to make sure that Team EDS delivers – at a minimum – what is requested and expected.

6.5.B Test Plan Requirements (VI.D.8.a. Test Plan Requirements)

Req. Num.: M-44
Requirement: The Contractor shall incorporate the testing approach into a comprehensive Test Plan complying with CDCR testing practices and IEEE Std. 829-1998, Standard for Software Test Documentation. The Test Plan shall include the procedures for documenting the completion of each test phase, test scripts, test conditions, test cases, and test reports.
Response / Reference: Vol 1 – Appendices, Appendix J
Detailed Test Plans shall be created for the following:

- Unit Testing
- Functional Testing
- Integration Testing
- System Testing
- Security Testing
- Regression Testing
- Stress/Load Testing
- Performance Testing
- Acceptance/Usability Testing

The Bidder must submit a sample Test Plan with their proposal response as identified in Section VIII – Proposal and Bid Format.

Bidder Response Detail:
INITIAL FINAL PROPOSAL (CDCR-5225-113)

One of the objectives of the Team EDS approach to application development is reuse. Defining and applying a consistent approach to testing is a form of reuse that can reduce duplication of effort, improve productivity, and increase the speed with which the SOMS implementation is completed and delivered.

Team EDS complies with this requirement and confirms that we will incorporate the testing approach into a comprehensive Master Test Plan that complies with CDCR testing practices and exceeds IEEE Std. 829-1998, Standard for Software Test Documentation. We also confirm that the Test Plan will include the procedures for documenting the completion of each test phase and of test scripts, test conditions, test cases, and test reports. We also confirm that Team EDS will provide detailed Test Plans for:
• Unit testing
• Functional testing
• Integration testing
• System testing
• Security testing
• Regression testing
• Stress/load testing
• Performance testing
• Acceptance/usability testing

As part of this proposal, we have also included a sample Test Plan in Appendix J of our response.

The purpose of the Master Test Plan is to define the comprehensive testing strategy that is applied to the SOMS program as a whole and to each release undertaken by EDS. The primary objective of a Master Test Plan is to provide guidance and establish a program-level framework within which all testing activities for a specific project or release can be defined, planned, and executed.
This “umbrella” strategy focuses on defining and facilitating an efficient and cost-effective approach to all testing activities that supports:
• Achieving goals for the SOMS implementation
• Defining how an iterative-incremental approach to application development applies to testing activities
• Meeting agreed requirements
• Managing risks
• Informed decision-making

Before the Testing teams begin the SOMS testing effort, the Test team members are walked through the Test Plan, test strategy, and testing goals, so that they work in unison and create a better SOMS application. We establish guidelines and processes for the Test team for creating test data, for positive and negative tests, and for logging and closing a bug. These guidelines document and support the SOMS implementation project life cycle with information applicable specifically to the testing life cycle, and include the following:
• Testing roles and responsibilities
• The testing methodology
• The test levels that are performed
• The test coverage that each test level provides
• The deliverables associated with each of the test levels

The testing activities developed and implemented for each test level are described in detail, including relevant assumptions and dependencies. The test planning and deliverables (plans, cases, and data) for each SOMS release derive their direction from the Master Test Plan.

Implementing a common testing framework verifies that the appropriate levels of testing are performed, that testing takes place in the appropriate environments, that proper test data is generated, and that appropriate test cases are developed and executed. This approach results in a better product for the SOMS implementation, as defects are identified early in the development life cycle. The costs of development and delivery are also reduced, as rework during the later stages of the product development cycle is minimized.

In addition to the Enterprise Testing Method (ETM) described under M-43, EDS has developed a related SOA testing strategy template that can be tailored to meet the unique needs of an SOA development for SOMS. This template is used as the basis for developing the SOMS Master Test Plan.
It supports the application of efficient and effective testing practices across all SOA projects and provides comprehensive guidance on all aspects of the testing process by defining the following:
• The SOA testing organization, along with generic and specific roles and responsibilities for the appropriate project stakeholders
• A generic, standard testing methodology to be adapted for use on each SOA project
• Activities associated with test planning, preparation, and execution
• Activities associated with managing all aspects of testing
• Requirements for the testing environments and tools required to support the testing effort

This SOA testing strategy template also includes a comprehensive glossary of testing terms.

For each SOMS release, EDS develops a Release Test Plan that addresses the scope, testing requirements, test coverage, test levels to be performed, resourcing, and schedule. Detailed Test Plans are developed for each test level to be performed for the release. Table 6.5-4, Test Deliverables, summarizes the primary testing deliverables that EDS will produce for each SOMS release.

Table 6.5-4, Test Deliverables

Work Product: Test Plan
Purpose: The primary means by which each Testing team communicates what it plans to test, its environmental and resource needs, and the schedule for each testing level to be performed.
Testing Life-Cycle Phase: Planning. Inputs can include: documented and agreed business requirements; agreed change requests; the E2E testing strategy. Activities include defining: scope (in and out); objectives; project- and release-specific entry and exit criteria; tasks and deliverables; assumptions, constraints, and risks; resource requirements (source information, equipment, software, data, personnel, tools); team roles and responsibilities; a Training Plan for testers (if required); and high-level test scenarios.

Work Product: Test Scenarios
Purpose: Groupings of selected test cases that follow a logical sequence or common grouped business processes.
Testing Life-Cycle Phase: Design. Inputs can include: approved functional specifications and design documents; approved technical specifications; the Test Plan for the testing level. Activities include: reviewing and refining test scenarios; creating test cases; defining test data; defining expected results.

Work Product: Test Cases
Purpose: Define the conditions to be tested and their assigned priorities. Prerequisites, detailed test procedures, and expected test results must also be included.
Testing Life-Cycle Phase: Design (as above).

Work Product: Test Scripts
Purpose: Automated instances of selected test cases.
Testing Life-Cycle Phase: Test Execution. Inputs can include: test cases; test data; test results.

Work Product: Testing Progress Reports (ongoing)
Purpose: Regular reports on testing progress for each test level, based on the following metrics: number of test cases planned; number and percentage of test cases executed; number and percentage of test cases closed; number of defects by status (severity and priority).
Testing Life-Cycle Phase: Activities include: executing tests (cases and scripts); verifying test results; documenting actual results; identifying defects; re-executing tests.

Work Product: Test Summary Report (final)
Purpose: Summary of the final outcome of test execution for each test level: number of test cases planned; number and priority of open test cases; number, severity, and priority of unresolved defects; and a recommendation on the readiness of the system for the next testing level or project phase.
Testing Life-Cycle Phase: Closedown. Inputs consist of recorded test results. Activities include: assessing the outstanding status of testing (i.e., number of test cases planned, number and percentage of test cases executed, number and percentage of test cases closed, number of defects by status); preparing and delivering a recommendation.

6.5.C General Testing Requirements
VI.D.8.b. General Testing Requirements

Req. Num. M-45
Requirement: Testing and Development shall have their own environments, separate from Production. Testing or development shall not be performed in the production environment.
Response / Reference: Section 5.0

Bidder Response Detail:
Team EDS complies with this requirement and confirms that testing and development have their own environments, separate from production. Testing and development are not performed in the production environment. In proposal Section 5.0, Technical Requirements, we have described the proposed configurations for the various SOMS environments. That section describes in detail the hardware and software components proposed for each environment.
EDS recognizes that the environment has a large influence on the quality, lead time, and cost of the testing process. Elements of an effective test environment include the following:
• Environment Operational Processes – Documented policies and procedures for the setup and ongoing operations of each environment. The environment is managed with respect to factors like setup, availability, maintenance, version control, error handling, and authorizations. The saving and restoring of certain configurations and conditions can be arranged quickly and easily; for example, different copies of the database are available for the execution of different test cases and scenarios.
• Environment Scalability – The ability to modify the environment based on the need to reflect the current or future state of production.
• Metrics Collection – To provide continuous improvement throughout the environments, collection of specific metrics is required. These include the number of testing events in each environment, and the projected and actual time spent for each event.
• Environment Personnel – The environment is staffed, at the very least, with an environment manager and the appropriate resources to support the environment.
• Cost/Budget – A method is in place to determine the costs involved in the setup and ongoing operations of the required environment.
• Release Management – EDS recommends the Release Management Methodology (RMM) approach for successfully introducing new or altered components into existing IT environments, in order to develop these environments and enable clients to achieve their strategic business goals.

Each testing environment is configured like the final production environment, although the testing environments for system and system integration testing and for user acceptance testing can have smaller databases and use emulators instead of actual external interfaces. The SOMS Master Test Plan specifies the testing environments that are established for the SOMS project. The Test Plan developed for each SOMS release identifies the detailed requirements for each environment for each specified test level.
Furthermore, the Test Plan for each test level identifies:
• The differences between its testing environment and the live environment
• Any risks stemming from these differences
• The approach to mitigating these risks
• When each environment is needed and for how long

Once a testing environment is established, the Release Testing Manager must approve any change to the testing environment before that change is made. The agreement of all project stakeholders to the documented testing environment management process is needed.

The performance testing environment mirrors production and has all the components necessary to fully evaluate system performance accurately so that the necessary tuning can be undertaken. The following factors are considered when creating the performance testing environment:
• Specific project components required for performance testing
• Server capacity and configuration
• Network environment
• Tier load-balancing scheme
• Back-end database

Req. Num. M-46
Requirement: The Contractor shall use an automated testing tool for, at a minimum, stress/load and regression testing. The tool shall create the test scripts, perform the test execution, and generate the test reports.
Response / Reference: N/A

Bidder Response Detail:
Team EDS complies with this requirement and confirms that Team EDS will use an automated testing tool for, at a minimum, stress/load and regression testing. We confirm that we will use the tool to create the test scripts, perform the test execution, and generate the test reports. EDS test automation specialists will use HP QuickTest Professional and HP LoadRunner during performance testing.

Functional Test Automation
Test automation enables the creation, modification, and execution of functional, distributed functional, regression, and sanity tests for graphical user interface (GUI) applications. In addition, results of the tests, including defects generated, are recorded and available for analysis and reporting purposes.

Test automation is both a desirable and necessary component of an efficient and effective testing process. It is also software development; consequently, it requires careful planning and design, and the resulting automated test must itself be verified. Like all other forms of software development, it is time consuming; in fact, it can take from 3 to 10 times as long to automate a test as it takes to execute the same test manually. Finally, the time saved through test automation can easily be offset and even exceeded by test script maintenance.
In fact, the introduction of test automation requires a significant investment, including the cost of:
• Acquiring and maintaining licenses
• Creating a set of evaluation criteria for functions to be considered when using the automated test tool
• Examining the existing set of test cases and test scripts to see which ones are most applicable for test automation
• Examining the current testing process and determining where it needs to be adjusted for using automated test tools

This does not mean that test automation does not add value to the testing process. However, it does confirm the importance of selecting tests for automation that provide the best return on investment for the SOMS project.
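As a rough illustration of this return-on-investment reasoning, the break-even point for automating a single test can be estimated from the 3x-to-10x build-cost figure cited above. This is a minimal sketch; the function name and all sample numbers are hypothetical, not EDS project data:

```python
import math

def automation_break_even(manual_minutes, automation_factor, maintenance_minutes_per_run):
    """Estimate how many executions are needed before an automated test
    pays back its development cost.

    manual_minutes: time to execute the test by hand once
    automation_factor: automating takes this many times longer than one
                       manual run (the document cites 3 to 10)
    maintenance_minutes_per_run: average script upkeep attributed to each run
    """
    build_cost = manual_minutes * automation_factor
    saving_per_run = manual_minutes - maintenance_minutes_per_run
    if saving_per_run <= 0:
        return None  # maintenance consumes the saving; automation never pays back
    # Number of runs at which cumulative savings cover the build cost
    return math.ceil(build_cost / saving_per_run)

# A 30-minute manual test, 5x automation cost, 5 minutes of upkeep per run:
runs_needed = automation_break_even(30, 5, 5)
```

A test repeated fewer times than this break-even count, or one whose maintenance cost exceeds its manual execution cost, is a poor automation candidate, which is consistent with the selection guidance above.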
There are four attributes of a good test case, regardless of whether the testing is manual or automated:
• Effectiveness – Its potential for finding defects
• Efficiency – How much test coverage it provides; that is, how much it reduces the total number of test cases required
• Practicality – How easily and quickly it can be performed, analyzed, and debugged
• Maintainability – How much maintenance effort is required to modify the test case each time the system that it tests is changed

Test developers must strike a balance among these attributes to make sure that each test, whether manual or automated, uncovers a high proportion of defects and still avoids excessive cost.

When to automate a functional test
EDS considers tests that fall into the following categories to be ideal candidates for automation:
• Any test that is repeated often enough to offset the cost of developing and maintaining the resulting automated test script (for example, tests that have been identified for regression testing or smoke tests, and tests that are executed frequently as preconditions for other tests)
• Any test that measures the ability of a system to handle stress, load, or volume and continue to perform reliably over time
• Structural – especially application programming interface (API)-based – unit and integration tests

When to test manually
If the automation of a test results in excessive up-front or maintenance costs that cannot be offset, EDS recognizes that it should be done manually. Likewise, any test that requires human judgment to assess the validity of the result, or extensive, ongoing human intervention to keep the test running, should be done manually. Typically, no return on investment results from automating tests that have these attributes.
Manual testing is the appropriate choice for the following tests:
• Installation and setup
• Configuration and compatibility
• Error handling and recovery
• Usability

It is estimated that 60 to 80 percent of all test cases meet the criteria for automation. EDS automates all tests identified for smoke tests. For each SOMS release, EDS test automation specialists conduct a test automation feasibility analysis to determine which manual test cases meet the criteria for test automation to support regression testing.
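A feasibility analysis of this kind can be sketched as a simple filter over candidate test cases. The field names and the repetition threshold below are illustrative assumptions, not EDS's actual criteria; the selection rules themselves follow the automate/manual guidance above:

```python
def is_automation_candidate(test_case):
    """Apply the selection criteria discussed above to one test case.

    test_case is a dict with (hypothetical) fields:
      runs_per_release      - how often the test is repeated per release
      needs_human_judgment  - result validity requires a person to assess
      measures_load         - stress/load/volume or endurance test
      in_regression_suite   - selected for regression or smoke testing
    """
    # Tests requiring human judgment of the result stay manual
    if test_case["needs_human_judgment"]:
        return False
    # Load/stress/volume tests and regression/smoke tests are ideal candidates
    if test_case["measures_load"] or test_case["in_regression_suite"]:
        return True
    # Otherwise, automate only tests repeated often enough to offset the cost
    # (threshold of 5 runs per release is an arbitrary illustration)
    return test_case["runs_per_release"] >= 5

candidates = [
    {"id": "TC-001", "runs_per_release": 12, "needs_human_judgment": False,
     "measures_load": False, "in_regression_suite": True},
    {"id": "TC-002", "runs_per_release": 1, "needs_human_judgment": True,
     "measures_load": False, "in_regression_suite": False},
]
selected = [tc["id"] for tc in candidates if is_automation_candidate(tc)]
```

In practice the filter's output would feed the release's regression-automation backlog, with the break-even economics deciding borderline cases.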
EDS introduces functional test automation with Release 1B, selecting and automating functional tests from Release 1A that support regression testing of Release 1A functionality. The same approach is applied to subsequent releases, concluding with the development of automated test scripts for the final release. This managed approach to functional test automation results in a comprehensive regression test suite for delivery to CDCR. When test automation is deemed appropriate and necessary for system and system integration testing, user acceptance testing, or consolidated integration testing, EDS test automation specialists will use HP QuickTest Professional (QuickTest Pro).

Performance Testing
For every SOMS release, EDS will use HP LoadRunner during performance testing to:
• Facilitate the automation of performance testing (for example, load, volume, stress, and endurance testing)
• Predict and measure system behavior and performance

LoadRunner can exercise the entire enterprise infrastructure by emulating thousands of users, and it employs performance monitors to identify and isolate problems. By using LoadRunner for performance testing, testers are able to minimize testing cycles, optimize performance, and accelerate deployment. LoadRunner uses a suite of integrated performance monitors to quickly isolate system bottlenecks with minimal impact on the system. The suite consists of monitors for the network, application servers, Web servers, and database servers. These monitors are designed to accurately measure the performance of every tier, server, and component of the system during the load test. By correlating this performance data with end-user loads and response times, it is possible to determine the source of bottlenecks. In addition, all system performance data can be collected and managed from the LoadRunner Controller.
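The core idea behind this kind of load testing, emulating many concurrent virtual users and correlating load with response times, can be sketched generically. This is not LoadRunner or its API; it is a minimal thread-based illustration in which the simulated transaction is a stand-in for a real SOMS interface call:

```python
import concurrent.futures
import statistics
import time

def call_system_under_test():
    """Stand-in for one emulated user transaction (hypothetical; a real
    load tool would drive actual application interfaces instead)."""
    start = time.perf_counter()
    sum(i * i for i in range(10_000))  # simulated server-side work
    return time.perf_counter() - start

def run_load_test(virtual_users, iterations):
    """Emulate concurrent virtual users and collect per-transaction
    response times, mirroring the measure-and-correlate approach above."""
    timings = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [pool.submit(call_system_under_test)
                   for _ in range(virtual_users * iterations)]
        for f in concurrent.futures.as_completed(futures):
            timings.append(f.result())
    timings.sort()
    return {
        "transactions": len(timings),
        "median_s": statistics.median(timings),
        "p95_s": timings[int(0.95 * len(timings)) - 1],  # 95th-percentile response time
    }

report = run_load_test(virtual_users=4, iterations=5)
```

Comparing the median against the 95th percentile under increasing virtual-user counts is one simple way to spot the onset of a bottleneck.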
Req. Num. M-47
Requirement: The Contractor shall repeat the test lifecycle when a failure occurs at any stage of testing (e.g., a failure in Acceptance Testing that necessitates a code change will require the component to go back through Unit Testing, Integration Testing, and so forth).
Response / Reference: N/A

Bidder Response Detail:
Team EDS complies with this requirement and confirms that Team EDS will repeat the test life cycle when a failure occurs at any stage of testing. The standard EDS testing life cycle includes the requirement that each defect be analyzed for its point of origin in the development life cycle. For example, if a defect discovered in acceptance testing originated in design, the defect is resolved, the design is re-verified, the code is updated, and all subsequent, related tests are repeated before the system is returned for verification by acceptance testing.
EDS will use the requirements traceability matrix developed for each SOMS release to verify that all required retests are completed for each defect.

Req. Num. M-48
Requirement: The Contractor shall perform Full Lifecycle Testing throughout the duration of the project. This includes Unit, Integration/String, System, Operational (Stress/Load, Performance), and Regression Testing.
Response / Reference: N/A

Bidder Response Detail:
Team EDS complies with this requirement and confirms that Team EDS will perform full life-cycle testing throughout the duration of the project, which includes unit, integration/string, system, operational (stress/load, performance), and regression testing.

Unlike more traditional testing practices, which tend to engage the software development life cycle only when detailed design is complete and to disengage after an application has been deployed into production, our testing process for each project release begins immediately after release initiation and also supports post-production maintenance of the application. As a result, the Testing teams can plan and prepare for their testing efforts well before the software is delivered to them. Specifically, the teams participate in the joint application development (JAD) workshops and verify that JAD outputs map to documented and agreed requirements.

The testing methodology is consistent and aligns with the testing practices specified by the following:
• EDS Global Applications Delivery Quality Management System (GAD QMS)
• Enterprise Testing Method (ETM), which is the testing component of GAD QMS

EDS applies its ETM to verify the delivery of efficient and effective testing for the SOMS project. ETM defines testing activities across deliverable components throughout the entire development life cycle and includes all test levels to be performed for each release.
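The traceability-driven retest check described under M-47, using the requirements traceability matrix to confirm that every test case affected by a defect fix has been re-executed, can be sketched as follows. The data layout and all requirement/test-case identifiers are hypothetical:

```python
def find_missing_retests(traceability, defect_requirements, completed_retests):
    """Given a requirements traceability matrix (requirement -> test cases),
    the requirements touched by a defect fix, and the set of retests already
    completed, return the test cases that must still be re-executed before
    the system is returned for verification."""
    required = set()
    for req in defect_requirements:
        required.update(traceability.get(req, []))
    return sorted(required - completed_retests)

# Hypothetical matrix: each requirement maps to its unit/integration/system tests
rtm = {
    "REQ-101": ["UT-04", "IT-11", "ST-02"],
    "REQ-102": ["UT-07", "ST-03"],
}
# A fix touched REQ-101; only the unit-level retest has run so far
missing = find_missing_retests(rtm, ["REQ-101"], {"UT-04"})
```

An empty result is the signal that all required retests for the defect are complete and the component may proceed to the next test level.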
The Testing V Model, typically associated with waterfall development, has been tailored to support a manageable balance between consistency and flexibility – both are required by an incremental approach to software development. The Testing V Model offers another important advantage by supporting early testing involvement in the project and release life cycle.
Figure 6.5-3, EDS’ Testing V Model

The left-hand side of Figure 6.5-3, EDS’ Testing V Model, represents the capturing of the client’s needs and expectations and the definition of the requirements of the system. The high-level technical and architectural design process identifies the hardware and software components, and their integration, required to deliver the solution that meets the requirements. The detailed design phase identifies the lowest-level solution components and subcomponents, defines their internal structure, and specifies them to a level that supports their construction. The right-hand side of the figure represents the test execution of the individual subcomponents and the progressive integration of the components into a delivered solution, which is subject to acceptance by the client.

An important feature of the Testing V Model is the review activity shown on the left-hand side, which focuses on building quality into a deliverable from the development process by examining it for compliance with standards and requirements, and by ensuring that it provides a sound basis for the development of dependent, downstream deliverables. This “total quality” approach is the key to EDS’ philosophy and methodology.

The testing life cycle for each test level of a SOMS release will consist of the following sequence of major activities:
• Developing a Test Plan
• Determining the required test scenarios and test cases for each test level
• Developing test cases for each test level
• Developing test data
• Executing the tests and reporting on progress
• Completing closedown activities

Types of Testing
EDS uses a full-scale quality control strategy to verify a high-quality implementation of the system. EDS’ testing strategy includes the following comprehensive testing classifications:
• Technical testing
• SOA service testing
• System testing
• Business testing
• Installation testing

Technical Testing
Technical testing includes:
• Unit testing
• Code reviews
• Peer review
• Component integration testing (CIT)
• Smoke tests

Unit Testing
Unit testing is the testing of individual components (units) of the software. The objective of unit testing is to test the functionality of the code, which implements the functional requirements identified in the business scenarios of the high-level design (HLD) documents. EDS follows a test-driven development methodology – tests are written before development starts. Using the JUnit tool, unit tests are written by each developer for each documented RFP requirement before development begins. These unit test cases are reviewed and approved by the Development Team Lead. Unit tests are conducted alongside development and are performed by each developer on his or her own code. Unit testing follows a self-correcting mechanism in that, if there is an error, it must be corrected by the developer in order to commit the code. Unless all bugs are fixed, the code cannot be completed and stored in a developed status.
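The test-first workflow described above, with unit tests written per documented requirement before the code they exercise, can be sketched as follows. The project itself would use JUnit; this illustration uses Python's unittest to show the same pattern, and the function under test is entirely hypothetical, not an actual SOMS function:

```python
import unittest

def days_remaining(sentence_days, days_served, credit_days):
    """Hypothetical unit under test: days left to serve after time
    served and earned credits, never below zero."""
    return max(sentence_days - days_served - credit_days, 0)

class TestDaysRemaining(unittest.TestCase):
    """Written before the implementation, one case per documented
    requirement behavior, and reviewed by the Development Team Lead."""

    def test_typical_case(self):
        self.assertEqual(days_remaining(365, 100, 30), 235)

    def test_never_negative(self):
        self.assertEqual(days_remaining(365, 400, 30), 0)

# The self-correcting mechanism: the code cannot be committed until
# every test in the suite passes.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestDaysRemaining)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In the JUnit setting the same gate is typically wired into the build, so a red test blocks the commit rather than merely reporting a failure.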
All components are subject to unit testing. Unit testing is an internal EDS activity. Unit test results are stored in Borland StarTeam.

Code Review
Code review is the peer review activity of software development. Code is reviewed for adherence to coding standards, consistency, and accuracy by either fellow team members or the team lead. It is also checked for adherence to the development specifications. During development, code is analyzed and refactored through several processes. This allows the code to be reformatted and streamlined, and creates the following benefits in the development life cycle:
• Reduces defects within code
• Increases reusability
• Increases application performance
• Shortens debugging time
• Eases task distribution
• Enforces best practice

Peer Review
Peer reviews are conducted by the developers on the screens developed by their fellow team members, and provide a general check of the workability of the screen:
• Peer reviews are not accompanied by any formal test plans or scripts.
• Peer review comments are formally added in Borland StarTeam for the function or screen being tested.
• This is a formal step in the development quality control process, and development for a function is not complete until this stage is successfully passed and recorded in Borland StarTeam.

Component Integration Testing
Component integration testing (CIT) is performed on each individual function and interface. This test is used to verify that the technical objects that service a screen are working together properly to make the screen functional:
• CIT is performed by an independent team.
• The objective is to test the function for error-free operation; it does not test functionality or business process flow.
• It is conducted using test scripts and is executed using Test Director.
All bugs that are found during CIT are recorded manually in Borland StarTeam by the CIT team along with a status change and assignment to the system analysts for review and correction.
Smoke Test
The smoke test is an application readiness check that is performed after the completion of a binary build and its release for service testing. Smoke testing is conducted by the CIT team. Key application activities are quickly checked to see whether the release is healthy. If the application fails the smoke test, it is rejected and the environment is rolled back to the previous release. Only when the release passes the smoke test can it be used for service testing and any subsequent test levels.

SOA Service Testing
SOA testing requires a fundamental change in testing strategy. With SOA, testing is required in isolation at the service level; reliance on testing through the application’s user interface may lead to erroneous conclusions. SOA testing begins earlier in the development life cycle, as the cost of repairing defects rises rapidly as time elapses. Functional testing is not usually sufficient; testing must occur along several dimensions, such as security, interoperability, and performance.

Testing tasks are assigned differently in an SOA environment:
• Testing of services in isolation is best handled by Service Delivery teams, who have the expertise to effectively perform this type of testing.
• The introduction of a round of isolation testing by an independent group (internal or outsourced) provides an impartial second look that identifies problems that may have gone undetected.
• Existing Testing teams should continue to perform application-level testing.

SOA testing requires enhanced technical skills and deeper business acumen. Exclusive use of GUI testing tools is not possible, as testers must be involved in the execution of test harnesses in order to test services in isolation. A deeper understanding of the business allows the tester to determine how well the services embody the business process.
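The smoke-test gate described earlier, accept the build only if every key application activity passes, otherwise roll back to the previous release, can be sketched as follows. The check names, build fields, and version strings are hypothetical:

```python
def smoke_test(build, checks):
    """Run quick readiness checks against a new build; return the list of
    failed check names (an empty list means the build is healthy)."""
    return [name for name, check in checks.items() if not check(build)]

def release_gate(new_build, checks, previous_release):
    """Accept the build for service testing, or roll the environment back
    to the previous release if any smoke check fails."""
    failures = smoke_test(new_build, checks)
    if failures:
        return previous_release, failures  # reject build, keep prior release
    return new_build, []                   # build may proceed to service testing

# Illustrative checks over hypothetical build metadata
checks = {
    "login_screen_loads": lambda b: b.get("ui_ok", False),
    "database_reachable": lambda b: b.get("db_ok", False),
}
previous = {"version": "1B.1", "ui_ok": True, "db_ok": True}
good_build = {"version": "1B.2", "ui_ok": True, "db_ok": True}
bad_build = {"version": "1B.3", "ui_ok": True, "db_ok": False}
```

The gate's output, which release is active plus the list of failed checks, is exactly what the CIT team would record when rejecting an unhealthy build.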
Service testing is a distinct test level that follows component integration testing and precedes system testing (where the service is tested in conjunction with other services and applications, to verify that they jointly deliver the required functionality). Service testing must involve collaboration among business analysts, developers, and testers. The objective of service testing is to validate that the built service delivers the required functionality and exhibits the expected nonfunctional performance and secure code characteristics.
When the components have been developed to encompass a self-contained business function or service, and the components have completed component integration testing, the application can be released for service testing. This verifies that all business functions within a specific service deliver the required service. The service testing approach includes the following testing activities:
• Verify That Integration of Components Is Complete – It is expected that developers will complete component integration testing prior to hand-over to (independent) system testing. However, there may be occasions where this is not possible. For example, the component might include deliverables from a third party, or code that requires an operating environment that cannot be replicated in the development arena. In these circumstances, and by previous agreement with the Service Testing team, this final stage of component integration testing may be performed, as an initial activity, by the testing team that performs service testing, on receipt of the hand-over of the component.
• Verify Service Integration – Testing of combinations of components at each layer (Presentation, Process and Service, Integration, and Data/Applications). This testing uses a white-box approach, but is driven by a view of the business functionality to be delivered. This activity requires a collaborative effort involving independent testers, application and middleware developers, and database administrators.
• Verify Functional Correctness – Verification against the service requirement and specifications, and verification against business specifications. This activity is as much about testing that additional functionality has not been delivered as it is about testing of delivered functionality.
To be certified for reuse, services must not include business processing or logic specific to only one system. To exercise all possible tests, it is necessary to construct stubs and drivers to inject tests into the service and to check outputs for correctness.
• Verify Policy, Standards, and Registry – SOA registries and repositories help manage metadata related to SOA artifacts (for example, services, policies, processes, and profiles) and include the creation and documentation of the relationships (that is, configurations and dependencies) between various metadata and artifacts. SOA policy management provides the technology to create, discover, reference, and sometimes enforce policies related to SOA artifacts, such as access control and performance and service levels. As these form part of the service deliverable, they require testing. Any additional SOA rules or conventions that EDS adopts for its offerings (A3 architecture direction/constraints), over and above the industry-standard services (often XML) and protocols that allow the service to be provided, also require testing.
• Verify External Interfaces – Testing of the interfaces between the service and the service consumers to make sure that the external interfaces satisfy their defined requirements, incoming interface data is correctly and thoroughly vetted by the service, only data that satisfies the rules of the interface is input to the service, and data output from the service is in accord with requirements.
• Verify Security – Testing to verify that functionality and data are delivered only as appropriate to the type of access and usage.
INITIAL FINAL PROPOSAL (CDCR-5225-113)

Verify Quality of Service – Performance, load, stress, scalability, reliability (or long-running: how long can the service run without failing), and service failure resilience testing.

Verify Service Level Agreement – Testing against the service level specifications in the business model.

Frequency
At least one cycle of service testing is performed for each new release of the software. Separate builds of the same release result in the remaining non-closed test cases being executed.

System and System Integration Testing
System and system integration testing is the most important testing phase; it verifies the SOMS functionality signed off by CDCR in the HLD documents. In system and system integration testing, all requirements are tested in the context of the overall system requirements. The HLD documents specify the flow of all business scenarios. The HLD is used to create requirements in Borland StarTeam, and test cases are then written to test each business scenario. All requirements in Borland StarTeam are mapped to test cases in HP Quality Center, and this mapping can be shared with CDCR to verify that all requirements are covered by test cases.

Specific types of testing conducted during system and system integration testing include the following:
Smoke testing
Functional testing
Interface testing
Regression testing

The requirements for SOMS system and system integration testing include the following:
The in-scope functions list is based on the HLD documents (prepared on the basis of the RFP requirements) and the change requests approved by the CDCR team.
HLD documents prepared by the project teams are considered the basis for testing activities. Any changes to the detailed functional specifications, once baselined, go through a change management process to assess the impact on the testing effort and schedule.
All interface connections are tested using live connections in the user acceptance testing environment.
Stubs are used when the interface connections are unavailable. EDS will work with the CDCR SOMS team to get appropriate access to the interfaces. All third-party interfaces will be available for testing through a testing environment that can be connected from the DTS data center.
The infrastructure hosted in the DTS environment for system testing (FTP-based interfaces) is ready and configured correctly.
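The requirement-to-test-case mapping described above (StarTeam requirements mapped to Quality Center test cases) can be checked mechanically for gaps. A minimal sketch, using invented requirement and test-case IDs; the actual mapping lives in the tools themselves.

```python
def coverage_report(requirements, test_case_map):
    """Split requirement IDs into covered (at least one mapped test case)
    and uncovered, so coverage gaps can be reported."""
    covered = sorted(r for r in requirements if test_case_map.get(r))
    uncovered = sorted(r for r in requirements if not test_case_map.get(r))
    return covered, uncovered


# Hypothetical requirement and test-case IDs, for illustration only.
reqs = {"HLD-001", "HLD-002", "HLD-003"}
mapping = {"HLD-001": ["TC-10", "TC-11"], "HLD-002": []}
covered, uncovered = coverage_report(reqs, mapping)
# covered == ["HLD-001"]; uncovered == ["HLD-002", "HLD-003"]
```

A report like this is what allows the mapping to be shared with CDCR as evidence that every requirement is covered by at least one test case.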
All HLD documents and business rules documents are signed off before testing begins. EDS will also obtain concurrence on all HLD changes before completing system and system integration testing (as some of the changes may come during system and system integration testing).

System Testing
System testing is performed when the software is functioning as a whole, or when well-defined subsets of its behavior are implemented and the software under test has successfully progressed through the formal unit and component integration test levels. System testing validates the functional and structural stability of the application or system, as well as nonfunctional requirements such as reliability, security, and interoperability. At this test level, testing is concerned with the behavior of the whole system, not with the workings of individual components. Tests may be based on risks, requirements specifications, business processes, use cases or other high-level descriptions of system behavior, and interactions with the operating system or system resources. Testing investigates both functional and nonfunctional requirements.

Functional test types addressed by system testing include:
System transactions
System processes
System functionality
Business function
Integrated functionality
Application security
Accessibility

Nonfunctional test types addressed by system testing include:
Conversion
Data integrity
Installability
Legal and regulatory
Compatibility
Portability
Privacy
Application security, ensuring that users are restricted to specific functions or are limited to the data made available to them
System security, ensuring that only those users granted access to the system are capable of accessing the applications through the appropriate gateways
Usability
Infrastructure security

Each SOMS release undergoes rigorous system testing. System tests are prepared and performed based on the approved requirements documents, the application architecture, and the application design as specified by the Development team. Tests are created to exercise the specific business functionality selected and approved by CDCR. This verifies that the functions perform as specified in the requirements documents and that the system as designed supports the client's business.

System test cases are based on two types of scenarios: functions and roles. Functional testing validates all the functionality associated with functional requirements and data flows through use of the application. Role-based testing verifies that access to the application, and to menus, submenus, buttons, and icons, is complete and correct for each defined role, and that the process flow is complete and correct for each role. Authorization checks can be performed by entering an invalid logon ID or unauthorized access codes to determine whether screens or windows are displayed when they should not be. Test cases verify that specific access codes take the user to where (and only to where) the access rules permit.

Once sufficient components have been developed to constitute a self-contained subsystem, and the subsystem has completed component integration testing, the application can be released to system testing for black box testing that verifies that all of its business functions interact correctly with all other business functions. During this type of testing, the database is loaded with sample production data. This provides a foundation for static information as well as initial data for batch and month-end processes.
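The role-based and authorization checks described above reduce to comparing each role's actual access against a permission matrix, including negative cases. A minimal sketch; the roles and functions below are invented for illustration, since the real SOMS roles are defined in the HLD documents.

```python
# Hypothetical role/permission matrix, for illustration only.
ROLE_PERMISSIONS = {
    "records_clerk": {"view_record", "update_record"},
    "auditor": {"view_record"},
}


def authorize(role, action):
    """Grant an action only if the role is defined and permits it."""
    return action in ROLE_PERMISSIONS.get(role, set())


def role_based_checks():
    """Positive and negative authorization checks per role."""
    cases = [
        ("records_clerk", "update_record", True),  # permitted action
        ("auditor", "update_record", False),       # role lacks the action
        ("intruder", "view_record", False),        # undefined role
    ]
    return [(role, action, authorize(role, action) == expected)
            for role, action, expected in cases]
```

Note that two of the three cases are negative checks: verifying that access is denied is as important as verifying that it is granted.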
System Integration Testing
The purpose of system integration testing is to confirm that the necessary setup and communication exist with respect to interfaces and reports so that functional testing can be performed. System integration testing addresses the need for the product under test to interface with other applications or systems without interfering with their operations.

System integration testing is performed once the interface code is available from the Development team. Early execution of these tests is recommended because this may be some of the most complex testing: it normally involves the communication and security layers and requires coordination with outside organizations. It can be performed at the same time as system testing or late in component integration testing.

Initially, this testing involves the use of drivers, since all testing components, particularly data, may not be available in the current environment. This approach also gives the tester direct control over the transactions being sent to the external entity, eliminating the potential for functionality errors in client or GUI code modules. Ideally, the tests are performed without the use of stubs and drivers. Once the front-end code has been tested, these tests are re-executed.

When testing with an external organization, we provide a copy of all test cases for review and to permit coordination of test execution and synchronization of data. We identify each interface and provide details of the type of tests that are performed and any tools that may
be required. The following is an example of the types of information that could be provided for an interface to be tested:
Named applications or systems with which to interface
Messaging format, medium, and content
Compatibility with products from different vendors
Conversion requirements for hardware, applications, or data
Messaging and transmission protocols
Process timing and sequencing

Regression Testing
During the development of system and system integration test cases, candidate test cases for use in regression testing are also identified. Typically, the test cases that exercise critical components or functionality of the system are selected as regression test cases, and these are the primary candidates for automation. The resulting automated test scripts can then be used in the testing of future releases of the system, or re-executed as part of the testing of the current project or release as changes to the system are made.

Test Objective: To validate that the identified business scenarios work as desired.
Technique: Execute each pre-selected HLD, HLD flow, or function, using valid data, to verify that the expected results are obtained.
Completion Criteria: All planned tests have been executed, and all identified defects have been addressed.
Special Considerations: None.

Performance Testing
Performance testing is designed to verify that the developed programs and system work as required at expected usage levels. The objective is to verify that the product is structurally sound and will function correctly at peak operation. Performance testing determines that the technology has been used properly and that, when all the component parts are assembled, they function as a cohesive unit that meets response time requirements.
These techniques are not designed to validate that the application system is functionally correct, but rather that it is structurally sound and reliable. Performance testing is conducted to evaluate the compliance of a program component or system with specified performance requirements or service levels; these may include subsets related to stress, volume, reliability, and load. Performance testing requires the use of sophisticated tools that generate high levels of use, monitor system throughput, and provide reports on the results. The resulting consistent and repeatable tests identify any bottlenecks in resource utilization, transaction response times, and overall system performance.

The following are high-level goals for performance testing:
Validate that documented SLAs and requirements regarding the performance of the application and infrastructure have been satisfied
Verify that the system performs as required under expected usage levels
Identify points of system degradation or bottlenecks
Identify system capacity and limitations of specific components
Identify causes of poor performance of business functions
Reduce implementation risk

EDS application developers and architects incorporate performance testing into their existing development activities, regardless of language type, to develop code that meets performance requirements. Developers conduct performance testing at the unit and component integration test levels to make sure that the application performance requirements have been met.

At or near the completion of system testing, performance testing of the application and infrastructure is required. It tests the performance of the application and infrastructure in a production-like environment to verify that the system meets or exceeds the SLA requirements for performance response time.

Performance testing uses the following high-level approach:
Identify business and user processes to be used in the development of testing scenarios – The workload to be generated for the performance tests consists of three components: the number of users, the critical business functions that must be executed, and the creation of a workload profile. The latter is calculated as the estimated number of users times the estimated number of transactions (that is, the sum of the information defined in the user activity profiles, which are created on the basis of the workload anticipated in the current production system).
Develop and automate testing scenarios that align with and simulate user workload
Execute tests and analyze the results to identify and resolve bottlenecks by completing:
o Consistent load test runs simulating the average workload
o Consistent load test runs simulating the peak workload
o Stress test runs at double the expected peak workload
o An endurance test run simulating the average workload
o A formal report of test results and appropriate recommendations

The number of executions of the performance test is as follows:
Two consistent load test runs simulating the average workload
Two consistent load test runs simulating the peak workload
One stress test run at double the expected peak workload
One twelve-hour endurance test run simulating the average workload
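The workload-profile calculation in the approach above (estimated users times estimated transactions, summed across the user activity profiles, with a stress run at double the expected peak) can be illustrated with a short worked example. The figures below are invented for illustration, not SOMS estimates.

```python
def workload_profile(user_activity_profiles):
    """Sum of (users x transactions per user) across activity profiles,
    per the workload formula in the approach above."""
    return sum(users * transactions
               for users, transactions in user_activity_profiles)


# Invented figures, for illustration only: (concurrent users,
# transactions per user per hour) for two hypothetical activity profiles.
average = workload_profile([(200, 12), (50, 30)])  # 2,400 + 1,500 = 3,900 tx/hr
peak = workload_profile([(400, 15), (80, 40)])     # 6,000 + 3,200 = 9,200 tx/hr
stress = 2 * peak                                  # stress run at double the peak
```

The average and peak figures drive the consistent load and endurance runs, while the doubled peak figure sets the target for the stress run.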