RCG INFORMATION TECHNOLOGY
delivery excellence®

Software Development Best Practices

White Paper

© 2005 RCG Information Technology, Inc.
Published in DM Review Online in May 2005.

Software Development Best Practices
Robert Wegener, National Practice Director, QA & Software Testing

Abstract

This paper focuses on software development best practices. Rather than offering a check-off list, it examines how an organization can improve its testing practices by integrating testing into the development life-cycle, and it provides guidelines for creating test strategies and test plans.

Testing is an ongoing process and has to be integrated into the development methodology being used. Test strategies and plans should be designed in conjunction with each feature/function up front, so that any specific testing needs can be identified and included in the development of the component and the testing environment.

The first step in improving the quality of the software testing function is to perform an assessment of the development and testing organizations. The assessment should start with a comparison of the organization's practices with those developed by the Software Engineering Institute (SEI), known as the Capability Maturity Model® (CMM®). This model, depicted below, is very similar to ISO 9001 but extends it by adding a framework for continuous process improvement.

Figure 1: SEI Capability Maturity Model (CMMI Five Levels of Maturity)
Level 1 depicts an organization that takes an ad-hoc approach and has very few repeatable processes; it relies on individual success rather than team success. Level 2 incorporates basic management techniques that are established, defined and documented. Level 3 describes an organization that has its own software process documented and integrated into the development life-cycle. Level 4 adds a layer of monitoring and control over the development life-cycle through data collection and analysis. At Level 5, all processes are constantly being improved through feedback and the introduction of innovative processes.

Here is an abbreviated roadmap to get to each level:

Level 1: Initialize
• Identify the current state
• Identify current resources
• Assess the current state of testing capability

Level 2: Repeatable
• Institutionalize basic testing techniques and methods
• Initiate a test planning process
• Develop testing and debugging goals

Level 3: Defined/Integration
• Control and monitor the testing process
• Integrate testing into the software life-cycle
• Establish a technical training program
• Establish a software testing organization

Level 4: Management and Measurement
• Develop a software quality evaluation
• Establish a test measurement program
• Establish an organization-wide review program

Level 5: Optimization/Defect Prevention and Quality Control
• Optimize the test process
• Implement quality control measures
• Apply process data for defect prevention

Before buying testing tools, it is desirable to get the organization to at least Level 3. It is more important to have robust artifacts and well-trained staff than to have expensive tools on the shelf. The primary artifacts necessary for any testing organization include the following (a minimal sketch of one such artifact appears at the end of this section):
• Test Strategy
• Test Costing
• Resource Planning
• Test Plans
• Test Cases
• Test Scripts
• Test Management artifacts to track defects and add traceability

After the artifacts are in place and the staff trained, the next step is to focus on the who, what, when and how much of testing. Testing is defined differently depending on who the expert is. Regardless, the key takeaway is to provide a stable working system that recognizes failures and has a defined path for remediation and recovery. Complex systems must be designed to handle failures, and good testing practices focus on how the systems display, log and recover from errors.
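The paper treats these artifacts generically; purely as an illustration, the sketch below shows what a minimal Test Case artifact might capture if kept as structured data rather than a free-form document. All field names, IDs and values are assumptions made for the example, not something the paper prescribes.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestStep:
    action: str            # what the tester does
    expected_result: str   # what the system should do in response

@dataclass
class TestCase:
    case_id: str                     # illustrative naming scheme only
    requirement_ids: List[str]       # requirements this case traces back to
    description: str
    preconditions: List[str] = field(default_factory=list)
    steps: List[TestStep] = field(default_factory=list)
    status: str = "not run"          # not run / passed / failed

# Example artifact (hypothetical requirement and case IDs)
login_case = TestCase(
    case_id="TC-LOGIN-001",
    requirement_ids=["REQ-SEC-004"],
    description="Reject login after three failed password attempts",
    preconditions=["A valid user account exists"],
    steps=[
        TestStep("Enter a wrong password three times", "Account is locked"),
        TestStep("Enter the correct password", "Login is still refused"),
    ],
)
print(login_case.case_id, "traces to", login_case.requirement_ids)
```

Keeping test cases in a structured form like this makes the traceability and defect-tracking artifacts mentioned above easier to generate automatically.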
What Should Be Tested

The basic V-Model shows what should be tested and the direct correlation between system development and the validation required at each stage. It also gives an idea of who should be responsible for the testing in each development phase.

Figure 2: Basic V-Model

Starting from the bottom of the V-Model with Construction, it should be apparent that the developers will test the components. Developers are responsible for their individual contributions to the application and must ensure that the pieces they created meet the specification, adhere to coding standards and interface properly with the rest of the system.

Moving up the right side of the V-Model, the next testing phase covers stringing all the components together into a working system. Many organizations call this a String or Assembly test, and it is performed by the developers; the V-Model tends to group this into Integration Testing. In the next phase, System Testing, an independent group of testers focuses on validating the system as a whole by exercising the architecture through the functional and system requirements. The final test, which is performed by the end users of the system, validates that the requirements have been met.

It is safe to say that, aside from the granularity of unit testing and white-box testing, system testing should provide 100% coverage of all functionality, including error handling and recoverability. A traceability matrix that shows each function and its related test cases is a must to ensure maximum coverage; a rough sketch of one follows below. The more functionality validated, the better. Test functionality should be grouped by categories such as security, interface logic, business rules, usability, error handling, transaction handling, recovery and session handling.
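As a rough sketch of the traceability-matrix idea, the snippet below maps a few of the functional categories named above to test cases and flags any category with no coverage. The category names come from the text; the test case IDs and the coverage check are illustrative assumptions.

```python
# Hypothetical traceability matrix: functional area -> test case IDs.
traceability = {
    "security":         ["TC-SEC-001", "TC-SEC-002"],
    "business rules":   ["TC-BUS-010"],
    "error handling":   ["TC-ERR-001", "TC-ERR-002", "TC-ERR-003"],
    "recovery":         [],            # gap: no test cases trace here yet
    "session handling": ["TC-SES-001"],
}

covered = {area for area, cases in traceability.items() if cases}
gaps = sorted(set(traceability) - covered)

print(f"Coverage: {len(covered)}/{len(traceability)} functional areas")
for area in gaps:
    print(f"  WARNING: no test cases trace to '{area}'")
```

Even a simple check like this makes coverage gaps visible before system testing begins.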
Strategies and Plans

A solid testing strategy provides the framework necessary to implement the testing methodology. A separate strategy should be developed for each system being built, taking into account the development methodology being used and the specific application architecture.

The heart of any testing strategy is the Master Testing Strategy document. It aggregates all the information from the Requirements, System Design and Acceptance Criteria into a detailed plan for testing. A detailed Master Strategy should cover the following:

Project Scope
Restates the business objective of the application and defines the scope of the testing. The statement should incorporate a list of activities that will be in scope or out of scope. A sample list would include:
• List of software to be tested
• Software configurations to be tested
• Documentation to be validated
• Hardware to be tested

Test Objectives
The system under test should be measured by its compliance with the Requirements and the User Acceptance Criteria. Each Requirement and Acceptance Criterion must be mapped to specific Test Plans that validate and measure the expected results for each test being performed. The objectives should be listed in order of importance and weighted by risk; one illustrative way to compute such a weighting is sketched below.

Features and Functions to Be Tested
Every feature and function must be listed for test inclusion or exclusion, along with a description of the exceptions. Some features may not be testable due to a lack of hardware, lack of control, etc. The list should be grouped by functional area to add clarity. The following is a basic list of functional areas:
• Backup and Recovery
• Workflow
• Interface Design
• Installation
• Procedures (User, Operational, Installation)
• Requirements and Design
• Messaging
• Notifications
• Error Handling
• System exceptions and third-party application faults

Testing Approach
The testing approach provides the detail necessary to describe the levels and types of testing. More specific test types include Functionality, Performance, Backup and Recovery, Security, Environmental, Conversion, Usability, Installation and Regression testing. The specific testing methodology should be described, and the entry/exit criteria for each phase noted in a matrix by phase. A project plan that lists the resources and schedule for each testing cycle should also be created, mapping the specific testing tasks to the overall development project plan.

Testing Process and Procedures
The order of test execution and the steps necessary to perform each type of test should be described in sufficient detail to provide clear input into the creation of Test Plans and Test Cases. Procedures should include how test data is created, managed and loaded. Test cycles should be planned and scheduled based on system availability and deliverable dates from development. All application and environmental dependencies should be identified, along with the procedures necessary to gain access to all the dependent systems.
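One illustrative way to make "listed in order of importance and weighted by risk" concrete is sketched below. The impact-times-likelihood scoring, the scales and all requirement names are assumptions made for the example; real projects may weight risk quite differently.

```python
from dataclasses import dataclass

@dataclass
class TestObjective:
    requirement_id: str
    description: str
    business_impact: int      # assumed scale: 1 (low) to 5 (high)
    failure_likelihood: int   # assumed scale: 1 (low) to 5 (high)

    @property
    def risk_weight(self) -> int:
        # Simple impact x likelihood score; other schemes are possible.
        return self.business_impact * self.failure_likelihood

objectives = [
    TestObjective("REQ-PAY-001", "Settle payments within 2 seconds", 5, 3),
    TestObjective("REQ-UI-014",  "Remember user display preferences", 2, 2),
    TestObjective("REQ-SEC-004", "Lock account after failed logins",  5, 4),
]

# Order objectives by descending risk so the highest-risk requirements
# are mapped to test plans and executed first.
for obj in sorted(objectives, key=lambda o: o.risk_weight, reverse=True):
    print(f"{obj.requirement_id}: risk={obj.risk_weight:2d}  {obj.description}")
```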
Test Compliance
Every level of testing must have a defined set of entry/exit criteria used to validate that all prerequisites for a valid test have been met. All mainstream software testing methodologies provide an extensive list of entry/exit criteria and checklists. In addition to the standard list, items should be added based on specific testing needs; common additions are environmental availability, data availability and validated code that is ready to be tested. Each level of testing should define specific pass/fail acceptance criteria to ensure that all quality gates have been validated and that the Test Plan focuses on developing tests that validate the specific criteria defined by the User Acceptance Plan. A minimal sketch of such a gate check appears at the end of this section.

Testing Tools
All testing tools should be identified and their use, ownership and dependencies defined. The tools category includes manual tools, such as templates in spreadsheets and documents, as well as automated tools for test management, defect tracking, regression testing and performance/load testing. Any specific skill sets should also be identified and compared against the skills already identified for the project, to highlight any required training.

Defect Resolution
A plan to address the resolution of failed tests needs to be created, listing the escalation procedures for seeking correction and retesting of failed tests, along with a risk mitigation plan for high-risk tests. Defect tracking should include basic metrics for compliance based on the number and type of defects found.

Roles and Responsibilities
Another item to be prepared is a matrix listing the roles and responsibilities of everyone involved in the testing activities, along with the anticipated amount of their time allocated to the project.

Process Improvement
The entire testing process should be focused on process improvement. The strategy should list ways to monitor progress and provide constant feedback. This feedback can serve to enhance the process, deliverables and metrics used in testing. Root cause analysis should be performed on all reported defects to help isolate the true nature of the problem and prevent unnecessary repeat defects.

Deliverables
All deliverables should be defined and their location specified. Common deliverables are Test Plans, Test Cases, Test Scripts, a Test Matrix and a Defect Log.

Schedule
All testing activities should be combined into one master testing schedule. The schedule should include an estimate of time for each task and the dependencies of each. Testing resources should be assigned to each task, and quality gates should be listed to ensure oversight of the entire process.

Environmental Needs
All the requirements of the testing environment need to be listed. Common ones include a description of the environment's use, management, hardware and software, specific tools needed, data loading and security requirements.

Resource Management
The skills of all personnel involved in the testing effort need to be assessed and the gaps noted, so that a comprehensive training program can be designed. Specialty skills that will not be filled by in-house staff will require job descriptions and budgeting.
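As promised under Test Compliance, here is a minimal sketch of how an entry gate might be checked before a system-test cycle begins. The specific criteria are taken from the examples above (environment, data, validated code); representing the gate as a simple checklist is an assumption for illustration, not a prescribed format.

```python
# Hypothetical entry criteria for the system-test phase, evaluated as a
# checklist before the test cycle is allowed to start.
entry_criteria = {
    "test environment available": True,
    "test data loaded": True,
    "code build validated and deployed": False,   # not yet met
    "test cases reviewed and approved": True,
}

unmet = [name for name, met in entry_criteria.items() if not met]

if unmet:
    print("Entry gate FAILED; system test cannot start:")
    for name in unmet:
        print(f"  - {name}")
else:
    print("Entry gate passed; system test may begin.")
```

The same pattern applies to exit criteria and to the pass/fail acceptance criteria at each quality gate.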
Risk and Contingencies
Planning for risk in advance, and for ways to mitigate it, is essential for a robust strategy. A risk assessment that is prioritized by severity and covers technology, resource, schedule and environmental issues should feed a detailed plan to mitigate each "red flag".

Approvals and Workflow
All items on the critical path must go through an approval cycle. The procedures for approval and escalation must be well defined and assigned to resources prior to the start of testing.

Conclusion
The above covers the main sections of a well-drafted and well-documented testing strategy. The more that is included in the Strategy document, the less ambiguity and chance for deviation there will be throughout the project. The completion of the Strategy signals the beginning of the test planning phase. For each type of testing identified in the Master Test Strategy, there should be a Test Plan identifying the components to be tested, the location of the test data, the test environment needs, the test procedures, the resources required and the test schedule. For each plan, a series of test conditions should be identified, so that Test Cases with expected results can be generated for later execution.

In summary, the Strategy and Planning documents are the most critical documents for any successful testing effort. A good source of detail on testing documents is IEEE Std 829-1998, which describes a basic set of software test documents and specifies the form and content of each.

About RCG IT
RCG Information Technology, Inc. (www.rcgit.com) is a global leader in IT professional services with three decades of experience and best practices providing IT strategy and design, application development, integration and management. We are committed to delivery excellence. Specialized solutions include: Business Intelligence, Web services, Application Management, QA and Software Testing, Offshore Delivery and Project Management. RCG IT serves 360 clients and 43 of the Fortune 100 across a range of markets, with a special focus on the financial services, energy/oil & gas, insurance, retail and pharmaceutical industries. RCG IT operates at the Repeatable Level of the Software Engineering Institute's (SEI) Capability Maturity Model® for software processes. RCG IT is based in Edison, New Jersey, with offices nationwide, an Offshore Delivery Center (ODC) in Manila, capital city of the Philippines, and a global recruiting engine. RCG IT's ODC operates at Level 5 of the SEI's CMMI® (Capability Maturity Model Integration®) for software capability.

About the Author
Robert Wegener is RCG IT's National Practice Director for QA & Software Testing and Director of Web Services. He has over 20 years of information and business engineering experience in operations, customer service, transportation, finance, product development, telecommunications and information systems. Mr. Wegener's focus has been to help clients develop a sound and cohesive strategy for their organization, connect the integration layer to the organization's enterprise architecture, and leverage best practices to assure that business and technology requirements have been met. Mr. Wegener also has extensive experience in various process and development methodologies, quality assurance and testing methodologies. He has published numerous articles and is a monthly columnist for DM Review.
He has presented at seminars around the country on software testing and Web services. He holds an MBA and a BA in Computer Science from North Central College.