http://www.xprogramming.com/xpmag/whatisxp.htm
“Extreme Programming is a discipline of software development based on values of simplicity, communication, feedback, and courage. It works by bringing the whole team together in the presence of simple practices, with enough feedback to enable the team to see where they are and to tune the practices to their unique situation.” – Ron Jeffries, 11/08/2001

http://www.fawcette.com/resources/managingdev/methodologies/scrum/
“Scrum is an agile software-management process characterized, among other things, by quick daily meetings to report on project status. Scrum puts a strong emphasis on communication, teamwork, and flexibility.” – Brian Noyes, 28/06/2002
Test early – start “testing” by establishing a Test Strategy / Master Test Plan while requirements are being defined.
Establishing test environments takes time, often more time than the project has; the sooner test environment requirements are known, the sooner acquisition and deployment can commence.
The effort involved in test case development is often underestimated – no test cases, no testing!
IBM-Rational tool installation and tailoring can be problematic in a tightly locked-down environment – allow time to work out problems before test case development commences.
The mind map covers test types that may be required for a project, sorted by IBM-Rational FURPS+ categories.
Test types can be used as a “checklist” during requirements definition to ensure inclusion or exclusion of specific classes of requirements.
Sets of reusable test cases can be generated for most non-functional (i.e. URPS+) test types, and maintained as test assets from project to project.
Functional test cases will depend on the business functions included in each build, but once created can be used for regression testing of “unchanged” functions from build to build.
Volume, stress and performance requirements will have the most impact on test environment characteristics and cost.
Interface requirements (e.g. pipelines into legacy applications, batch queues and dated hardware – cheque printers!) will be the most difficult to implement / simulate reliably in a test environment.
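The “checklist” use of the FURPS+ categories can be sketched in a few lines of Python. This is a hypothetical illustration, not the project's actual mind map: the test types listed under each category are example entries, and the `checklist` helper simply reports which test types have no explicit include/exclude decision yet.

```python
# Example test types grouped by FURPS+ category (illustrative, not exhaustive).
FURPS_PLUS = {
    "Functionality": ["feature", "security", "interoperability"],
    "Usability": ["user interface", "accessibility", "documentation"],
    "Reliability": ["recoverability", "failover", "accuracy"],
    "Performance": ["response time", "throughput", "volume", "stress"],
    "Supportability": ["installability", "maintainability", "configurability"],
    "+": ["design constraints", "interface", "physical"],
}

def checklist(decisions):
    """Return (category, test type) pairs with no include/exclude decision."""
    missing = []
    for category, test_types in FURPS_PLUS.items():
        for test_type in test_types:
            if test_type not in decisions:
                missing.append((category, test_type))
    return missing

# During requirements definition, each test type gets an explicit decision.
decisions = {"feature": "include", "response time": "include", "volume": "exclude"}
for category, test_type in checklist(decisions):
    print(f"No decision recorded for {category}: {test_type}")
```

Running the checklist at the end of each requirements workshop makes implicit exclusions visible before test case development starts.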
Test often – use successive test levels to exercise builds against requirements baselines.
Each test level has a different, complementary objective.
Test levels are defined in the Test Strategy / Master Test Plan, along with entry and exit criteria for each level.
TestManager test suites are implemented for each test level, consisting of test scripts (implemented and configured test cases) relevant to the test level objective – e.g. performance test scripts for system testing, functional test scripts for acceptance testing.
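The level/criteria structure above can be modelled as data. The sketch below is an assumption-laden illustration: the level names, objectives and criteria are invented examples (a real Master Test Plan defines its own), and `next_level` shows the gating rule that a build only advances once every exit criterion of its current level is met.

```python
from dataclasses import dataclass

@dataclass
class TestLevel:
    name: str
    objective: str
    entry_criteria: list
    exit_criteria: list

# Illustrative levels only; a real plan defines its own names and criteria.
LEVELS = [
    TestLevel("Integration", "components work together",
              ["unit tests passed"], ["no severity-1 defects open"]),
    TestLevel("System", "build meets performance/stress/volume requirements",
              ["integration exit criteria met"], ["performance targets met"]),
    TestLevel("Acceptance", "build meets business functional requirements",
              ["system exit criteria met"], ["client sign-off"]),
]

def next_level(current, criteria_met):
    """Advance to the next level only when every exit criterion is met."""
    if all(c in criteria_met for c in LEVELS[current].exit_criteria):
        return current + 1
    return current
```

Making the criteria explicit like this is what allows a TestManager suite to be assembled per level: each suite contains only the scripts relevant to that level's objective.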
IBM-Rational tools licence servers – 24/7 uptime required.
Network bandwidth must support the volume required to test builds against performance, stress and volume requirements.
IBM-Rational Administrator project files, RequisitePro databases, ClearQuest databases and TestManager datastores must be set up on shared servers available to all team members.
Do not put databases / datastores on already loaded servers – once a server fills up, these tools will not run under “read only” permissions.
Allow for database / datastore growth during testing.
The test automation architecture allows each separate IBM-Rational tool to communicate with the others.
A tester can submit defects into ClearQuest directly from TestManager, as they step through the test results. This tight integration provides the most benefit to the project team.
Once the test automation architecture is established – don’t change it! Unexpected things will break.
Therefore, careful planning is required as to which components will sit where in the test environment before the tools are installed and databases / datastores created.
Requirements are documented in standard RUP MS-Word templates, then imported into and tagged in RequisitePro.
Traceability is established between hierarchical requirements in RequisitePro for both “traced-to” and “traced-from” relationships. SoDA is used to create build plans for each iteration.
Requirements are associated with test cases created in TestManager; coverage reporting between test cases and requirements is then implemented and progress monitored.
Test cases are exported into MS-Word via SoDA, and printed off for review by the project team.
Test cases are implemented as test scripts using ManualTest, Functional Tester, Robot, etc.
Test scripts are combined into test suites and run manually or automatically, depending on how they were implemented.
Test results are stored in test logs, and defects are raised in ClearQuest.
Regular defect reporting and charting is available via ClearQuest (standard EDS Global ClearQuest Schema components).
Test summary reports are created showing the test cases executed for each requirement in scope for a build.
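The coverage-reporting step in this workflow boils down to a simple set operation. The sketch below is a minimal stand-in for what RequisitePro / TestManager do internally: requirement and test case identifiers are hypothetical, and the real association data lives in the tools' datastores.

```python
def coverage_report(in_scope, associations):
    """Split in-scope requirements into covered / uncovered sets.

    associations maps a requirement id to its list of test case ids;
    a requirement with no (or an empty) entry is uncovered.
    """
    covered = {r for r in in_scope if associations.get(r)}
    uncovered = set(in_scope) - covered
    return covered, uncovered

# Hypothetical identifiers for illustration only.
in_scope = ["REQ-1", "REQ-2", "REQ-3"]
associations = {"REQ-1": ["TC-10", "TC-11"], "REQ-2": []}
covered, uncovered = coverage_report(in_scope, associations)
print(f"Coverage: {len(covered)}/{len(in_scope)}")  # Coverage: 1/3
```

Monitoring this ratio per build is what the test summary reports make visible: every requirement in scope should end up in the covered set before the build exits testing.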
Agile Testing
Test Early, Test Often, Test Fast
Sharon Picken - EDS
Foundations of Agile Testing
Process
Environment
People
Tools
An Agile Client <ul><li>Empowered to make funding and scope decisions </li></ul><ul><li>Participated regularly with developers and testers </li></ul><ul><li>Enthusiastic supporter of the new approach (Agile RUP) </li></ul><ul><li>Exercised admin-level control over the development and test environments </li></ul><ul><li>Changed project direction in a timely manner when necessary </li></ul>
An Agile Project <ul><li>Technical Architecture (TA) Pilot </li></ul><ul><li>Highly visible J2EE application architecture pilot covering “most used” business functions </li></ul><ul><li>Small (&lt; 15 people) co-located collaborative team of Architects, BAs, Developers, Testers </li></ul><ul><li>Not delivering critical business processes into Production – medium-risk project </li></ul><ul><li>Risk-based architectural and functional requirements baselines – Risks and Use Cases </li></ul><ul><li>Iterative development and testing of successive application architecture builds - RUP </li></ul><ul><li>Demonstrated business value from the initial build – high risks addressed early </li></ul>