Essential Test Management and Planning

The key to successful testing is effective and timely planning. Rick Craig introduces proven test planning methods and techniques, including the Master Test Plan and level-specific test plans for acceptance, system, integration, and unit testing. Rick explains how to customize an IEEE-829-style test plan and test summary report to fit your organization’s needs. Learn how to manage test activities, estimate test efforts, and achieve buy-in. Discover a practical risk analysis technique to prioritize your testing and become more effective with limited resources. Rick offers test measurement and reporting recommendations for monitoring the testing process. Discover new methods and develop renewed energy for taking your organization’s test management to the next level.


Essential Test Management and Planning

  1. AM Tutorial, 4/7/2014, 8:30 AM: "Essential Test Management and Planning." Presented by: Rick Craig, Software Quality Engineering. Brought to you by: Software Quality Engineering, 340 Corporate Way, Suite 300, Orange Park, FL 32073. 888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
  2. Rick Craig, Software Quality Engineering. A consultant, lecturer, author, and test manager, Rick Craig has led numerous teams of testers on both large and small projects. In his twenty-five years of consulting worldwide, Rick has advised and supported a diverse group of organizations on many testing and test management issues. From large insurance providers and telecommunications companies to smaller software services companies, he has mentored senior software managers and helped test teams improve their effectiveness. Rick is coauthor of Systematic Software Testing and is a frequent speaker at testing conferences, including every STAR conference since its inception.
  3. Essential Test Management and Planning. Rick Craig, rcraig@sqe.com. © 2014 SQE Training, V3.2. Administrivia: course timing and breaks.
  4. Course Agenda: 1. The Culture of Testing and Quality; 2. Introduction to STEP and Preventive Testing; 3. Test Levels; 4. Master Test Plan; 5. The Test Summary Report. Section 1: The Culture of Testing and Quality.
  5. What are your testing challenges? What is quality? Quality is meeting requirements, stated and/or implied.
  6. What is testing? Testing is the process of determining conformance to requirements, stated and/or implied. Class Questionnaire: The overall quality of the software systems/products at my organization is: Outstanding (one of the best); Acceptable (OK); Poor (must be improved); Unknown (a mystery to me).
  7. Class Questionnaire: The time, effort, and money that my organization spends trying to achieve high software quality is: Too much (needs to be reduced); About right (OK); Too little (needs to be increased); Unknown (a mystery to me). Corporate Culture: Them and Us vs. One Team; Early vs. Late.
  8. Economics of Test and Failure: the cost of testing, the cost of failure, and the savings of preventive testing and defect prevention. Cost to correct a defect in the 1970s (TRW, IBM, and Rockwell landmark study): Requirements $139; Design $455; Code $977; Test $7,136; Production $14,102 (a worked example follows below). Software Psychology: What is "good enough"? (a chart of number of bugs over time)
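To make slide 8's economics concrete, here is a minimal Python sketch, not from the course materials: it applies the study's cost-to-correct figures to two hypothetical defect-detection profiles. The defect counts are invented for illustration.

```python
# Cost to correct a defect by the phase in which it is found,
# using the 1970s TRW/IBM/Rockwell figures quoted on the slide.
COST_TO_CORRECT = {
    "requirements": 139,
    "design": 455,
    "code": 977,
    "test": 7_136,
    "production": 14_102,
}

def total_cost(defects_found: dict[str, int]) -> int:
    """Total correction cost over the phases in which defects were found."""
    return sum(COST_TO_CORRECT[phase] * count
               for phase, count in defects_found.items())

# Two hypothetical profiles for the same 100 defects.
early = {"requirements": 40, "design": 30, "code": 20, "test": 8, "production": 2}
late  = {"requirements": 5, "design": 10, "code": 15, "test": 45, "production": 25}

print(f"early detection: ${total_cost(early):,}")  # $124,042
print(f"late detection:  ${total_cost(late):,}")   # $693,570
```

Even with invented counts, shifting detection left cuts the bill by roughly a factor of five, which is the slide's point.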
  9. Section 2: Introduction to STEP and Preventive Testing. Select a Method/Process. Example testing methodologies: Systematic Test and Evaluation Process (STEP™), from SQE; MS Test Process, from Microsoft; TMap®; Exploratory Testing.
  10. STEP™ level plans: Acceptance, System, Integration, and Unit, each proceeding through Plan, Acquire (analyze, design, implement), and Measure. STEP™ Activities. Plan strategy: P1. Establish master test plan; P2. Develop detailed test plans. Acquire testware: A1. Analyze test objectives; A2. Design tests; A3. Implement plans and designs. Measure software and testware: M1. Execute tests; M2. Check test set adequacy; M3. Evaluate software and process.
  11. Level Timing: master test planning and the acceptance and system test plans begin as early as the project plan and requirements specification; the integration and unit test plans begin with high-level and detailed design; each level then proceeds through its Acquire and Measure activities during implementation, guided by methodology, standards, and guidelines. Preventive Testing: testing begins early (i.e., during requirements), and test cases are used as requirements models; testware design leads software design; defects are detected earlier or prevented altogether; defects are analyzed systematically; testers and developers work together.
  12. Opportunities for Defects: business needs, requirements, design, coding, installation, and operation. Testers Improve Requirements. Testers ask: What does this mean? Are there any unspecified situations? How can this requirement be adequately demonstrated? What problems might occur? Tests define and help specify requirements: "A function/method or constraint specification is unfinished until its test is defined."
  13. Two Views. Food and Drug Administration (FDA): "You CAN'T test quality into your software." SQE: "You MUST test quality into your software." Section 3: Test Levels.
  14. What is a Level? A level is defined by the environment. An environment is the collection of: hardware, system software, application software, documentation, people, interfaces, and data. The "V" Model: requirements pair with acceptance testing, high-level design with system testing, detailed design with integration testing, and coding with unit testing; test planning for each level begins at the corresponding development phase.
  15. What determines the number of levels? Risk, politics, organization, and objectives. Acceptance Testing. Objective: to demonstrate a system's readiness for operational use and customer acceptance. Focuses on user requirements; follows system testing, and the system is expected to work; is a final stage of confidence building; provides protection to ensure production readiness; allows customers to "sign off" on the product. Ends: when the system is approved for use.
  16. System Testing. Objective: to develop confidence that a system is ready for acceptance testing; it forms the basis of a regression test set and often represents the bulk of the testing effort. The most comprehensive testing; usually tests software design and requirements; usually consumes most test resources. Ends: when a system is turned over for acceptance testing or moved into production. Integration Testing. Objective: to progressively test the interfaces between units and modules. Focuses on interfaces; can be top-down, bottom-up, or functional. Ends: when the entire system has been integrated and stability has been demonstrated.
  17. Unit Testing. Objective: to determine that each unit meets its requirements (program specification) and that internal consistency has been achieved. Can be black-box as well as white-box; the primary means of code coverage (a sketch follows below). Ends: when each unit has met its exit criteria. Section 4: Master Test Plan.
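As a minimal sketch of slide 17's point that unit tests can be black-box (specification-driven) as well as white-box (structure-driven), consider the pytest example below; the `apply_discount` function is a hypothetical unit under test, not course material.

```python
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical unit under test: apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Black-box case: derived from the program specification alone.
def test_ten_percent_off():
    assert apply_discount(50.00, 10) == 45.00

# White-box case: targets the validation branch visible in the code,
# contributing to the code coverage the slide mentions.
def test_out_of_range_percent_raises():
    with pytest.raises(ValueError):
        apply_discount(50.00, 150)
```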
  18. Master Test Plan: the master test plan drives the acceptance, system, integration, and unit test plans, which in turn drive the test cases. Process vs. Documentation: the process may be more important than the product (i.e., the paper).
  19. Audience: who will read a Master Test Plan? IEEE-Style Master Test Plan (2008): 1. Introduction: document identifier; scope; references; system overview and key features; test overview (organization, master test schedule, integrity level schema, resources summary, responsibilities, tools, techniques, methods, and metrics). 2. Details of the MTP: test processes, including definition of test levels for the management, acquisition, supply, development, operation, and maintenance processes; test documentation requirements; test administration; test reporting. 3. General: glossary; document change procedures and history.
  20. Master Test Plan Sections: 1. Test Plan Identifier; 2. Introduction; 3. Test Items; 4. Features to Be Tested; 5. Features Not to Be Tested; 6. Software Risks; 7. Planning Risks and Contingencies; 8. Approach; 9. Item Pass/Fail Criteria; 10. Suspension Criteria/Resumption Requirements; 11. Test Deliverables; 12. Testing Tasks; 13. Environmental Needs; 14. Responsibilities; 15. Staffing and Training Needs; 16. Schedule; 17. Approvals. Sections 1 and 2, Test Plan Identifier and Introduction: the scope of the project and the scope of the plan (functional).
  21. Section 3, Test Items: what is to be tested, from the library management perspective. Which build/version (e.g., Version 2.1) of: application software, documentation, databases, etc. Sections 4 and 5, Features to Be Tested / Features Not to Be Tested: to test or not to test, that is the question!
  22. Section 6, Risk Analysis. Two kinds of risk: software/product risk, which drives testing priority, and project/planning risk, which drives contingencies and risk mitigation. Example software/product risks: scalability, scope of use, environment, accessibility, functional complexity, performance, reliability, usability, interface complexity, technical complexity, third-party products, data integrity, recoverability, and new technology.
  23. Section 6, Inventory Features and Attributes. Features: upload a file; add a record; log on to the system; update a user profile; generate a report. Attributes: accessibility, reliability, security, performance, backward compatibility. Risk likelihood and impact: each is rated High, Medium, or Low.
  24. Section 6, Initial Risk Priority: risk (test) priority = likelihood × impact, where High = 3, Medium = 2, and Low = 1, giving priorities from 1 through 9 (impact can also be expressed as potential dollar loss). Risk Inventory Example for a web site (attribute: likelihood, impact, priority): Spelling: High, Low, 3. Invalid mail-to: Low, Medium, 2. Email viruses: Medium, Medium, 4. Wrong tel #s: Low, High, 3. Slow performance: High, High, 9. Poor usability: Medium, Medium, 4. Ugly site: Low, Medium, 2. Does not work with Browser X: Medium, High, 6. Hacker spam attack: Low, Medium, 2. Site intrusion: Low, High, 3.
  25. Section 6, Risk Inventory Example (cont.): the same inventory sorted by priority. Slow performance: High, High, 9. Does not work with Browser X: Medium, High, 6. Poor usability: Medium, Medium, 4. Email viruses: Medium, Medium, 4. Spelling: High, Low, 3. Wrong tel #s: Low, High, 3. Site intrusion: Low, High, 3. Invalid mail-to: Low, Medium, 2. Ugly site: Low, Medium, 2. Hacker spam attack: Low, Medium, 2.
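The arithmetic on slides 24-25 is easy to automate. Below is a minimal Python sketch, not from the course, that scores the slide's web-site inventory with the High = 3, Medium = 2, Low = 1 scale and sorts it by priority.

```python
RATING = {"High": 3, "Medium": 2, "Low": 1}

# (attribute, likelihood, impact) taken from the slide's inventory.
INVENTORY = [
    ("Slow performance", "High", "High"),
    ("Does not work with Browser X", "Medium", "High"),
    ("Poor usability", "Medium", "Medium"),
    ("Email viruses", "Medium", "Medium"),
    ("Spelling", "High", "Low"),
    ("Wrong tel #s", "Low", "High"),
    ("Site intrusion", "Low", "High"),
    ("Invalid mail-to", "Low", "Medium"),
    ("Ugly site", "Low", "Medium"),
    ("Hacker spam attack", "Low", "Medium"),
]

def priority(likelihood: str, impact: str) -> int:
    """Initial risk (test) priority = likelihood rating x impact rating."""
    return RATING[likelihood] * RATING[impact]

# Highest-priority risks first, reproducing the sorted "(cont.)" slide.
for name, lik, imp in sorted(INVENTORY, key=lambda r: priority(r[1], r[2]),
                             reverse=True):
    print(f"{priority(lik, imp)}  {name} ({lik} likelihood, {imp} impact)")
```

Sorting this way makes "where to make the cut" a one-line decision when resources run short.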
  26. Section 7, Class Questionnaire: What are your greatest planning risks? What are your contingencies? Example planning risk checklist: delivery dates; staff availability; lack of product knowledge; requirements; budget; environment options; tool inventory; acquisition schedule; participant buy-in/marketing; risk assumptions; usage assumptions; resource availability; training needs; scope of testing.
  27. Section 7, Contingencies. Contingency options: add time; reduce scope/size; add resources; reduce quality (accept more risk). Section 8, Approach/Strategy: the set of directions that has a major impact on the effectiveness or efficiency of the testing effort (i.e., the forks in the road).
  28. Section 8, Methodology (Political) Decisions: When will testers become involved in the project? What testing levels will be employed (acceptance, system, integration, unit)? How many (if any) beta sites will be used? Will there be a pilot? What testing techniques will be utilized (inspections, walkthroughs)? Entrance and Exit Criteria Decisions: What criteria will be used? Are the criteria flexible/negotiable? Who decides?
  29. Section 8, Smoke Tests as Entrance Criteria: smoke tests are often used to evaluate whether or not a release is REALLY ready for system testing (a sketch follows below). Test Coverage Decisions: what strategy should be used for determining "where to make the cut"? Functional coverage should be measured for every test domain; program-based coverage should be used on all code written (given sufficient resources) and as a criterion for finishing unit testing.
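What might slide 29's smoke-test gate look like in practice? A minimal sketch is below, assuming a hypothetical system under test that exposes `/health` and `/login` endpoints at a test-environment URL; none of these names come from the course.

```python
# smoke_test.py -- a tiny gate run before a build enters system testing.
import urllib.request

BASE_URL = "http://localhost:8080"  # hypothetical test-environment address

def test_health_endpoint_responds():
    """The build must at least start up and answer its health check."""
    with urllib.request.urlopen(f"{BASE_URL}/health", timeout=5) as resp:
        assert resp.status == 200

def test_login_page_loads():
    """One critical user-facing path; a broken build fails fast here."""
    with urllib.request.urlopen(f"{BASE_URL}/login", timeout=5) as resp:
        assert resp.status == 200

# Entrance criterion: `pytest smoke_test.py` must exit 0; otherwise the
# release is not REALLY ready for system testing.
```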
  30. Section 8, S/W Configuration Management Decisions: Who owns/is responsible for each component? How many environments will be used, and what will each be used for? How often will the SUT be updated? How much regression testing is required for each build (full or partial)? Who controls changes to the test environment? Full Regression Strategy. For: easy to manage (one test set per level); lowest risk. Against: heavy and often impractical resource requirements (time and dollars). Full regression usually requires maintenance of full, effective test sets.
  31. Section 8, Partial Regression Strategy. For: solves the execution resource problem; the only viable strategy for emergency fixes. Against: must develop and maintain adequate relationship information between functions and tests and between functions and components; must know which components have changed; must decide which features need to be re-tested and assemble the subsets; adds risk for features not re-tested. (A test-selection sketch follows below.) Test Environment(s) Decisions: Will multiple platforms be needed? What hardware will be used? Data: where will it come from, how much volume will be needed, how fresh (fragility) must it be, and how inter-dependent (referential integrity) does it need to be? What user profiles will be used? Are simulators or other "test" software required?
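The "relationship information between functions and tests" that slide 31 requires can start as a simple mapping. Here is a hedged sketch with invented component and test names showing how a partial-regression subset might be selected from a list of changed components.

```python
# Traceability: which tests exercise which components (names are invented).
TESTS_BY_COMPONENT = {
    "billing": {"test_invoice_totals", "test_late_fees"},
    "auth": {"test_login", "test_password_reset"},
    "reports": {"test_monthly_report", "test_invoice_totals"},
}

def select_regression_tests(changed: set[str]) -> set[str]:
    """Union of all tests that touch any changed component."""
    selected: set[str] = set()
    for component in changed:
        # An unknown component yields no tests here; a cautious team
        # would instead fall back to a full regression run.
        selected |= TESTS_BY_COMPONENT.get(component, set())
    return selected

# A build that changed billing and reports re-runs three tests, not five.
print(sorted(select_regression_tests({"billing", "reports"})))
```

The residual risk the slide warns about lives in exactly the tests this function does not return.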
  32. Section 8, Automation Decisions. The four questions: What? When? Who? How? Automated testing tools are not THE answer.
  33. Section 9, Item Pass/Fail Criteria. Pass/fail criteria can be expressed using a number of different measurements: percentage of test cases passed/failed; number of bugs, by type, severity, and location. Bugs and/or test cases may be "weighted." Section 10, Suspension and Resumption Criteria. Examples of suspension criteria include: blocking bugs; test case failures > 10%; GUI response > 5 seconds; requirements < 80% complete; environmental problems.
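Sections 9 and 10 of the plan lend themselves to simple computation. Below is a minimal sketch; the severity weights are invented to illustrate the note that bugs "may be weighted," while the 10% failure threshold is taken from the slide's examples.

```python
# Invented severity weights illustrating "weighted" bug counts.
SEVERITY_WEIGHT = {"critical": 10, "major": 3, "minor": 1}
FAILURE_SUSPEND_THRESHOLD = 0.10  # "test case failures > 10%" from the slide

def weighted_bug_score(open_bugs: list[str]) -> int:
    """Sum severity weights over the currently open bugs."""
    return sum(SEVERITY_WEIGHT[severity] for severity in open_bugs)

def should_suspend(executed: int, failed: int, blocking_bugs: int) -> bool:
    """Apply two of the slide's example suspension criteria."""
    if blocking_bugs > 0:
        return True
    return executed > 0 and failed / executed > FAILURE_SUSPEND_THRESHOLD

print(weighted_bug_score(["critical", "minor", "minor"]))        # 12
print(should_suspend(executed=200, failed=25, blocking_bugs=0))  # True: 12.5% > 10%
```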
  34. Section 11, Testing Deliverables: test plan, test design specification, test case specification, test procedure specification, test log, test incident report, and test summary report. Section 12, Testing Tasks: the to-do list.
  35. Section 13, Environmental Needs: hardware configuration, data, interfaces, facilities, publications, security access, system software, and documentation. Section 14, Responsibilities.
  36. Section 15, Staffing and Training Needs. A recipe for successful testing: take 1 experienced project leader; add 5 senior testers, plus a handful of novice testers; throw in liberal amounts of training; allow to mix, and then add tools to taste; garnish with appropriate amounts of hardware. Section 16, Schedule: a timeline with milestones.
  37. Section 16, Class Discussion: Where do milestones come from? ("I need it today!") Marketing/business decisions; wishful thinking; estimates/guesstimates; politics; WAGs. Section 17, Approvals: Who calls the shots? Users? The product manager? The development manager? The test manager?
  38. Test Summary Report: report identifier; references (test items with revision numbers, environments, the test plan); variances/deviations (from the test plan or requirements, with reasons for the deviations); summary of incidents (resolved incidents, defect patterns, unresolved incidents); adequacy assessment (evaluation of coverage, identification of uncovered attributes); summary of activities (system/CPU usage, staff time, elapsed time); software evaluation (limitations, failure likelihood); approvals. Shameless commercial message: rcraig@sqe.com.
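As a closing sketch, the sections of slide 38's test summary report could be captured as a structured record so that reports stay consistent from release to release; the field names follow the slide, and everything else (types, defaults, sample values) is an assumption.

```python
from dataclasses import dataclass, field

@dataclass
class TestSummaryReport:
    """Record mirroring the IEEE-829-style summary report on slide 38."""
    report_identifier: str
    test_items: list[str]            # references, with revision numbers
    environments: list[str]
    variances: list[str]             # deviations from plan or requirements
    resolved_incidents: int = 0
    unresolved_incidents: int = 0
    coverage_evaluation: str = ""    # adequacy assessment
    staff_hours: float = 0.0         # summary of activities
    elapsed_days: float = 0.0
    software_evaluation: str = ""    # limitations, failure likelihood
    approvals: list[str] = field(default_factory=list)

# Hypothetical usage:
report = TestSummaryReport(
    report_identifier="TSR-2014-001",
    test_items=["Inventory application v2.1"],
    environments=["System test lab"],
    variances=["Performance tests deferred: load generator unavailable"],
    resolved_incidents=42,
    unresolved_incidents=3,
)
print(report.report_identifier, report.unresolved_incidents)
```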
