About the Authors

With over a quarter-century of software and systems engineering experience, Rex Black is President of Rex Black Consulting Services (www.rbcs-us.com), a leader in software, hardware, and systems testing. RBCS delivers consulting, outsourcing, and training services, employing the industry’s most experienced and recognized consultants. RBCS worldwide clientele save time and money through improved product development, decreased tech support calls, improved corporate reputation, and more.

Rex is the most prolific author practicing in the field of software testing today. His popular first book, Managing the Testing Process, has sold over 40,000 copies around the world, and is now in its third edition. His six other books—Advanced Software Testing: Volumes I, II, and III, Critical Testing Processes, Foundations of Software Testing, and Pragmatic Software Testing—have also sold tens of thousands of copies. He has written over thirty articles; presented hundreds of papers, workshops, and seminars; and given about fifty keynote and other speeches at conferences and events around the world. Rex is the immediate past President of the International Software Testing Qualifications Board (ISTQB) and a Director of the American Software Testing Qualifications Board (ASTQB).

Jamie L. Mitchell has over 28 years of experience in developing and testing both hardware and software. He is a pioneer in the test automation field and has worked with a variety of vendor and open-source test automation tools since the first Windows tools were released with Windows 3.0. He has also written test tools for several platforms.

Jamie specializes in increasing the productivity of automation and test teams through innovative ideas and custom tool extensions. In addition, he provides training, mentoring, process auditing, and expert technical support in all aspects of testing and automation.

Jamie holds a Master of Computer Science degree from Lehigh University in Bethlehem, PA, and a Certified Software Test Engineer certification from QAI. He was an instructor and board member of the International Institute of Software Testing (IIST) and a contributing editor, technical editor, and columnist for the Journal of Software Testing Professionals. He has been a frequent speaker on testing and automation at several international conferences, including STAR, QAI, and PSQT.

Rex Black · Jamie L. Mitchell

Advanced Software Testing—Vol. 3

Guide to the ISTQB Advanced Certification as an Advanced Technical Test Analyst

Rex Black (rex_black@rbcs-us.com)
Jamie L. Mitchell (jamie@go-tac.com)

Editor: Dr. Michael Barabas
Project manager: Matthias Rossmanith
Copyeditor: Judy Flynn
Layout and Type: Josef Hegele
Proofreader: James Johnson
Cover Design: Helmut Kraus, www.exclam.de
Printer: Courier
Printed in USA
ISBN: 978-1-933952-39-0
1st Edition © 2011 by Rex Black and Jamie L. Mitchell

Rocky Nook
802 East Cota Street, 3rd Floor
Santa Barbara, CA 93103
www.rockynook.com

Library of Congress Cataloging-in-Publication Data:
Black, Rex, 1964-
Advanced software testing : guide to the ISTQB advanced certification as an advanced technical test analyst / Rex Black, Jamie L. Mitchell. -- 1st ed.
p. cm. -- (Advanced software testing)
ISBN 978-1-933952-19-2 (v. 1 : alk. paper)
ISBN 978-1-933952-36-9 (v. 2 : alk. paper)
1. Electronic data processing personnel--Certification. 2. Computer software--Examinations--Study guides. 3. Computer software--Testing. I. Title.
QA76.3.B548 2008
005.14-dc22
2008030162

Distributed by O’Reilly Media
1005 Gravenstein Highway North
Sebastopol, CA 95472-2811

All product names and services identified throughout this book are trademarks or registered trademarks of their respective companies. They are used throughout this book in editorial fashion only and for the benefit of such companies. No such uses, or the use of any trade name, is intended to convey endorsement or other affiliation with the book. No part of the material protected by this copyright notice may be reproduced or utilized in any form, electronic or mechanical, including photocopying, recording, or by any information storage and retrieval system, without written permission from the copyright owner.

This book is printed on acid-free paper.

Rex Black’s Acknowledgements

A complete list of people who deserve thanks for helping me along in my career as a test professional would probably make up its own small book. Here I’ll confine myself to those who had an immediate impact on my ability to write this particular book.

First of all, I’d like to thank my colleagues on the American Software Testing Qualifications Board and the International Software Testing Qualifications Board, and especially the Advanced Syllabus Working Party, who made this book possible by creating the process and the material from which this book grew. Not only has it been a real pleasure sharing ideas with and learning from each of the participants, but I have had the distinct honor of being elected president of both the American Software Testing Qualifications Board and the International Software Testing Qualifications Board twice. I spent two terms in each of these roles, and I continue to serve as a board member, as the ISTQB Governance Officer, and as a member of the ISTQB Advanced Syllabus Working Party. I look back with pride at our accomplishments so far, I look forward with pride to what we’ll accomplish together in the future, and I hope this book serves as a suitable expression of the gratitude and professional pride I feel toward what we have done for the field of software testing.

Next, I’d like to thank the people who helped us create the material that grew into this book. Jamie Mitchell co-wrote the materials in this book, our Advanced Technical Test Analyst instructor-led training course, and our Advanced Technical Test Analyst e-learning course. These materials, along with related materials in the corresponding Advanced Test Analyst book and courses, were reviewed, re-reviewed, and polished with hours of dedicated assistance by José Mata, Judy McKay, and Pat Masters. In addition, James Nazar, Corne Kruger, John Lyerly, Bhavesh Mistry, and Gyorgy Racz provided useful feedback on the first draft of this book. The task of assembling the e-learning and live courseware from the constituent bits and pieces fell to Dena Pauletti, RBCS’ extremely competent and meticulous systems engineer.

Of course, the Advanced syllabus could not exist without a foundation, specifically the ISTQB Foundation syllabus. I had the honor of working with that Working Party as well. I thank them for their excellent work over the years, creating the fertile soil from which the Advanced syllabus and thus this book sprang.

In the creation of the training courses and the materials that I contributed to this book, I have drawn on all the experiences I have had as an author, practitioner, consultant, and trainer. So, I have benefited from individuals too numerous to list. I thank those of you who have bought one of my previous books, for you contributed to my skills as a writer. I thank those of you who have worked with me on a project, for you have contributed to my abilities as a test manager, test analyst, and technical test analyst. I thank those of you who have hired me to work with you as a consultant, for you have given me the opportunity to learn from your organizations. I thank those of you who have taken a training course from me, for you have collectively taught me much more than I taught each of you. I thank my readers, colleagues, clients, and students, and hope that my contributions to you have repaid the debt of gratitude that I owe you.

For over a dozen years, I have run a testing services company, RBCS. From humble beginnings, RBCS has grown into an international consulting, training, and outsourcing firm with clients on six continents. While I have remained a hands-on contributor to the firm, over 100 employees, subcontractors, and business partners have been the driving force of our ongoing success. I thank all of you for your hard work for our clients. Without the success of RBCS, I could hardly avail myself of the luxury of writing technical books, which is a source of great pride but not a whole lot of money. Again, I hope that our mutual successes together have repaid the debt of gratitude that I owe each of you.

Finally, I thank my family, especially my wife, Laurel, and my daughters, Emma and Charlotte. Tolstoy was wrong: It is not true that all happy families are exactly the same. Our family life is quite hectic, and I know I miss a lot of it thanks to the endless travel and work demands associated with running a global testing services company and writing books. However, I’ve been able to enjoy seeing my daughters grow up as citizens of the world, with passports given to them before their first birthdays and full of stamps before they started losing their baby teeth. Laurel, Emma, and Charlotte, I hope the joys of December beach sunburns in the Australian summer sun of Port Douglas, learning to ski in the Alps, hikes in the Canadian Rockies, and other experiences that frequent flier miles and an itinerant father can provide, make up in some way for the limited time we share together. I have won the lottery of life to have such a wonderful family.

Jamie Mitchell’s Acknowledgements

What a long, strange trip it’s been. The last 25 years have taken me from being a bench technician, fixing electronic audio components, to this time and place, where I have cowritten a book on some of the most technical aspects of software testing. It’s a trip that has been both shared and guided by a host of people that I would like to thank.

To the many at both Moravian College and Lehigh University who started me off in my “Exciting Career in Computers”: for your patience and leadership that instilled in me a burning desire to excel, I thank you.

To Terry Schardt, who hired me as a developer but made me a tester, thanks for pushing me to the dark side. To Tom Mundt and Chuck Awe, who gave me an incredible chance to lead, and to Barindralal Pal, who taught me that to lead was to keep on learning new techniques, thank you.

To Dean Nelson, who first asked me to become a consultant, and Larry Decklever, who continued my training, many thanks. A shout-out to Beth and Jan, who participated with me in choir rehearsals at Joe Sensers when things were darkest. Have one on me.

To my colleagues at TCQAA, SQE, and QAI who gave me chances to develop a voice while I learned how to speak, my heartfelt gratitude. To the people I am working with at ISTQB and ASTQB: I hope to be worthy of the honor of working with you and expanding the field of testing. Thanks for the opportunity.

In my professional life, I have been tutored, taught, mentored, and shown the way by a host of people whose names deserve to be mentioned, but they are too abundant to recall. I would like to give all of you a collective thanks; I would be poorer for not knowing you.

To Rex Black for giving me a chance to coauthor the Advanced Technical Test Analyst course and this book: Thank you for your generosity and the opportunity to learn at your feet. For my partner in crime, Judy McKay: Even though our first tool attempt did not fly, I have learned a lot from you and appreciate both your patience and kindness. Hoist a fruity drink from me. To Laurel, Dena, and Michelle: Your patience with me is noted and appreciated. Thanks for being there.

And finally, to my family, who have seen so much less of me over the last 25 years than they might have wanted, as I strove to become all that I could be in my chosen profession: words alone cannot convey my thanks. To Beano, who spent innumerable hours helping me steal the time needed to get through school and set me on the path to here, my undying love and gratitude. To my loving wife, Susan, who covered for me at many of the real-life tasks while I toiled, trying to climb the ladder, my love and appreciation. I might not always remember to say it, but I do think it. And to my kids, Christopher and Kimberly, who have always been smarter than me but allowed me to pretend that I was the boss of them, thanks. Your tolerance and enduring support have been much appreciated.

Last, and probably least, to “da boys,” Baxter and Boomer, Bitzi and Buster. Whether sitting in my lap while I was trying to learn how to test or sitting at my feet while writing this book, you guys have been my sanity check. You never cared how successful I was, as long as the doggie-chow appeared in your bowls, morning and night. Thanks.

Contents

Rex Black’s Acknowledgements
Jamie Mitchell’s Acknowledgements
Introduction

1 Test Basics
   1.1 Introduction
   1.2 Testing in the Software Lifecycle
   1.3 Specific Systems
   1.4 Metrics and Measurement
   1.5 Ethics
   1.6 Sample Exam Questions

2 Testing Processes
   2.1 Introduction
   2.2 Test Process Models
   2.3 Test Planning and Control
   2.4 Test Analysis and Design
      2.4.1 Non-functional Test Objectives
      2.4.2 Identifying and Documenting Test Conditions
      2.4.3 Test Oracles
      2.4.4 Standards
      2.4.5 Static Tests
      2.4.6 Metrics
   2.5 Test Implementation and Execution
      2.5.1 Test Procedure Readiness
      2.5.2 Test Environment Readiness
      2.5.3 Blended Test Strategies
      2.5.4 Starting Test Execution
      2.5.5 Running a Single Test Procedure
      2.5.6 Logging Test Results
      2.5.7 Use of Amateur Testers
      2.5.8 Standards
      2.5.9 Metrics
   2.6 Evaluating Exit Criteria and Reporting
      2.6.1 Test Suite Summary
      2.6.2 Defect Breakdown
      2.6.3 Confirmation Test Failure Rate
      2.6.4 System Test Exit Review
      2.6.5 Standards
      2.6.6 Evaluating Exit Criteria and Reporting Exercise
      2.6.7 System Test Exit Review
      2.6.8 Evaluating Exit Criteria and Reporting Exercise Debrief
   2.7 Test Closure Activities
   2.8 Sample Exam Questions

3 Test Management
   3.1 Introduction
   3.2 Test Management Documentation
   3.3 Test Plan Documentation Templates
   3.4 Test Estimation
   3.5 Scheduling and Test Planning
   3.6 Test Progress Monitoring and Control
   3.7 Business Value of Testing
   3.8 Distributed, Outsourced, and Insourced Testing
   3.9 Risk-Based Testing
      3.9.1 Risk Management
      3.9.2 Risk Identification
      3.9.3 Risk Analysis or Risk Assessment
      3.9.4 Risk Mitigation or Risk Control
      3.9.5 An Example of Risk Identification and Assessment Results
      3.9.6 Risk-Based Testing throughout the Lifecycle
      3.9.7 Risk-Aware Testing Standards
      3.9.8 Risk-Based Testing Exercise 1
      3.9.9 Risk-Based Testing Exercise Debrief 1
      3.9.10 Project Risk By-Products
      3.9.11 Requirements Defect By-Products
      3.9.12 Risk-Based Testing Exercise 2
      3.9.13 Risk-Based Testing Exercise Debrief 2
      3.9.14 Test Case Sequencing Guidelines
   3.10 Failure Mode and Effects Analysis
   3.11 Test Management Issues
   3.12 Sample Exam Questions

4 Test Techniques
   4.1 Introduction
   4.2 Specification-Based
      4.2.1 Equivalence Partitioning
         4.2.1.1 Avoiding Equivalence Partitioning Errors
         4.2.1.2 Composing Test Cases with Equivalence Partitioning
         4.2.1.3 Equivalence Partitioning Exercise
         4.2.1.4 Equivalence Partitioning Exercise Debrief
      4.2.2 Boundary Value Analysis
         4.2.2.1 Examples of Equivalence Partitioning and Boundary Values
         4.2.2.2 Non-functional Boundaries
         4.2.2.3 A Closer Look at Functional Boundaries
         4.2.2.4 Integers
         4.2.2.5 Floating Point Numbers
         4.2.2.6 Testing Floating Point Numbers
         4.2.2.7 How Many Boundaries?
         4.2.2.8 Boundary Value Exercise
         4.2.2.9 Boundary Value Exercise Debrief
      4.2.3 Decision Tables
         4.2.3.1 Collapsing Columns in the Table
         4.2.3.2 Combining Decision Table Testing with Other Techniques
         4.2.3.3 Nonexclusive Rules in Decision Tables
         4.2.3.4 Decision Table Exercise
         4.2.3.5 Decision Table Exercise Debrief
      4.2.4 State-Based Testing and State Transition Diagrams
         4.2.4.1 Superstates and Substates
         4.2.4.2 State Transition Tables
         4.2.4.3 Switch Coverage
         4.2.4.4 State Testing with Other Techniques
         4.2.4.5 State Testing Exercise
         4.2.4.6 State Testing Exercise Debrief
      4.2.5 Requirements-Based Testing Exercise
      4.2.6 Requirements-Based Testing Exercise Debrief
   4.3 Structure-Based
      4.3.1 Control-Flow Testing
         4.3.1.1 Building Control-Flow Graphs
         4.3.1.2 Statement Coverage
         4.3.1.3 Decision Coverage
         4.3.1.4 Loop Coverage
         4.3.1.5 Hexadecimal Converter Exercise
         4.3.1.6 Hexadecimal Converter Exercise Debrief
         4.3.1.7 Condition Coverage
         4.3.1.8 Decision/Condition Coverage
         4.3.1.9 Modified Condition/Decision Coverage (MC/DC)
         4.3.1.10 Multiple Condition Coverage
         4.3.1.11 Control-Flow Exercise
         4.3.1.12 Control-Flow Exercise Debrief
      4.3.2 Path Testing
         4.3.2.1 LCSAJ
         4.3.2.2 Basis Path/Cyclomatic Complexity Testing
         4.3.2.3 Cyclomatic Complexity Exercise
         4.3.2.4 Cyclomatic Complexity Exercise Debrief
      4.3.3 A Final Word on Structural Testing
      4.3.4 Structure-Based Testing Exercise
      4.3.5 Structure-Based Testing Exercise Debrief
   4.4 Defect- and Experience-Based
      4.4.1 Defect Taxonomies
      4.4.2 Error Guessing
      4.4.3 Checklist Testing
      4.4.4 Exploratory Testing
         4.4.4.1 Test Charters
         4.4.4.2 Exploratory Testing Exercise
         4.4.4.3 Exploratory Testing Exercise Debrief
      4.4.5 Software Attacks
         4.4.5.1 An Example of Effective Attacks
         4.4.5.2 Other Attacks
         4.4.5.3 Software Attack Exercise
         4.4.5.4 Software Attack Exercise Debrief
      4.4.6 Specification-, Defect-, and Experience-Based Exercise
      4.4.7 Specification-, Defect-, and Experience-Based Exercise Debrief
      4.4.8 Common Themes
   4.5 Static Analysis
      4.5.1 Complexity Analysis
      4.5.2 Code Parsing Tools
      4.5.3 Standards and Guidelines
      4.5.4 Data-Flow Analysis
      4.5.5 Set-Use Pairs
      4.5.6 Set-Use Pair Example
      4.5.7 Data-Flow Exercise
      4.5.8 Data-Flow Exercise Debrief
      4.5.9 Data-Flow Strategies
      4.5.10 Static Analysis for Integration Testing
      4.5.11 Call-Graph Based Integration Testing
      4.5.12 McCabe Design Predicate Approach to Integration Testing
      4.5.13 Hex Converter Example
      4.5.14 McCabe Design Predicate Exercise
      4.5.15 McCabe Design Predicate Exercise Debrief
   4.6 Dynamic Analysis
      4.6.1 Memory Leak Detection
      4.6.2 Wild Pointer Detection
      4.6.3 API Misuse Detection
   4.7 Sample Exam Questions

5 Tests of Software Characteristics
   5.1 Introduction
   5.2 Quality Attributes for Domain Testing
      5.2.1 Accuracy
      5.2.2 Suitability
      5.2.3 Interoperability
      5.2.4 Usability
      5.2.5 Usability Test Exercise
      5.2.6 Usability Test Exercise Debrief
   5.3 Quality Attributes for Technical Testing
      5.3.1 Technical Security
      5.3.2 Security Issues
      5.3.3 Timely Information
      5.3.4 Reliability
      5.3.5 Efficiency
      5.3.6 Multiple Flavors of Efficiency Testing
      5.3.7 Modeling the System
      5.3.8 Efficiency Measurements
      5.3.9 Examples of Efficiency Bugs
      5.3.10 Exercise: Security, Reliability, and Efficiency
      5.3.11 Exercise: Security, Reliability, and Efficiency Debrief
      5.3.12 Maintainability
      5.3.13 Subcharacteristics of Maintainability
      5.3.14 Portability
      5.3.15 Maintainability and Portability Exercise
      5.3.16 Maintainability and Portability Exercise Debrief
   5.4 Sample Exam Questions

6 Reviews
   6.1 Introduction
   6.2 The Principles of Reviews
   6.3 Types of Reviews
   6.4 Introducing Reviews
   6.5 Success Factors for Reviews
      6.5.1 Deutsch’s Design Review Checklist
      6.5.2 Marick’s Code Review Checklist
      6.5.3 The OpenLaszlo Code Review Checklist
   6.6 Code Review Exercise
   6.7 Code Review Exercise Debrief
   6.8 Deutsch Checklist Review Exercise
   6.9 Deutsch Checklist Review Exercise Debrief
   6.10 Sample Exam Questions

7 Incident Management
   7.1 Introduction
   7.2 When Can a Defect Be Detected?
   7.3 Defect Lifecycle
   7.4 Defect Fields
   7.5 Metrics and Incident Management
   7.6 Communicating Incidents
   7.7 Incident Management Exercise
   7.8 Incident Management Exercise Debrief
   7.9 Sample Exam Questions

8 Standards and Test Process Improvement

9 Test Tools and Automation
   9.1 Introduction
   9.2 Test Tool Concepts
      9.2.1 The Business Case for Automation
      9.2.2 General Test Automation Strategies
      9.2.3 An Integrated Test System Example
   9.3 Test Tool Categories
      9.3.1 Test Management Tools
      9.3.2 Test Execution Tools
      9.3.3 Debugging, Troubleshooting, Fault Seeding, and Injection Tools
      9.3.4 Static and Dynamic Analysis Tools
      9.3.5 Performance Testing Tools
      9.3.6 Monitoring Tools
      9.3.7 Web Testing Tools
      9.3.8 Simulators and Emulators
   9.4 Keyword-Driven Test Automation
      9.4.1 Capture/Replay Exercise
      9.4.2 Capture/Replay Exercise Debrief
      9.4.3 Evolving from Capture/Replay
      9.4.4 The Simple Framework Architecture
      9.4.5 Data-Driven Architecture
      9.4.6 Keyword-Driven Architecture
      9.4.7 Keyword Exercise
      9.4.8 Keyword Exercise Debrief
   9.5 Performance Testing
      9.5.1 Performance Testing Exercise
      9.5.2 Performance Testing Exercise Debrief
   9.6 Sample Exam Questions

10 People Skills and Team Composition
   10.1 Introduction
   10.2 Individual Skills
   10.3 Test Team Dynamics
   10.4 Fitting Testing within an Organization
   10.5 Motivation
   10.6 Communication
   10.7 Sample Exam Questions

11 Preparing for the Exam
   11.1 Learning Objectives
      11.1.1 Level 1: Remember (K1)
      11.1.2 Level 2: Understand (K2)
      11.1.3 Level 3: Apply (K3)
      11.1.4 Level 4: Analyze (K4)
      11.1.5 Where Did These Levels of Learning Objectives Come From?
   11.2 ISTQB Advanced Exams
      11.2.1 Scenario-Based Questions
      11.2.2 On the Evolution of the Exams

Appendix A – Bibliography
   Advanced Syllabus Referenced Standards
   Advanced Syllabus Referenced Books
   Other Referenced Books
   Other References

Appendix B – HELLOCARMS: The Next Generation of Home Equity Lending
System Requirements Document
   I Table of Contents
   II Versioning
   III Glossary
   000 Introduction
...

Introduction

This is a book on advanced software testing for technical test analysts. By that we mean that we address top...

This book focuses on technical test analysis. It consists of 11 chapters, addressing the foll...

...with different emphasis. For example, technical test analysts need to know quite a bit about incident man...

...might not appear. You’ll also notice that each section starts with a text box describing t...

1 Test Basics

“Read the directions and directly you will be directed in the right direction.” A doorkno...

ISTQB Glossary
software lifecycle: The period of time that begins when a software product ...

1.2 Testing in the Software Lifecycle

Let’s look at two examples of alignment. In a sequential lifecycle model, a k...

ISTQB Glossary
system of systems: Multiple heterogeneous, distributed systems that are ...

■ Test techniques that will be applied, as appropriate for the level, for the tea...

In the V-model, with a well-aligned test process, test planning occurs concurrently with project...
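
To make the alignment idea concrete, here is a minimal sketch, in Python, of the pairing the V-model typically implies between development phases and test levels; the mapping below is our illustration of the standard V-model, not code or phase names from the book:

    # Illustrative V-model alignment: each work product on the left arm of
    # the V drives test planning and test design for a matching test level
    # on the right arm.
    V_MODEL_ALIGNMENT = {
        "requirements specification": "acceptance test",
        "functional system design": "system test",
        "architectural design": "integration test",
        "detailed design and coding": "component (unit) test",
    }

    for phase, level in V_MODEL_ALIGNMENT.items():
        print(f"While producing the {phase}, plan and design the {level}.")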

1.3 Specific Systems

Learning objectives: Recall of content only

In this section, we a...

As mentioned earlier, with systems of systems projects, we are typically going to have multip...

[Figure: test levels for a system of systems, including user acceptance test] ...

ISTQB Glossary
safety-critical system: A system whose failure or malfunction may result...

1.4 Metrics and Measurement

ISTQB Glossary
metric: A measurement scale and the method used for measuremen...

Not only must we have metrics and measurements, we also need goals. What is a “good” resul...

You also want to ensure uniform, agreed-upon interpretations of these metrics to mi...
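
As an example of a metric with a precise definition, an agreed interpretation, and a goal, the sketch below computes defect detection percentage (DDP), a commonly used test effectiveness metric; the function name and the 85 percent goal are our illustrative assumptions, not the book's:

    def defect_detection_percentage(found_in_test: int, found_after_release: int) -> float:
        """Share of all known defects that testing caught, as a percentage."""
        total = found_in_test + found_after_release
        if total == 0:
            return 100.0  # nothing was missed because nothing was found anywhere
        return 100.0 * found_in_test / total

    # Example: 90 defects found during testing, 10 escaped to production.
    ddp = defect_detection_percentage(90, 10)
    print(f"DDP = {ddp:.1f}%")                            # DDP = 90.0%
    print("goal met" if ddp >= 85.0 else "goal missed")   # against an agreed goal of 85%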

ISTQB Glossary
ethics: No definition provided in the ISTQB Glossary. ...

1.5 Ethics

Since, as a technical test analyst, you’ll often have access to confidential and privileged information, eth...

...standards possible. For example, if you are working as a consultant and you leave out important...

1.6 Sample Exam Questions

1. You are working as a test analyst at a bank. At the bank, technical test analysts wor...

2 Testing Processes

Do not enter. If the fall does not kill you, the crocodile will. ...

The ISTQB Foundation syllabus describes the ISTQB fundamental test process. It provides a...
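
For orientation, the sketch below lists the activities of the ISTQB fundamental test process as described in the Foundation syllabus; the Python rendering is ours, and in practice several of these activities overlap and iterate rather than running strictly in sequence:

    # The ISTQB fundamental test process, as an ordered list of activities.
    FUNDAMENTAL_TEST_PROCESS = [
        "test planning and control",
        "test analysis and design",
        "test implementation and execution",
        "evaluating exit criteria and reporting",
        "test closure activities",
    ]

    for step, activity in enumerate(FUNDAMENTAL_TEST_PROCESS, start=1):
        print(f"{step}. {activity}")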

2.3 Test Planning and Control

ISTQB Glossary
test planning: The activity of establishing or updating a test plan. ...
  1. 1. About the AuthorsWith over a quarter-century of software and systemsengineering experience, Rex Black is President of RexBlack Consulting Services (www.rbcs-us.com), a leader insoftware, hardware, and systems testing. RBCS deliversconsulting, outsourcing, and training services, employingthe industry’s most experienced and recognizedconsultants. RBCS worldwide clientele save time andmoney through improved product development,decreased tech support calls, improved corporatereputation, and more. Rex is the most prolific author practicing in the field ofsoftware testing today. His popular first book, Managingthe Testing Process, has sold over 40,000 copies around theworld, and is now in its third edition. His six other books—Advanced Software Testing: Volumes I, II, and III, CriticalTesting Processes, Foundations of Software Testing, andPragmatic Software Testing—have also sold tens ofthousands of copies. He has written over thirty articles;presented hundreds of papers, workshops, and seminars;and given about fifty keynote and other speeches atconferences and events around the world. Rex is theimmediate past President of the International SoftwareTesting Qualifications Board (ISTQB) and a Director of theAmerican Software Testing Qualifications Board (ASTQB).Jamie L. Mitchell has over 28 years of experience indeveloping and testing both hardware and software. He isa pioneer in the test automation field and has workedwith a variety of vendor and open-source test automationtools since the first Windows tools were released withWindows 3.0. He has also written test tools for severalplatforms. Jamie specializes in increasing the productivity ofautomation and test teams through innovative ideas andcustom tool extensions. In addition, he provides training,mentoring, process auditing, and expert technicalsupport in all aspects of testing and automation. Jamie holds a Master of Computer Science degreefrom Lehigh University in Bethlehem, PA, and a CertifiedSoftware Test Engineer certification from QAI. He was aninstructor and board member of the InternationalInstitute of Software Testing (IIST) and a contributingeditor, technical editor, and columnist for the Journal ofSoftware Testing Professionals. He has been a frequentspeaker on testing and automation at severalinternational conferences, including STAR, QAI, and PSQT.
  2. 2. Rex Black · Jamie L. MitchellAdvanced SoftwareTesting—Vol. 3Guide to the ISTQB Advanced Certificationas an Advanced Technical Test Analyst
  3. 3. Rex Blackrex_black@rbcs-us.comJamie L. Mitchelljamie@go-tac.comEditor: Dr. Michael BarabasProjectmanager: Matthias RossmanithCopyeditor: Judy FlynnLayout and Type: Josef HegeleProofreader: James JohnsonCover Design: Helmut Kraus, www.exclam.dePrinter: CourierPrinted in USAISBN: 978-1-933952-39-01st Edition © 2011 by Rex Black and Jamie L. Mitchell16 15 14 13 12 11 1 2 3 4 5Rocky Nook802 East Cota Street, 3rd FloorSanta Barbara, CA 93103www.rockynook.comLibrary of Congress Cataloging-in-Publication DataBlack, Rex, 1964- Advanced software testing : guide to the ISTQB advanced certification as an advancedtechnical test analyst / Rex Black, Jamie L. Mitchell.-1st ed. p. cm.-(Advanced software testing)ISBN 978-1-933952-19-2 (v. 1 : alk. paper)-ISBN 978-1-933952-36-9 (v. 2 : alk. paper)1. Electronic data processing personnel-Certification. 2. Computer software-Examinations-Study guides. 3. Computer software-Testing. I. Title. QA76.3.B548 2008 005.14-dc222008030162Distributed by O’Reilly Media1005 Gravenstein Highway NorthSebastopol, CA 95472-2811All product names and services identified throughout this book are trademarks or registeredtrademarks of their respective companies. They are used throughout this book in editorialfashion only and for the benefit of such companies. No such uses, or the use of any tradename, is intended to convey endorsement or other affiliation with the book. No part of thematerial protected by this copyright notice may be reproduced or utilized in any form,electronic or mechanical, including photocopying, recording, or bay any information storageand retrieval system, without written permission from the copyright owner.This book is printed on acid-free paper.
  4. 4. vRex Black’s AcknowledgementsA complete list of people who deserve thanks for helping me along in my careeras a test professional would probably make up its own small book. Here I’ll con-fine myself to those who had an immediate impact on my ability to write thisparticular book. First of all, I’d like to thank my colleagues on the American Software TestingQualifications Board and the International Software Testing QualificationsBoard, and especially the Advanced Syllabus Working Party, who made thisbook possible by creating the process and the material from which this bookgrew. Not only has it been a real pleasure sharing ideas with and learning fromeach of the participants, but I have had the distinct honor of being elected pres-ident of both the American Software Testing Qualifications Board and theInternational Software Testing Qualifications Board twice. I spent two terms ineach of these roles, and I continue to serve as a board member, as the ISTQBGovernance Officer, and as a member of the ISTQB Advanced Syllabus Work-ing Party. I look back with pride at our accomplishments so far, I look forwardwith pride to what we’ll accomplish together in the future, and I hope this bookserves as a suitable expression of the gratitude and professional pride I feeltoward what we have done for the field of software testing. Next, I’d like to thank the people who helped us create the material thatgrew into this book. Jamie Mitchell co-wrote the materials in this book, ourAdvanced Technical Test Analyst instructor-lead training course, and ourAdvanced Technical Test Analyst e-learning course. These materials, alongwith related materials in the corresponding Advanced Test Analyst book andcourses, were reviewed, re-reviewed, and polished with hours of dedicatedassistance by José Mata, Judy McKay, and Pat Masters. In addition, James Nazar,Corne Kruger, John Lyerly, Bhavesh Mistry, and Gyorgy Racz provided usefulfeedback on the first draft of this book. The task of assembling the e-learning
  5. 5. vi Rex Black’s Acknowledgements and live courseware from the constituent bits and pieces fell to Dena Pauletti, RBCS’ extremely competent and meticulous systems engineer. Of course, the Advanced syllabus could not exist without a foundation, spe- cifically the ISTQB Foundation syllabus. I had the honor of working with that Working Party as well. I thank them for their excellent work over the years, cre- ating the fertile soil from which the Advanced syllabus and thus this book sprang. In the creation of the training courses and the materials that I contributed to this book, I have drawn on all the experiences I have had as an author, practi- tioner, consultant, and trainer. So, I have benefited from individuals too numer- ous to list. I thank those of you who have bought one of my previous books, for you contributed to my skills as a writer. I thank those of you who have worked with me on a project, for you have contributed to my abilities as a test manager, test analyst, and technical test analyst. I thank those of you who have hired me to work with you as a consultant, for you have given me the opportunity to learn from your organizations. I thank those of you who have taken a training course from me, for you have collectively taught me much more than I taught each of you. I thank my readers, colleagues, clients, and students, and hope that my contributions to you have repaid the debt of gratitude that I owe you. For over a dozen years, I have run a testing services company, RBCS. From humble beginnings, RBCS has grown into an international consulting, training, and outsourcing firm with clients on six continents. While I have remained a hands-on contributor to the firm, over 100 employees, subcontractors, and business partners have been the driving force of our ongoing success. I thank all of you for your hard work for our clients. Without the success of RBCS, I could hardly avail myself of the luxury of writing technical books, which is a source of great pride but not a whole lot of money. Again, I hope that our mutual suc- cesses together have repaid the debt of gratitude that I owe each of you. Finally, I thank my family, especially my wife, Laurel, and my daughters, Emma and Charlotte. Tolstoy was wrong: It is not true that all happy families are exactly the same. Our family life is quite hectic, and I know I miss a lot of it thanks to the endless travel and work demands associated with running a global testing services company and writing books. However, I’ve been able to enjoy seeing my daughters grow up as citizens of the world, with passports given to them before their first birthdays and full of stamps before they started losing
  6. 6. Jamie Mitchell’s Acknowledgements viitheir baby teeth. Laurel, Emma, and Charlotte, I hope the joys of Decemberbeach sunburns in the Australian summer sun of Port Douglas, learning to skiin the Alps, hikes in the Canadian Rockies, and other experiences that frequentflier miles and an itinerant father can provide, make up in some way for the lim-ited time we share together. I have won the lottery of life to have such a wonder-ful family.Jamie Mitchell’s AcknowledgementsWhat a long, strange trip its been. The last 25 years has taken me from being abench technician, fixing electronic audio components, to this time and place,where I have cowritten a book on some of the most technical aspects of softwaretesting. Its a trip that has been both shared and guided by a host of people that Iwould like to thank. To the many at both Moravian College and Lehigh University who startedme off in my “Exciting Career in Computers”: for your patience and leadershipthat instilled in me a burning desire to excel, I thank you. To Terry Schardt, who hired me as a developer but made me a tester, thanksfor pushing me to the dark side. To Tom Mundt and Chuck Awe, who gave mean incredible chance to lead, and to Barindralal Pal, who taught me that to leadwas to keep on learning new techniques, thank you. To Dean Nelson, who first asked me to become a consultant, and LarryDecklever, who continued my training, many thanks. A shout-out to Beth andJan, who participated with me in choir rehearsals at Joe Sensers when thingswere darkest. Have one on me. To my colleagues at TCQAA, SQE, and QAI who gave me chances todevelop a voice while I learned how to speak, my heartfelt gratitude. To the peo-ple I am working with at ISTQB and ASTQB: I hope to be worthy of the honor
  7. 7. viii Jamie Mitchell’s Acknowledgements of working with you and expanding the field of testing. Thanks for the opportu- nity. In my professional life, I have been tutored, taught, mentored, and shown the way by a host of people whose names deserve to be mentioned, but they are too abundant to recall. I would like to give all of you a collective thanks; I would be poorer for not knowing you. To Rex Black for giving me a chance to coauthor the Advanced Technical Test Analyst course and this book: Thank you for your generosity and the opportunity to learn at your feet. For my partner in crime, Judy McKay: Even though our first tool attempt did not fly, I have learned a lot from you and appreciate both your patience and kindness. Hoist a fruity drink from me. To Laurel, Dena, and Michelle: Your patience with me is noted and appreciated. Thanks for being there. And finally, to my family, who have seen so much less of me over the last 25 years than they might have wanted, as I strove to become all that I could be in my chosen profession: words alone cannot convey my thanks. To Beano, who spent innumerable hours helping me steal the time needed to get through school and set me on the path to here, my undying love and gratitude. To my loving wife, Susan, who covered for me at many of the real-life tasks while I toiled, trying to climb the ladder, my love and appreciation. I might not always remember to say it, but I do think it. And to my kids, Christopher and Kim- berly, who have always been smarter than me but allowed me to pretend that I was the boss of them, thanks. Your tolerance and enduring support have been much appreciated. Last, and probably least, to “da boys,” Baxter and Boomer, Bitzi and Buster. Whether sitting in my lap while I was trying to learn how to test or sitting at my feet while writing this book, you guys have been my sanity check. You never cared how successful I was, as long as the doggie-chow appeared in your bowls, morning and night. Thanks.
  8. 8. Contents ixContentsRex Black’s Acknowledgements vJamie Mitchell’s Acknowledgements viiIntroduction xix1 Test Basics 11.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .11.2 Testing in the Software Lifecycle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .21.3 Specific Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .71.4 Metrics and Measurement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .111.5 Ethics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .141.6 Sample Exam Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .162 Testing Processes 192.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .192.2 Test Process Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .202.3 Test Planning and Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .212.4 Test Analysis and Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .21 2.4.1 Non-functional Test Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . .23 2.4.2 Identifying and Documenting Test Conditions . . . . . . . . . . . . . .25 2.4.3 Test Oracles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .29 2.4.4 Standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .31 2.4.5 Static Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .34 2.4.6 Metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .352.5 Test Implementation and Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .36 2.5.1 Test Procedure Readiness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .37 2.5.2 Test Environment Readiness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .39 2.5.3 Blended Test Strategies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .41
  9. 9. x Contents 2.5.4 Starting Test Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42 2.5.5 Running a Single Test Procedure . . . . . . . . . . . . . . . . . . . . . . . . . 44 2.5.6 Logging Test Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45 2.5.7 Use of Amateur Testers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47 2.5.8 Standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48 2.5.9 Metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53 2.6 Evaluating Exit Criteria and Reporting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53 2.6.1 Test Suite Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54 2.6.2 Defect Breakdown . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56 2.6.3 Confirmation Test Failure Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57 2.6.4 System Test Exit Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58 2.6.5 Standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59 2.6.6 Evaluating Exit Criteria and Reporting Exercise . . . . . . . . . . . . 60 2.6.7 System Test Exit Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60 2.6.8 Evaluating Exit Criteria and Reporting Exercise Debrief . . . . . 63 2.7 Test Closure Activities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67 2.8 Sample Exam Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67 3 Test Management 69 3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70 3.2 Test Management Documentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70 3.3 Test Plan Documentation Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71 3.4 Test Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72 3.5 Scheduling and Test Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73 3.6 Test Progress Monitoring and Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73 3.7 Business Value of Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74 3.8 Distributed, Outsourced, and Insourced Testing . . . . . . . . . . . . . . . . . . 74 3.9 Risk-Based Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75 3.9.1 Risk Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78 3.9.2 Risk Identification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80 3.9.3 Risk Analysis or Risk Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . 82 3.9.4 Risk Mitigation or Risk Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 3.9.5 An Example of Risk Identification and Assessment Results . 87
  10. 10. Contents xi 3.9.6 Risk-Based Testing throughout the Lifecycle . . . . . . . . . . . . . . . .89 3.9.7 Risk-Aware Testing Standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .90 3.9.8 Risk-Based Testing Exercise 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .92 3.9.9 Risk-Based Testing Exercise Debrief 1 . . . . . . . . . . . . . . . . . . . . . . .93 3.9.10 Project Risk By-Products . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .95 3.9.11 Requirements Defect By-Products . . . . . . . . . . . . . . . . . . . . . . . . . .95 3.9.12 Risk-Based Testing Exercise 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .96 3.9.13 Risk-Based Testing Exercise Debrief 2 . . . . . . . . . . . . . . . . . . . . . . .96 3.9.14 Test Case Sequencing Guidelines . . . . . . . . . . . . . . . . . . . . . . . . . .973.10 Failure Mode and Effects Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .973.11 Test Management Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .983.12 Sample Exam Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .984 Test Techniques 1014.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1024.2 Specification-Based . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104 4.2.1 Equivalence Partitioning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107 4.2.1.1 Avoiding Equivalence Partitioning Errors . . . . . . . . . 110 4.2.1.2 Composing Test Cases with Equivalence Partitioning . . . . . . . . . . . . . . . . . . . . 111 4.2.1.3 Equivalence Partitioning Exercise . . . . . . . . . . . . . . . . 115 4.2.1.4 Equivalence Partitioning Exercise Debrief . . . . . . . . . 116 4.2.2 Boundary Value Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119 4.2.2.1 Examples of Equivalence Partitioning and Boundary Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120 4.2.2.2 Non-functional Boundaries . . . . . . . . . . . . . . . . . . . . . . 123 4.2.2.3 A Closer Look at Functional Boundaries . . . . . . . . . . 124 4.2.2.4 Integers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125 4.2.2.5 Floating Point Numbers . . . . . . . . . . . . . . . . . . . . . . . . . . 128 4.2.2.6 Testing Floating Point Numbers . . . . . . . . . . . . . . . . . . 130 4.2.2.7 How Many Boundaries? . . . . . . . . . . . . . . . . . . . . . . . . . . 132 4.2.2.8 Boundary Value Exercise . . . . . . . . . . . . . . . . . . . . . . . . . 134 4.2.2.9 Boundary Value Exercise Debrief . . . . . . . . . . . . . . . . . 135
  11. 11. xii Contents 4.2.3 Decision Tables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140 4.2.3.1 Collapsing Columns in the Table . . . . . . . . . . . . . . . . . 143 4.2.3.2 Combining Decision Table Testing with Other Techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . 145 4.2.3.3 Nonexclusive Rules in Decision Tables . . . . . . . . . . . . 147 4.2.3.4 Decision Table Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . 148 4.2.3.5 Decision Table Exercise Debrief . . . . . . . . . . . . . . . . . . 149 4.2.4 State-Based Testing and State Transition Diagrams . . . . . . . 154 4.2.4.1 Superstates and Substates . . . . . . . . . . . . . . . . . . . . . . . 161 4.2.4.2 State Transition Tables . . . . . . . . . . . . . . . . . . . . . . . . . . . 162 4.2.4.3 Switch Coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166 4.2.4.4 State Testing with Other Techniques . . . . . . . . . . . . . 169 4.2.4.5 State Testing Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . . . 170 4.2.4.6 State Testing Exercise Debrief . . . . . . . . . . . . . . . . . . . . 172 4.2.5 Requirements-Based Testing Exercise . . . . . . . . . . . . . . . . . . . . 175 4.2.6 Requirements-Based Testing Exercise Debrief . . . . . . . . . . . . 175 4.3 Structure-Based . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177 4.3.1 Control-Flow Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 179 4.3.1.1 Building Control-Flow Graphs . . . . . . . . . . . . . . . . . . 180 4.3.1.2 Statement Coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . 183 4.3.1.3 Decision Coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 188 4.3.1.4 Loop Coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191 4.3.1.5 Hexadecimal Converter Exercise . . . . . . . . . . . . . . . . 195 4.3.1.6 Hexadecimal Converter Exercise Debrief . . . . . . . . 197 4.3.1.7 Condition Coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . . 197 4.3.1.8 Decision/Condition Coverage . . . . . . . . . . . . . . . . . . . 200 4.3.1.9 Modified Condition/Decision Coverage (MC/DC) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201 4.3.1.10 Multiple Condition Coverage . . . . . . . . . . . . . . . . . . . 205 4.3.1.11 Control-Flow Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . 209 4.3.1.12 Control-Flow Exercise Debrief . . . . . . . . . . . . . . . . . . 210 4.3.2 Path Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 214 4.3.2.1 LCSAJ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
  12. 12. Contents xiii 4.3.2.2 Basis Path/Cyclomatic Complexity Testing . . . . . . . . 220 4.3.2.3 Cyclomatic Complexity Exercise . . . . . . . . . . . . . . . . . 225 4.3.2.4 Cyclomatic Complexity Exercise Debrief . . . . . . . . . . 225 4.3.3 A Final Word on Structural Testing . . . . . . . . . . . . . . . . . . . . . . . 227 4.3.4 Structure-Based Testing Exercise . . . . . . . . . . . . . . . . . . . . . . . . . 228 4.3.5 Structure-Based Testing Exercise Debrief . . . . . . . . . . . . . . . . 2294.4 Defect- and Experience-Based . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236 4.4.1 Defect Taxonomies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237 4.4.2 Error Guessing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 242 4.4.3 Checklist Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 243 4.4.4 Exploratory Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 245 4.4.4.1 Test Charters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 247 4.4.4.2 Exploratory Testing Exercise . . . . . . . . . . . . . . . . . . . . . 249 4.4.4.3 Exploratory Testing Exercise Debrief . . . . . . . . . . . . . . 249 4.4.5 Software Attacks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252 4.4.5.1 An Example of Effective Attacks . . . . . . . . . . . . . . . . . . 256 4.4.5.2 Other Attacks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257 4.4.5.3 Software Attack Exercise . . . . . . . . . . . . . . . . . . . . . . . . . 259 4.4.5.4 Software Attack Exercise Debrief . . . . . . . . . . . . . . . . . 259 4.4.6 Specification-, Defect-, and Experience-Based Exercise . . . . 260 4.4.7 Specification-, Defect-, and Experience-Based Exercise Debrief . . . . . . . . . . . . . . . . . . . 260 4.4.8 Common Themes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2614.5 Static Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264 4.5.1 Complexity Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265 4.5.2 Code Parsing Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268 4.5.3 Standards and Guidelines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 270 4.5.4 Data-Flow Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 273 4.5.5 Set-Use Pairs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275 4.5.6 Set-Use Pair Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 278 4.5.7 Data-Flow Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 284 4.5.8 Data-Flow Exercise Debrief . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 284 4.5.9 Data-Flow Strategies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 285
  13. 13. xiv Contents 4.5.10 Static Analysis for Integration Testing . . . . . . . . . . . . . . . . . . . . 288 4.5.11 Call-Graph Based Integration Testing . . . . . . . . . . . . . . . . . . . . . 290 4.5.12 McCabe Design Predicate Approach to Integration Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 292 4.5.13 Hex Converter Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 296 4.5.14 McCabe Design Predicate Exercise . . . . . . . . . . . . . . . . . . . . . . . 301 4.5.15 McCabe Design Predicate Exercise Debrief . . . . . . . . . . . . . . . 301 4.6 Dynamic Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 302 4.6.1 Memory Leak Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305 4.6.2 Wild Pointer Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 307 4.6.3 API Misuse Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308 4.7 Sample Exam Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 309 5 Tests of Software Characteristics 323 5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 323 5.2 Quality Attributes for Domain Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . 325 5.2.1 Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326 5.2.2 Suitability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329 5.2.3 Interoperability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 330 5.2.4 Usability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 331 5.2.5 Usability Test Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 335 5.2.6 Usability Test Exercise Debrief . . . . . . . . . . . . . . . . . . . . . . . . . . . 335 5.3 Quality Attributes for Technical Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 337 5.3.1 Technical Security . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 338 5.3.2 Security Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 339 5.3.3 Timely Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 344 5.3.4 Reliability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349 5.3.5 Efficiency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 355 5.3.6 Multiple Flavors of Efficiency Testing . . . . . . . . . . . . . . . . . . . . . 357 5.3.7 Modeling the System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 361 5.3.8 Efficiency Measurements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 366 5.3.9 Examples of Efficiency Bugs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 368 5.3.10 Exercise: Security, Reliability, and Efficiency . . . . . . . . . . . . . . . 372
  14. 14. Contents xv 5.3.11 Exercise: Security, Reliability, and Efficiency Debrief . . . . . . . 372 5.3.12 Maintainability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 375 5.3.13 Subcharacteristics of Maintainability . . . . . . . . . . . . . . . . . . . . . 379 5.3.14 Portability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 386 5.3.15 Maintainability and Portability Exercise . . . . . . . . . . . . . . . . . . 393 5.3.16 Maintainability and Portability Exercise Debrief . . . . . . . . . . . 3935.4 Sample Exam Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3966 Reviews 3996.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3996.2 The Principles of Reviews . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4036.3 Types of Reviews . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4076.4 Introducing Reviews . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4126.5 Success Factors for Reviews . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 413 6.5.1 Deutsch’s Design Review Checklist . . . . . . . . . . . . . . . . . . . . . . . 417 6.5.2 Marick’s Code Review Checklist . . . . . . . . . . . . . . . . . . . . . . . . . . 419 6.5.3 The OpenLaszlo Code Review Checklist . . . . . . . . . . . . . . . . . . 4226.6 Code Review Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4236.7 Code Review Exercise Debrief . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4246.8 Deutsch Checklist Review Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4296.9 Deutsch Checklist Review Exercise Debrief . . . . . . . . . . . . . . . . . . . . . . . 4306.10 Sample Exam Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4327 Incident Management 4357.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4357.2 When Can a Defect Be Detected? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4367.3 Defect Lifecycle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4377.4 Defect Fields . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4457.5 Metrics and Incident Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4497.6 Communicating Incidents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4507.7 Incident Management Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4517.8 Incident Management Exercise Debrief . . . . . . . . . . . . . . . . . . . . . . . . . . 4527.9 Sample Exam Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 454
8 Standards and Test Process Improvement

9 Test Tools and Automation
9.1 Introduction
9.2 Test Tool Concepts
   9.2.1 The Business Case for Automation
   9.2.2 General Test Automation Strategies
   9.2.3 An Integrated Test System Example
9.3 Test Tool Categories
   9.3.1 Test Management Tools
   9.3.2 Test Execution Tools
   9.3.3 Debugging, Troubleshooting, Fault Seeding, and Injection Tools
   9.3.4 Static and Dynamic Analysis Tools
   9.3.5 Performance Testing Tools
   9.3.6 Monitoring Tools
   9.3.7 Web Testing Tools
   9.3.8 Simulators and Emulators
9.4 Keyword-Driven Test Automation
   9.4.1 Capture/Replay Exercise
   9.4.2 Capture/Replay Exercise Debrief
   9.4.3 Evolving from Capture/Replay
   9.4.4 The Simple Framework Architecture
   9.4.5 Data-Driven Architecture
   9.4.6 Keyword-Driven Architecture
   9.4.7 Keyword Exercise
   9.4.8 Keyword Exercise Debrief
9.5 Performance Testing
   9.5.1 Performance Testing Exercise
   9.5.2 Performance Testing Exercise Debrief
9.6 Sample Exam Questions
10 People Skills and Team Composition
10.1 Introduction
10.2 Individual Skills
10.3 Test Team Dynamics
10.4 Fitting Testing within an Organization
10.5 Motivation
10.6 Communication
10.7 Sample Exam Questions

11 Preparing for the Exam
11.1 Learning Objectives
   11.1.1 Level 1: Remember (K1)
   11.1.2 Level 2: Understand (K2)
   11.1.3 Level 3: Apply (K3)
   11.1.4 Level 4: Analyze (K4)
   11.1.5 Where Did These Levels of Learning Objectives Come From?
11.2 ISTQB Advanced Exams
   11.2.1 Scenario-Based Questions
   11.2.2 On the Evolution of the Exams

Appendix A – Bibliography
   Advanced Syllabus Referenced Standards
   Advanced Syllabus Referenced Books
   Other Referenced Books
   Other References

Appendix B – HELLOCARMS: The Next Generation of Home Equity Lending
   System Requirements Document
   I Table of Contents
   II Versioning
   III Glossary
   000 Introduction
   001 Informal Use Case
   003 Scope
   004 System Business Benefits
   010 Functional System Requirements
   020 Reliability System Requirements
   030 Usability System Requirements
   040 Efficiency System Requirements
   050 Maintainability System Requirements
   060 Portability System Requirements
   A Acknowledgement

Appendix C – Answers to Sample Questions

Index
Introduction

This is a book on advanced software testing for technical test analysts. By that we mean that we address topics that a technical practitioner who has chosen software testing as a career should know. We focus on those skills and techniques related to test analysis, test design, test tools and automation, test execution, and test results evaluation. We take these topics in a more technical direction than in the earlier volume for test analysts by including details of test design using structural techniques and details about the use of dynamic analysis to monitor internal status. We assume that you know the basic concepts of test engineering, test design, test tools, testing in the software development lifecycle, and test management. You are ready to mature your level of understanding of these concepts and to apply these advanced concepts to your daily work as a test professional.

This book follows the International Software Testing Qualifications Board's (ISTQB) Advanced Level Syllabus, with a focus on the material and learning objectives for the advanced technical test analyst. As such, this book can help you prepare for the ISTQB Advanced Level Technical Test Analyst exam. You can use this book to self-study for this exam or as part of an e-learning or instructor-led course on the topics covered in those exams. If you are taking an ISTQB-accredited Advanced Level Technical Test Analyst training course, this book is an ideal companion text for that course.

However, even if you are not interested in the ISTQB exams, you will find this book useful to prepare yourself for advanced work in software testing. If you are a test manager, test director, test analyst, technical test analyst, automated test engineer, manual test engineer, programmer, or in any other field where a sophisticated understanding of software testing is needed—especially an understanding of the particularly technical aspects of testing such as white-box testing and test automation—then this book is for you.
This book focuses on technical test analysis. It consists of 11 chapters, addressing the following material:

1. Basic aspects of software testing
2. Testing processes
3. Test management
4. Test techniques
5. Testing of software characteristics
6. Reviews
7. Incident (defect) management
8. Standards and test process improvement
9. Test tools and automation
10. People skills (team composition)
11. Preparing for the exam

Since the structure follows the structure of the ISTQB Advanced syllabus, some of the chapters address the material in great detail because they are central to the technical test analyst role. Some of the chapters address the material in less detail because the technical test analyst need only be familiar with it. For example, we cover test techniques in detail in this book—including highly technical techniques like structure-based testing, dynamic analysis, and test automation—because these are central to what a technical test analyst does, while we spend less time on test management and no time at all on test process improvement.

If you have already read Advanced Software Testing: Volume 1, you will notice that there is overlap in some chapters in the book, especially chapters 1, 2, 6, 7, and 10. (There is also some overlap in chapter 4, in the sections on black-box and experience-based testing.) This overlap is inherent in the structure of the ISTQB Advanced syllabus, where both learning objectives and content are common across the two analysis modules in some areas. We spent some time grappling with how to handle this commonality and decided to make this book completely free-standing. That meant that we had to include common material for those who have not read volume 1. If you have read volume 1, you may choose to skip chapters 1, 2, 6, 7, and 10, though people using this book to prepare for the Technical Test Analyst exam should read those chapters for review.

If you have also read Advanced Software Testing: Volume 2, which is for test managers, you'll find parallel chapters that address the material in detail but with different emphasis.
For example, technical test analysts need to know quite a bit about incident management. Technical test analysts spend a lot of time creating incident reports, and you need to know how to do that well. Test managers also need to know a lot about incident management, but they focus on how to keep incidents moving through their reporting and resolution lifecycle and how to gather metrics from such reports.

What should a technical test analyst be able to do? Or, to ask the question another way, what should you have learned to do—or learned to do better—by the time you finish this book?

■ Structure the tasks defined in the test strategy in terms of technical requirements (including the coverage of technically related quality risks)
■ Analyze the internal structure of the system in sufficient detail to meet the expected quality level
■ Evaluate the system in terms of technical quality attributes such as performance, security, etc.
■ Prepare and execute adequate activities, and report on their progress
■ Conduct technical testing activities
■ Provide the necessary evidence to support evaluations
■ Implement the necessary tools and techniques to achieve the defined goals

In this book, we focus on these main concepts. We suggest that you keep these high-level objectives in mind as we proceed through the material in each of the following chapters.

In writing this book, we've kept foremost in our minds the question of how to make this material useful to you. If you are using this book to prepare for an ISTQB Advanced Level Technical Test Analyst exam, then we recommend that you read chapter 11 first, then read the other 10 chapters in order. If you are using this book to expand your overall understanding of testing to an advanced and highly technical level but do not intend to take the ISTQB Advanced Level Technical Test Analyst exam, then we recommend that you read chapters 1 through 10 only. If you are using this book as a reference, then feel free to read only those chapters that are of specific interest to you.

Each of the first 10 chapters is divided into sections. For the most part, we have followed the organization of the ISTQB Advanced syllabus to the point of section divisions, but subsections and sub-subsection divisions in the syllabus might not appear.
You'll also notice that each section starts with a text box describing the learning objectives for that section. If you are curious about how to interpret those K2, K3, and K4 tags in front of each learning objective, and how learning objectives work within the ISTQB syllabus, read chapter 11.

Software testing is in many ways similar to playing the piano, cooking a meal, or driving a car. How so? In each case, you can read books about these activities, but until you have practiced, you know very little about how to do it. So we've included practical, real-world exercises for the key concepts. We encourage you to practice these concepts with the exercises in the book. Then, make sure you take these concepts and apply them on your projects. You can become an advanced testing professional only by applying advanced test techniques to actual software testing.

ISTQB Copyright

This book is based on the ISTQB Advanced Syllabus version 2007. It also references the ISTQB Foundation Syllabus version 2011. It uses terminology definitions from the ISTQB Glossary version 2.1. These three documents are copyrighted by the ISTQB and used by permission.
1 Test Basics

"Read the directions and directly you will be directed in the right direction."

A doorknob in Lewis Carroll's surreal fantasy, Alice in Wonderland.

The first chapter of the Advanced syllabus is concerned with contextual and background material that influences the remaining chapters. There are five sections.

1. Introduction
2. Testing in the Software Lifecycle
3. Specific Systems
4. Metrics and Measurement
5. Ethics

Let's look at each section and how it relates to technical test analysis.

1.1 Introduction

Learning objectives
Recall of content only

This chapter, as the name implies, introduces some basic aspects of software testing. These central testing themes have general relevance for testing professionals. There are four major areas:

■ Lifecycles and their effects on testing
■ Special types of systems and their effects on testing
■ Metrics and measures for testing and quality
■ Ethical issues
ISTQB Glossary

software lifecycle: The period of time that begins when a software product is conceived and ends when the software is no longer available for use. The software lifecycle typically includes a concept phase, requirements phase, design phase, implementation phase, test phase, installation and checkout phase, operation and maintenance phase, and sometimes, retirement phase. Note that these phases may overlap or be performed iteratively.

Many of these concepts are expanded upon in later chapters. This material expands on ideas introduced in the Foundation syllabus.

1.2 Testing in the Software Lifecycle

Learning objectives
Recall of content only

Chapter 2 in the Foundation syllabus discusses integrating testing into the software lifecycle. As with the Foundation syllabus, in the Advanced syllabus, you should understand that testing must be integrated into the software lifecycle to succeed. This is true whether the particular lifecycle chosen is sequential, incremental, iterative, or spiral.

Proper alignment between the testing process and other processes in the lifecycle is critical for success. This is especially true at key interfaces and handoffs between testing and lifecycle activities such as these:

■ Requirements engineering and management
■ Project management
■ Configuration and change management
■ Software development and maintenance
■ Technical support
■ Technical documentation
Let's look at two examples of alignment.

In a sequential lifecycle model, a key assumption is that the project team will define the requirements early in the project and then manage the (hopefully limited) changes to those requirements during the rest of the project. In such a situation, if the team follows a formal requirements process, an independent test team in charge of the system test level can follow an analytical requirements-based test strategy.

Using a requirements-based strategy in a sequential model, the test team would start—early in the project—planning and designing tests following an analysis of the requirements specification to identify test conditions. This planning, analysis, and design work might identify defects in the requirements, making testing a preventive activity. Failure detection would start later in the lifecycle, once system test execution began.

However, suppose the project follows an incremental lifecycle model, adhering to one of the agile methodologies like Scrum. The test team won't receive a complete set of requirements early in the project, if ever. Instead, the test team will receive requirements at the beginning of each sprint, which typically lasts anywhere from two to four weeks.

Rather than analyzing extensively documented requirements at the outset of the project, the test team can instead identify and prioritize key quality risk areas associated with the content of each sprint; i.e., they can follow an analytical risk-based test strategy. Specific test designs and implementation will occur immediately before test execution, potentially reducing the preventive role of testing. Failure detection starts very early in the project, at the end of the first sprint, and continues in repetitive, short cycles throughout the project. In such a case, testing activities in the fundamental testing process overlap and are concurrent with each other as well as with major activities in the software lifecycle.

No matter what the lifecycle—and indeed, especially with the more fast-paced agile lifecycles—good change management and configuration management are critical for testing. A lack of proper change management results in an inability of the test team to keep up with what the system is and what it should do. As was discussed in the Foundation syllabus, a lack of proper configuration management may lead to loss of artifact changes, an inability to say what was tested at what point in time, and a severe lack of clarity around the meaning of the test results.
ISTQB Glossary

system of systems: Multiple heterogeneous, distributed systems that are embedded in networks at multiple levels and in multiple interconnected domains, addressing large-scale interdisciplinary common problems and purposes, usually without a common management structure.

The Foundation syllabus cited four typical test levels:

■ Unit or component
■ Integration
■ System
■ Acceptance

The Foundation syllabus mentioned some reasons for variation in these levels, especially with integration and acceptance. Integration testing can mean component integration testing—integrating a set of components to form a system, testing the builds throughout that process. Or it can mean system integration testing—integrating a set of systems to form a system of systems, testing the system of systems as it emerges from the conglomeration of systems. As discussed in the Foundation syllabus, acceptance test variations include user acceptance tests and regulatory acceptance tests.

Along with these four levels and their variants, at the Advanced level you need to keep in mind additional test levels that you might need for your projects. These could include the following:

■ Hardware-software integration testing
■ Feature interaction testing
■ Customer product integration testing

You should expect to find most if not all of the following for each level (a minimal sketch of capturing these attributes as structured data follows the list):

■ Clearly defined test goals and scope
■ Traceability to the test basis (if available)
■ Entry and exit criteria, as appropriate both for the level and for the system lifecycle
■ Test deliverables, including results reporting
■ Test techniques that will be applied, as appropriate for the level, for the team, and for the risks inherent in the system
■ Measurements and metrics
■ Test tools, where applicable and as appropriate for the level
■ And, if applicable, compliance with organizational or other standards
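To make the checklist above concrete, here is one way the attributes of a test level could be captured as structured data. This is purely an illustrative sketch; the field names and example values are invented for this discussion and are not prescribed by the ISTQB syllabus.

```python
# Illustrative only: field names and example criteria are invented,
# not prescribed by the ISTQB syllabus.
from dataclasses import dataclass, field


@dataclass
class TestLevel:
    name: str
    goals: list[str]
    test_basis: list[str]          # traceability targets, if available
    entry_criteria: list[str]
    exit_criteria: list[str]
    deliverables: list[str]
    techniques: list[str]
    metrics: list[str]
    tools: list[str] = field(default_factory=list)
    standards: list[str] = field(default_factory=list)


system_integration = TestLevel(
    name="System integration test",
    goals=["Verify end-to-end data flow between the integrated systems"],
    test_basis=["Interface specifications", "Quality risk analysis"],
    entry_criteria=["Each system passed its own system test level"],
    exit_criteria=["All critical interface risks covered, no open blockers"],
    deliverables=["Test log", "Results report with coverage metrics"],
    techniques=["State-based testing", "Exploratory interface testing"],
    metrics=["Interface coverage", "Defect trend"],
)
```

Writing the definition down in one place, whatever the format, is what makes gaps and overlaps between levels visible.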
When RBCS associates perform assessments of test teams, we often find organizations that use test levels but that perform them in isolation. Such isolation leads to inefficiencies and confusion. While these topics are discussed more in Advanced Software Testing: Volume 2, test analysts should keep in mind that using documents like test policies and frequent contact between test-related staff can coordinate the test levels to reduce gaps, overlap, and confusion about results.

Let's take a closer look at this concept of alignment. We'll use the V-model shown in figure 1-1 as an example. We'll further assume that we are talking about the system test level.

[Figure 1–1, V-model: the left side of the V runs from Concept through Capture Requirements and Design System down to Implement System; a Develop Tests activity at each level feeds the right side of the V, which runs up through Component Test, System Integration Test, and Acceptance Tests.]
In the V-model, with a well-aligned test process, test planning occurs concurrently with project planning. In other words, the moment that the testing team becomes involved is at the very start of the project. Once the test plan is approved, test control begins. Test control continues through to test closure. Analysis, design, implementation, execution, evaluation of exit criteria, and test results reporting are carried out according to the plan. Deviations from the plan are managed.

Test analysis starts immediately after or even concurrently with test planning. Test analysis and test design happen concurrently with requirements, high-level design, and low-level design. Test implementation, including test environment implementation, starts during system design and completes just before test execution begins.

Test execution begins when the test entry criteria are met. More realistically, test execution starts when most entry criteria are met and any outstanding entry criteria are waived. In V-model theory, the entry criteria would include successful completion of both component test and integration test levels. Test execution continues until the test exit criteria are met, though again some of these may often be waived. Evaluation of test exit criteria and reporting of test results occur throughout test execution. Test closure activities occur after test execution is declared complete.

This kind of precise alignment of test activities with each other and with the rest of the system lifecycle will not happen simply by accident. Nor can you expect to instill this alignment continuously throughout the process, without any forethought. Rather, for each test level, no matter what the selected software lifecycle and test process, the test manager must perform this alignment. Not only must this happen during test and project planning, but test control includes acting to ensure ongoing alignment.

No matter what test process and software lifecycle are chosen, each project has its own quirks. This is especially true for complex projects such as the systems of systems projects common in the military and among RBCS's larger clients. In such a case, the test manager must plan not only to align test processes, but also to modify them. Off-the-rack process models, whether for testing alone or for the entire software lifecycle, don't fit such complex projects well.
1.3 Specific Systems

Learning objectives
Recall of content only

In this section, we are going to talk about how testing affects—and is affected by—the need to test two particular types of systems. The first type is systems of systems. The second type is safety-critical systems.

Systems of systems are independent systems tied together to serve a common purpose. Since they are independent and tied together, they often lack a single, coherent user or operator interface, a unified data model, compatible external interfaces, and so forth.

Systems of systems projects include the following characteristics and risks:

■ The integration of commercial off-the-shelf (COTS) software along with some amount of custom development, often taking place over a long period.
■ Significant technical, lifecycle, and organizational complexity and heterogeneity. This organizational and lifecycle complexity can include issues of confidentiality, company secrets, and regulations.
■ Different development lifecycles and other processes among disparate teams, especially—as is frequently the case—when insourcing, outsourcing, and offshoring are involved.
■ Serious potential reliability issues due to intersystem coupling, where one inherently weaker system creates ripple-effect failures across the entire system of systems.
■ System integration testing, including interoperability testing, is essential. Well-defined interfaces for testing are needed.

At the risk of restating the obvious, systems of systems projects are more complex than single-system projects. The complexity increase applies organizationally, technically, processwise, and teamwise. Good project management, formal development lifecycles and processes, configuration management, and quality assurance become more important as size and complexity increase.

Let's focus on the lifecycle implications for a moment.
As mentioned earlier, with systems of systems projects, we are typically going to have multiple levels of integration. First, we will have component integration for each system, and then we'll have system integration as we build the system of systems.

We will also typically have multiple version management and version control systems and processes, unless all the systems happen to be built by the same (presumably large) organization and that organization follows the same approach throughout its software development team. This kind of unified approach to such systems and processes is not something that we commonly see during assessments of large companies, by the way.

The duration of projects tends to be long. We have seen them planned for as long as five to seven years. A system of systems project with five or six systems might be considered relatively short and relatively small if it lasted "only" a year and involved "only" 40 or 50 people. Across this project, there are multiple test levels, usually owned by different parties.

Because of the size and complexity of the project, it's easy for handoffs and transfers of responsibility to break down. So, we need formal information transfer among project members (especially at milestones), transfers of responsibility within the team, and handoffs. (A handoff, for those of you unfamiliar with the term, is a situation in which some work product is delivered from one group to another and the receiving group must carry out some essential set of activities with that work product.)

Even when we're integrating purely off-the-shelf systems, these systems are evolving. That's all the more likely to be true with custom systems. So we have the management challenge of coordinating development of the individual systems and the test analyst challenge of proper regression testing at the system of systems level when things change.

Especially with off-the-shelf systems, maintenance testing can be triggered—sometimes without much warning—by external entities and events such as obsolescence, bankruptcy, or upgrade of an individual system.

If you think of the fundamental test process in a system of systems project, the progress of levels is not two-dimensional. Instead, imagine a sort of pyramidal structure, as shown in figure 1-2.
[Figure 1–2, Fundamental test process in a system of systems project: a pyramid in which each constituent system (System A, System B, and so on) has its own component test, component integration test, and system test levels at the base, topped by a single set of system integration test, systems test, and user acceptance test levels for the system of systems.]

At the base, we have component testing. A separate component test level exists for each system.

Moving up the pyramid, you have component integration testing. A separate component integration test level exists for each system.

Next, we have system testing. A separate system test level exists for each system. Note that, for each of these test levels, we have separate organizational ownership if the systems come from different vendors. We also probably have separate team ownership since multiple groups often handle component, integration, and system test.

Continuing to move up the pyramid, we come to system integration testing. Now, finally, we are talking about a single test level across all systems. Next above that is systems testing, focusing on end-to-end tests that span all the systems. Finally, we have user acceptance testing. For each of these test levels, while we have single organizational ownership, we probably have separate team ownership.

Let's move on to safety-critical systems. Simply put, safety-critical systems are those systems upon which lives depend. Failure of such a system—or even temporary performance or reliability degradation or undesirable side effects as support actions are carried out—can injure or kill people or, in the case of military systems, fail to injure or kill people at a critical juncture of a battle.
ISTQB Glossary

safety-critical system: A system whose failure or malfunction may result in death or serious injury to people, or loss or severe damage to equipment, or environmental harm.

Safety-critical systems, like systems of systems, have certain associated characteristics and risks:

■ Since defects can cause death, and deaths can cause civil and criminal penalties, proof of adequate testing can be and often is used to reduce liability.
■ For obvious reasons, various regulations and standards often apply to safety-critical systems. The regulations and standards can constrain the process, the organizational structure, and the product. Unlike the usual constraints on a project, though, these are constructed specifically to increase the level of quality rather than to enable trade-offs to enhance schedule, budget, or feature outcomes at the expense of quality. Overall, there is a focus on quality as a very important project priority.
■ There is typically a rigorous approach to both development and testing.

Throughout the lifecycle, traceability extends all the way from regulatory requirements to test results. This provides a means of demonstrating compliance. This requires extensive, detailed documentation but provides high levels of auditability, even by non-test experts. Audits are common if regulations are imposed. Demonstrating compliance can involve tracing from the regulatory requirement through development to the test results. An outside party typically performs the audits. Therefore, establishing traceability up front and maintaining it throughout the lifecycle is important, both from a people and a process point of view.

During the lifecycle—often as early as design—the project team uses safety analysis techniques to identify potential problems. As with quality risk analysis, safety analysis will identify risk items that require testing. Single points of failure are often resolved through system redundancy, and the ability of that redundancy to alleviate the single point of failure must be tested.
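Because demonstrating compliance means tracing from regulatory requirements through development to test results, as described above, even a very simple traceability structure can support audit preparation. Here is a minimal sketch; the requirement and test identifiers are invented for illustration.

```python
# Minimal traceability sketch; requirement and test IDs are invented.
# Maps each regulatory requirement to its tests and their latest results.
traceability = {
    "REG-001": {"tests": ["ST-014", "ST-015"],
                "results": {"ST-014": "pass", "ST-015": "pass"}},
    "REG-002": {"tests": ["ST-022"], "results": {"ST-022": "fail"}},
    "REG-003": {"tests": [], "results": {}},  # gap: no tests traced yet
}


def compliance_gaps(matrix):
    """Return requirements that are untested or have non-passing results."""
    gaps = []
    for req, entry in matrix.items():
        if not entry["tests"]:
            gaps.append((req, "no tests traced"))
        elif any(result != "pass" for result in entry["results"].values()):
            gaps.append((req, "has non-passing results"))
    return gaps


print(compliance_gaps(traceability))
# [('REG-002', 'has non-passing results'), ('REG-003', 'no tests traced')]
```

In a real regulated project this matrix would live in a requirements management or test management tool, but the audit question it answers is the same: which requirements cannot yet be shown compliant?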
In some cases, safety-critical systems are complex systems or even systems of systems. In other cases, non-safety-critical components or systems are integrated into safety-critical systems or systems of systems. For example, networking or communication equipment is not inherently a safety-critical system, but if integrated into an emergency dispatch or military system, it becomes part of a safety-critical system.

Formal quality risk management is essential in these situations. Fortunately, a number of such techniques exist, such as failure mode and effect analysis; failure mode, effect, and criticality analysis; hazard analysis; and software common cause failure analysis. We'll look at a less formal approach to quality risk analysis and management in chapter 3.
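As a small taste of the more formal techniques just mentioned, FMEA-style analyses commonly rank failure modes by a risk priority number (RPN), the product of severity, occurrence, and detection ratings. The sketch below is illustrative only; the failure modes and the 1 to 10 ratings are invented.

```python
# FMEA-style risk priority numbers: RPN = severity * occurrence * detection.
# Failure modes and their 1-10 ratings below are invented for illustration.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("Dispatch message lost under network congestion", 9, 4, 6),
    ("GPS position stale after radio handover", 7, 5, 3),
    ("Console UI freezes during shift change", 4, 6, 2),
]

ranked = sorted(
    ((desc, s * o * d) for desc, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for desc, rpn in ranked:
    print(f"RPN {rpn:3d}  {desc}")
# The highest-RPN failure modes get tested first and most deeply.
```

The arithmetic is trivial; the value lies in forcing the team to rate each failure mode explicitly and to let those ratings drive test priorities.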
1.4 Metrics and Measurement

Learning objectives
Recall of content only

ISTQB Glossary

metric: A measurement scale and the method used for measurement.
measurement scale: A scale that constrains the type of data analysis that can be performed on it.
measurement: The process of assigning a number or category to an entity to describe an attribute of that entity.
measure: The number or category assigned to an attribute of an entity by making a measurement.

Throughout this book, we use metrics and measurement to establish expectations and guide testing by those expectations. You can and should apply metrics and measurements throughout the software development lifecycle because well-established metrics and measures, aligned with project goals and objectives, will enable technical test analysts to track and report test and quality results to management in a consistent and coherent way.

A lack of metrics and measurements leads to purely subjective assessments of quality and testing. This results in disputes over the meaning of test results toward the end of the lifecycle. It also results in a lack of clearly perceived and communicated value, effectiveness, and efficiency for testing.

Not only must we have metrics and measurements, we also need goals. What is a "good" result for a given metric? An acceptable result? An unacceptable result? Without defined goals, successful testing is usually impossible. In fact, when we perform assessments for our clients, we more often than not find ill-defined metrics of test team effectiveness and efficiency with no goals and thus bad and unrealistic expectations (which of course aren't met). We can establish realistic goals for any given metric by establishing a baseline measure for that metric and checking current capability, comparing that baseline against industry averages, and, if appropriate, setting realistic targets for improvement to meet or exceed the industry average.

There's just about no end to what can be subjected to a metric and tracked through measurement. Consider the following:

■ Planned schedule and coverage
■ Requirements and their schedule, resource, and task implications for testing
■ Workload and resource usage
■ Milestones and scope of testing
■ Planned and actual costs
■ Risks, both quality and project risks
■ Defects, including total found, total fixed, current backlog, average closure periods, and configuration, subsystem, priority, or severity distribution

During test planning, we establish expectations in the form of goals for the various metrics. As part of test control, we can measure actual outcomes and trends against these goals. As part of test reporting, we can consistently explain to management various important aspects of the process, product, and project, using objective, agreed-upon metrics with realistic, achievable goals.

When thinking about a testing metrics and measurement program, there are three main areas to consider: definition, tracking, and reporting. Let's start with definition. In a successful testing metrics program, you define a useful, pertinent, and concise set of quality and test metrics for a project. You avoid too large a set of metrics because this will prove difficult and perhaps expensive to measure while often confusing rather than enlightening the viewers and stakeholders.
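Before moving on, here is a minimal sketch of how a few of the defect-related measures in the list above could be computed. The defect records are invented for illustration.

```python
# Minimal sketch of defect metrics from the list above; data is invented.
from datetime import date

defects = [
    {"id": 1, "opened": date(2012, 3, 1), "closed": date(2012, 3, 8)},
    {"id": 2, "opened": date(2012, 3, 2), "closed": None},   # still open
    {"id": 3, "opened": date(2012, 3, 5), "closed": date(2012, 3, 19)},
]

total_found = len(defects)
total_fixed = sum(1 for d in defects if d["closed"] is not None)
backlog = total_found - total_fixed
closure_days = [(d["closed"] - d["opened"]).days for d in defects if d["closed"]]
avg_closure = sum(closure_days) / len(closure_days)

print(f"found={total_found} fixed={total_fixed} backlog={backlog} "
      f"avg closure={avg_closure:.1f} days")
# found=3 fixed=2 backlog=1 avg closure=10.5 days
```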
You also want to ensure uniform, agreed-upon interpretations of these metrics to minimize disputes and divergent opinions about the meaning of certain measures of outcomes, analyses, and trends. There's no point in having a metrics program if everyone has an utterly divergent opinion about what particular measures mean.

Finally, define metrics in terms of objectives and goals for a process or task, for components or systems, and for individuals or teams. Victor Basili's well-known Goal Question Metric technique is one way to evolve meaningful metrics. (We prefer to use the word objective where Basili uses goal.) Using this technique, we proceed from the objectives of the effort—in this case, testing—to the questions we'd have to answer to know if we were achieving those objectives to, ultimately, the specific metrics.

For example, one typical objective of testing is to build confidence. One natural question that arises in this regard is, How much of the system has been tested? Metrics for coverage include percentage of requirements covered by tests, percentage of branches and statements covered by tests, percentage of interfaces covered by tests, percentage of risks covered by tests, and so forth.
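Continuing the Goal Question Metric example (objective: build confidence; question: how much of the system has been tested?), here is a sketch computing two of the coverage metrics just mentioned. The requirement and risk identifiers are invented.

```python
# Coverage metrics from the GQM example; requirement/risk IDs are invented.
covered_reqs = {"R-01", "R-02", "R-05"}
all_reqs = {"R-01", "R-02", "R-03", "R-04", "R-05"}

covered_risks = {"QR-01", "QR-02"}
all_risks = {"QR-01", "QR-02", "QR-03"}


def pct(covered, total):
    """Percentage of items in total that appear in covered."""
    return 100.0 * len(covered & total) / len(total)


print(f"requirements coverage: {pct(covered_reqs, all_reqs):.0f}%")  # 60%
print(f"risk coverage: {pct(covered_risks, all_risks):.0f}%")        # 67%
```

In practice a test management tool would supply the covered sets from traceability data, but the metric itself reduces to exactly this ratio.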
Let's move on to tracking. Since tracking is a recurring activity in a metrics program, the use of automated tool support can reduce the time required to capture, track, analyze, report, and measure the metrics.

Be sure to apply objective and subjective analyses for specific metrics over time, especially when trends emerge that could allow for multiple interpretations of meaning. Try to avoid jumping to conclusions or delivering metrics that encourage others to do so. Be aware of and manage the tendency for people's interests to affect the interpretation they place on a particular metric or measure. Everyone likes to think they are objective—and, of course, right as well as fair!—but usually people's interests affect their conclusions.

Finally, let's look at reporting. Most importantly, reporting of metrics and measures should enlighten management and other stakeholders, not confuse or misdirect them. In part, this is achieved through smart definition of metrics and careful tracking, but it is possible to take perfectly clear and meaningful metrics and confuse people with them through bad presentation. Edward Tufte's series of books, starting with The Graphical Display of Quantitative Information, is a treasure trove of ideas about how to develop good charts and graphs for reporting purposes.1

Good testing reports based on metrics should be easily understood, not overly complex and certainly not ambiguous. The reports should draw the viewer's attention toward what matters most, not toward trivialities. In that way, good testing reports based on metrics and measures will help management guide the project to success.

Not all types of graphical displays of metrics are equal—or equally useful. A snapshot of data at a moment in time, as shown in a table, might be the right way to present some information, such as the coverage planned and achieved against certain critical quality risk areas. A graph of a trend over time might be a useful way to present other information, such as the total number of defects reported and the total number of defects resolved since the start of testing. An analysis of causes or relationships might be a useful way to present still other information, such as a scatter plot showing the correlation (or lack thereof) between years of tester experience and percentage of bug reports rejected.

1.5 Ethics

Learning objectives
Recall of content only

ISTQB Glossary

ethics: No definition provided in the ISTQB Glossary.

Many professions have ethical standards. In the context of professionalism, ethics are "rules of conduct recognized in respect to a particular class of human actions or a particular group, culture, etc."2

1. The three books of Tufte's that Rex has read and can strongly recommend on this topic are The Graphical Display of Quantitative Information, Visual Explanations, and Envisioning Information (all published by Graphics Press, Cheshire, CT).
2. Definition from dictionary.com.
Since, as a technical test analyst, you'll often have access to confidential and privileged information, ethical guidelines can help you to use that information appropriately. In addition, you should use ethical guidelines to choose the best possible behaviors and outcomes for a given situation, given your constraints. The phrase "best possible" means for everyone, not just you.

Here is an example of ethics in action. One of the authors, Rex Black, is president of three related international software testing consultancies, RBCS, RBCS AU/NZ, and Software TestWorx. He also serves on the ISTQB and ASTQB boards of directors. As such, he might have and does have insight into the direction of the ISTQB program that RBCS' competitors in the software testing consultancy business don't have.

In some cases, such as helping to develop syllabi, Rex has to make those business interests clear to people, but he is allowed to help do so. Rex helped write both the Foundation and Advanced syllabi. In other cases, such as developing exam questions, Rex agreed, along with his colleagues on the ASTQB, that he should not participate. Direct access to the exam questions would make it all too likely that, consciously or unconsciously, RBCS would warp its training materials to "teach the exam."

As you advance in your career as a tester, more and more opportunities to show your ethical nature—or to be betrayed by a lack of it—will come your way. It's never too early to inculcate a strong sense of ethics.

The ISTQB Advanced syllabus makes it clear that the ISTQB expects certificate holders to adhere to the following code of ethics.

PUBLIC – Certified software testers shall act consistently with the public interest. For example, if you are working on a safety-critical system and are asked to quietly cancel some defect reports, it's an ethical problem if you do so.

CLIENT AND EMPLOYER – Certified software testers shall act in a manner that is in the best interests of their client and employer and consistent with the public interest. For example, if you know that your employer's major project is in trouble and you short-sell the stock and then leak information about the project problems to the Internet, that's a real ethical lapse—and probably a criminal one too.

PRODUCT – Certified software testers shall ensure that the deliverables they provide (on the products and systems they test) meet the highest professional standards possible.
For example, if you are working as a consultant and you leave out important details from a test plan so that the client has to hire you on the next project, that's an ethical lapse.

JUDGMENT – Certified software testers shall maintain integrity and independence in their professional judgment. For example, if a project manager asks you not to report defects in certain areas due to potential business sponsor reactions, that's a blow to your independence and an ethical failure on your part if you comply.

MANAGEMENT – Certified software test managers and leaders shall subscribe to and promote an ethical approach to the management of software testing. For example, favoring one tester over another because you would like to establish a romantic relationship with the favored tester's sister is a serious lapse of managerial ethics.

PROFESSION – Certified software testers shall advance the integrity and reputation of the profession consistent with the public interest. For example, if you have a chance to explain to your child's classmates or your spouse's colleagues what you do, be proud of it and explain the ways software testing benefits society.

COLLEAGUES – Certified software testers shall be fair to and supportive of their colleagues and promote cooperation with software developers. For example, it is unethical to manipulate test results to arrange the firing of a programmer whom you detest.

SELF – Certified software testers shall participate in lifelong learning regarding the practice of their profession and shall promote an ethical approach to the practice of the profession. For example, attending courses, reading books, and speaking at conferences about what you do help to advance yourself—and the profession. This is called doing well while doing good, and fortunately, it is very ethical!

1.6 Sample Exam Questions

To end each chapter, you can try one or more sample exam questions to reinforce your knowledge and understanding of the material and to prepare for the ISTQB Advanced Level Technical Test Analyst exam.
1. You are working as a test analyst at a bank. At the bank, technical test analysts work closely with users during user acceptance testing. The bank has bought two financial applications as commercial off-the-shelf (COTS) software from large software vendors. Previous history with these vendors has shown that they deliver quality applications that work on their own, but this is the first time the bank will attempt to integrate applications from these two vendors. Which of the following test levels would you expect to be involved in? [Note: There might be more than one right answer.]

A. Component test
B. Component integration test
C. System integration test
D. Acceptance test

2. Which of the following is necessarily true of safety-critical systems?

A. They are composed of multiple COTS applications.
B. They are complex systems of systems.
C. They are systems upon which lives depend.
D. They are military or intelligence systems.
2 Testing Processes

"Do not enter. If the fall does not kill you, the crocodile will."

A sign blocking the entrance to a parapet above a pool in the Sydney Wildlife Centre, Australia, guiding people in a safe viewing process for one of many dangerous-fauna exhibits.

The second chapter of the Advanced syllabus is concerned with the process of testing and the activities that occur within that process. It establishes a framework for all the subsequent material in the syllabus and allows you to visualize organizing principles for the rest of the concepts. There are seven sections.

1. Introduction
2. Test Process Models
3. Test Planning and Control
4. Test Analysis and Design
5. Test Implementation and Execution
6. Evaluating Exit Criteria and Reporting
7. Test Closure Activities

Let's look at each section and how it relates to technical test analysis.

2.1 Introduction

Learning objectives
Recall of content only
The ISTQB Foundation syllabus describes the ISTQB fundamental test process. It provides a generic, customizable test process, shown in figure 2-1. That process consists of the following activities:

■ Planning and control
■ Analysis and design
■ Implementation and execution
■ Evaluating exit criteria and reporting
■ Test closure activities

For technical test analysts, we can focus on the middle three activities in the bullet list above.

[Figure 2–1, ISTQB fundamental test process: the activities above, through execution, evaluating exit criteria, and reporting test results, shown as overlapping phases along the project timeline, with control spanning the whole process.]

2.2 Test Process Models

Learning objectives
Recall of content only

The concepts in this section apply primarily for test managers. There are no learning objectives defined for technical test analysts in this section. In the course of studying for the exam, read this section in chapter 2 of the Advanced syllabus for general recall and familiarity only.
ISTQB Glossary

test planning: The activity of establishing or updating a test plan.
test plan: A document describing the scope, approach, resources, and schedule of intended test activities. It identifies, among other test items, the features to be tested, the testing tasks, who will do each task, the degree of tester independence, the test environment, the test design techniques and entry and exit criteria to be used, and the rationale for their choice, and any risks requiring contingency planning. It is a record of the test planning process.

2.3 Test Planning and Control

Learning objectives
Recall of content only

The concepts in this section apply primarily for test managers. There are no learning objectives defined for technical test analysts in this section. In the course of studying for the exam, read this section in chapter 2 of the Advanced syllabus for general recall and familiarity only.

2.4 Test Analysis and Design

Learning objectives
(K2) Explain the stages in an application's lifecycle where non-functional tests and architecture-based tests may be applied. Explain the causes of non-functional testing taking place only in specific stages of an application's lifecycle.
(K2) Give examples of the criteria that influence the structure and level of test condition development.
(K2) Describe how test analysis and design are static testing techniques that can be used to discover defects.
(K2) Explain by giving examples the concept of test oracles and how a test oracle can be used in test specifications.
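To preview the test oracle concept named in the last learning objective: an oracle is any source of expected results against which actual results can be compared. Here is a minimal sketch that uses Python's built-in conversion as a trusted reference oracle for a hypothetical hex converter under test; hex_convert is an assumed name standing in for the real system under test, not an API from this book.

```python
# Test oracle sketch: Python's int(x, 16) serves as the reference oracle
# for a hypothetical hex converter. hex_convert is an assumed stand-in
# for the real system under test.
def hex_convert(text: str) -> int:
    """Stand-in for the system under test."""
    return int(text, 16)  # imagine this were the real implementation


def oracle(text: str) -> int:
    """Trusted reference implementation used as the test oracle."""
    return int(text, 16)


for case in ["0", "ff", "DEAD", "7f"]:
    actual, expected = hex_convert(case), oracle(case)
    verdict = "pass" if actual == expected else "FAIL"
    print(f"{case!r}: actual={actual} expected={expected} -> {verdict}")
```

In a test specification, the oracle column plays exactly this role: for each input, it states where the expected result comes from, whether a reference implementation, a specification formula, or a previously validated result.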
