Common System and Software
Testing Pitfalls
STAR East Conference
Orlando, Florida
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213
Donald G. Firesmith, Principal Engineer
6 May 2015
Copyright 2015 Carnegie Mellon University
This material is based upon work funded and supported by the Department of Defense under
Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the
Software Engineering Institute, a federally funded research and development center.
Any opinions, findings and conclusions or recommendations expressed in this material are those
of the author(s) and do not necessarily reflect the views of the United States Department of
Defense.
NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING
INSTITUTE MATERIAL IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON
UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED,
AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR
PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE
OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY
OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR
COPYRIGHT INFRINGEMENT.
This material has been approved for public release and unlimited distribution except as
restricted below.
This material may be reproduced in its entirety, without modification, and freely distributed in
written or electronic form without requesting formal permission. Permission is required for any
other use. Requests for permission should be directed to the Software Engineering Institute at
permission@sei.cmu.edu.
DM-0001886.
Topics
Testing Pitfalls:
• Software vs. System Testing
• Taxonomy / Ontology
• Testing Challenges
• Addressing these Challenges
• Goals and Potential Uses
• Example Pitfall
Taxonomy of Common Testing Pitfalls (lists of pitfalls by
category):
• General Pitfalls
• Test-Type-Specific Pitfalls
Remaining Limitations and Questions
Future Work
Testing Pitfalls
Testing Pitfall
• A human mistake that unnecessarily and unexpectedly causes testing to
be:
– Less effective at uncovering defects
– Less efficient in terms of time and effort expended
– More frustrating to perform
• A bad decision, an incorrect mindset, a wrong action, or failure to act
• A failure to adequately:
– Meet a testing challenge
– Address a testing problem
• A way to screw up testing
Common Testing Pitfall
• Observed numerous times on different projects
• Having sufficient frequency (and consequences) to be a significant risk
Software vs. System vs. Model Testing - 1
These testing pitfalls occur when testing:
• Software (applications, components, units)
• Systems (subsystems, hardware, software, data,
personnel, facilities, equipment, documentation, etc.)
• Executable models (requirements, architecture, design)
Most pitfalls apply to both software and system testing.
Software vs. System vs. Model Testing - 2
The vast majority of software testers must address system
testing issues:
• Software executes on hardware, and how well it executes
depends on:
–That hardware
–Other software running on the same hardware
• Software communicates over:
–“External” networks (Internet, NIPRNet, SIPRNet, WAN,
LAN, MAN, etc.)
–Data-center-internal networks connecting servers and
data libraries (e.g., SAN)
–Busses within systems (embedded software)
• Software must meet quality requirements (thresholds of
relevant quality characteristics and attributes) that are
actually system-level, not software-level.
Taxonomy and Ontology
Testing Pitfall Taxonomy
A hierarchical classification of testing pitfalls into
categories
Testing Pitfall Ontology
A hierarchy of concepts concerning testing pitfalls, using
a shared vocabulary to denote the types, properties and
interrelationships of these concepts
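For illustration only (an assumed representation, not the book's actual data model), the three-part ID scheme used on the following slides (e.g., GEN-TPS-6 = General pitfalls > Test Planning and Scheduling > pitfall 6) can be sketched as a small Python structure:

# Assumed sketch of a taxonomy fragment as nested dictionaries keyed by the
# pitfall IDs used in these slides; only a few real pitfall names are shown.
taxonomy = {
    "GEN": {  # General pitfalls
        "TPS": ("Test Planning and Scheduling", {
            6: "Testing at the End",
            7: "Independent Test Schedule",
        }),
        "STF": ("Staffing", {
            4: "Testers Responsible for All Testing",
        }),
    },
    "TTS": {  # Test-type-specific pitfalls
        "UNT": ("Unit Testing", {
            2: "Conflict of Interest",
        }),
    },
}

def pitfall_name(pitfall_id: str) -> str:
    """Resolve an ID such as 'GEN-TPS-6' to its pitfall name."""
    group, category, number = pitfall_id.split("-")
    _, pitfalls = taxonomy[group][category]
    return pitfalls[int(number)]

print(pitfall_name("GEN-TPS-6"))  # Testing at the End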
Testing Pitfall Taxonomy and Ontology
Testing Challenges
A great many different ways exist to screw up testing.
Multiple testing pitfalls are observed on just about every
project.
Different projects often exhibit different testing pitfalls.
In spite of many excellent how-to testing books, we see
projects falling into these same testing pitfalls over and over
again.
Addressing these Challenges
Testing Pitfalls Taxonomy and Ontology
Anti-Pattern Language of how not to do testing
Common System and Software Testing Pitfalls (Addison-Wesley, 2014)
(Note: 35% conference discount):
• 92 pitfalls classified into 14 categories
• Technically reviewed by 47 international testing SMEs
Current taxonomy/repository with new pitfalls and pitfall categories:
• 167 pitfalls classified into 23 categories
• http://sites.google.com/a/firesmith.net/donald-firesmith/home/common-testing-pitfalls [Work in progress - new content is draft and often incomplete]
Self Survey and Risk Prediction Tool
Eventual 2nd Edition of Pitfalls Book or Supplement to 1st Edition
Goals and Potential Uses
Goals:
• To become the de facto industry-standard taxonomy of testing pitfalls
• To reduce the incidence of testing pitfalls and thereby improve testing
effectiveness and efficiency
• To improve the quality of the objects under test (OUTs)
Potential Uses:
• Training materials for testers and testing stakeholders
• Standard terminology regarding commonly occurring testing pitfalls
• Checklists for use when:
– Producing test strategies/plans and related documentation
– Evaluating contractor proposals
– Evaluating test strategies/plans and related documentation (quality control)
– Evaluating as-performed test process (quality assurance)
– Identifying test-related risks and their mitigation approaches
• Test metrics collection, analysis, and reporting
Potential Uses
Training materials for testers and testing stakeholders
Standard terminology regarding commonly occurring testing
pitfalls
Checklists for use when:
• Producing test strategies/plans and related documentation
• Evaluating contractor proposals
• Evaluating test strategies/plans and related
documentation (quality control)
• Evaluating as-performed test process (quality assurance)
• Identifying test-related risks and their mitigation
approaches
Categorization of pitfalls for test metrics collection, analysis,
and reporting
Example – Testing and Engineering Processes
Not Integrated (GEN-PRO-7)
Description: The testing process is not adequately integrated into the overall system engineering process.
Potential Applicability: This pitfall is potentially applicable anytime that engineering and testing processes both exist.
Characteristic Symptoms
• There is little or no discussion of testing in the system engineering
documentation: System Engineering Management Plan (SEMP), Software
Development Plan (SDP), Work Breakdown Structure (WBS), Project Master
Schedule (PMS), or System Development Cycle (SDC).
• All or most of the testing is done as a completely independent activity
performed by staff members who are not part of the project engineering team.
• Testing is treated as a separate specialty engineering activity with only limited
interfaces with the primary engineering activities.
• Test scheduling is independent of the scheduling of other development
activities.
• Testers are not included in the requirements teams, architecture teams, or any
cross-functional engineering teams.
Example – Testing and Engineering Processes
Not Integrated (GEN-PRO-7)
Potential Negative Consequences
• There is inadequate communication between testers and other system or
software engineers (for example, requirements engineers, architects,
designers, and implementers).
• Few nontesters understand the scope, complexity, and importance of testing.
• Testers do not understand the work being performed by other engineers.
• Testers can produce test cases and automated testing scripts before the requirements, architecture, and design have stabilized, thereby forcing the testers to modify their test cases and test scripts as the system or software changes and incorrect hidden assumptions are uncovered.
• Testing is less effective and takes longer than necessary.
Example – Testing and Engineering Processes
Not Integrated (GEN-PRO-7)
Potential Causes
• Testers were not involved in determining and documenting the
overall engineering process.
• The people determining and documenting the overall engineering
process did not have significant testing expertise, training, or
experience.
• The testing schedule has not been integrated into the overall project
schedule.
• Testing was outsourced.
Example – Testing and Engineering Processes
Not Integrated (GEN-PRO-7)
Recommendations
• Prepare:
– Include testers in the initial staffing of the project.
• Enable:
– Provide a top-level briefing or training in testing to the chief system engineer, system architect,
and process engineer.
• Perform:
– Have subject-matter experts and project testers collaborate closely with the project chief engineer or technical lead and the process engineer when they develop the engineering process descriptions and associated process documents.
– Provide high-level overviews of testing in the SEMP(s) and SDP(s).
– Document how testing is integrated into the system development or life cycle, regardless of
whether it is traditional waterfall, evolutionary (iterative, incremental, and parallel), or anything in
between.
– For example, document handover points in the development cycle when testing input and output
work products are delivered from one project organization or group to another.
– Incorporate testing into the Project Master Schedule.
– Incorporate testing into the project’s Work Breakdown Structure (WBS).
• Verify:
– Determine whether testers were involved in planning the project’s system or software
development process.
– Determine whether testing is incorporated into the project's system engineering process, system development cycle, SEMP(s) and SDP(s), Work Breakdown Structure, and Master Schedule.
Example – Testing and Engineering Processes
Not Integrated (GEN-PRO-7)
Related Pitfalls
• Testing at the End (GEN-TPS-6) If the testing and engineering processes are not properly integrated, then
testing is more likely to be delayed to the end of development.
• Independent Test Schedule (GEN-TPS-7) If the testing and engineering processes are not integrated, then
the test schedule will be independent of and therefore probably incompatible with the overall project
master schedule.
• Testers Responsible for All Testing (GEN-STF-4) If the testing and engineering processes are not properly
integrated, then the developers are more likely to believe that the testers are responsible for all testing.
• Adversarial Relationship (GEN-STF-9) If the testing and engineering processes are not integrated, then the developers and the testers are more likely to develop an adversarial rather than cooperative relationship.
• Testing as a Phase (GEN-PRO-18) If the testing and engineering processes are not properly integrated,
then testing is more likely to be viewed as a phase that is separate from the rest of development.
• Testers Not Involved Early (GEN-PRO-19) If the testing and engineering processes are not properly integrated, then testers are less likely to be involved early in the development process (such as during initial planning, requirements engineering, and architecture engineering).
• Testing in Quality (GEN-PRO-23) If the testing and engineering processes are not properly integrated, then the developers will be more likely to believe that quality is the testers' responsibility and that quality can be tested into the system or software.
• Developers Ignore Testability (GEN-PRO-24) If the testing and engineering processes are not properly
integrated, then the developers are more likely to ignore testability.
• Inadequate Communication Concerning Testing (GEN-COM-5) If the testing and engineering processes
are not properly integrated, then it is more likely that there will be inadequate communication between the
testers and the rest of the engineering staff.
Categories of Testing Pitfalls – General
1 Test Planning and Scheduling
2 Stakeholder Involvement and Commitment
3 Management
4 Staffing
5 Testing Process
6 Test Design [new]
7 Pitfall-Related [new]
8 Test Tools and Environments
9 Automated Testing [new]
10 Test Communication
11 Testing-as-a-Service (TaaS) [new]
12 Requirements
General Pitfalls – Test Planning and Scheduling
No Separate Test Planning Documentation (GEN-TPS-1)
Incomplete Test Planning (GEN-TPS-2)
Test Plans Ignored (GEN-TPS-3)
Test-Case Documents as Test Plans (GEN-TPS-4)
Inadequate Test Schedule (GEN-TPS-5)
→ Testing at the End (GEN-TPS-6)
Independent Test Schedule (GEN-TPS-7) [new]
General Pitfalls – Stakeholder Involvement &
Commitment
Wrong Testing Mindset (GEN-SIC-1)
→ Unrealistic Testing Expectations (GEN-SIC-2)
Assuming Testing Only Verification Method Needed
(GEN-SIC-3)
Mistaking Demonstration for Testing (GEN-SIC-4)
Lack of Stakeholder Commitment to Testing (GEN-SIC-5)
General Pitfalls – Management
Inadequate Test Resources (GEN-MGMT-1)
Inappropriate External Pressures (GEN-MGMT-2)
Inadequate Test-Related Risk Management (GEN-MGMT-3)
Inadequate Test Metrics (GEN-MGMT-4)
→ Inconvenient Test Results Ignored (GEN-MGMT-5)
Test Lessons Learned Ignored (GEN-MGMT-6)
General Pitfalls – Staffing - 1
Lack of Independence (GEN-STF-1)
Unclear Testing Responsibilities (GEN-STF-2)
Developers Responsible for All Testing (GEN-STF-3)
Testers Responsible for All Testing (GEN-STF-4)
Testers Responsible for Ensuring Quality (GEN-STF-5) [new]
Testers Fix Defects (GEN-STF-6) [new]
Users Responsible for Testing (GEN-STF-7) [new]
→ Inadequate Testing Expertise (GEN-STF-8)
Inadequate Domain Expertise (GEN-STF-9) [new]
Adversarial Relationship (GEN-STF-10) [new]
General Pitfalls – Staffing - 2
Too Few Testers (GEN-STF-11) [new]
Allowing Developers to Close Discrepancy Reports
(GEN-STF-12) [new]
Testing Death March (GEN-STF-13) [new]
All Testers Assumed Equal (GEN-STF-14) [new]
General Pitfalls – Testing Process 1
No Planned Testing Process (GEN-PRO-1) [new pitfall]
Essentially No Testing (GEN-PRO-2) [new pitfall]
Inadequate Testing (GEN-PRO-3)
Testing Process Ignored (GEN-PRO-4) [new pitfall]
→ One-Size-Fits-All Testing (GEN-PRO-5)
Testing and Engineering Processes Not Integrated (GEN-
PRO-6)
Too Immature for Testing (GEN-PRO-7)
Inadequate Evaluations of Test Assets (GEN-PRO-8)
→ Inadequate Maintenance of Test Assets (GEN-PRO-9)
General Pitfalls – Testing Process 2
Testing Assets Not Delivered (GEN-PRO-10) [combined 2
existing]
Testing as a Phase (GEN-PRO-11)
Testers Not Involved Early (GEN-PRO-12)
Developmental Testing During Production (GEN-PRO-13) [new]
No Operational Testing (GEN-PRO-14)
→ Testing in Quality (GEN-PRO-15) [new]
Developers Ignore Testability (GEN-PRO-16) [moved from other
category]
Failure to Address the Testing BackBlob (GEN-PRO-17) [new]
General Pitfalls – Testing Process 3
Failure to Analyze Why Defects Escaped Detection
(GEN-PRO-18) [new]
Official Test Standards are Ignored (GEN-PRO-19) [new]
Official Test Standards are Slavishly Followed (GEN-PRO-20)
[new]
Developing New When Old Fails Tests (GEN-PRO-21) [new]
Integrating New or Updates When Fails Tests (GEN-PRO-22)
[new]
General Pitfalls – Test Design
Sunny Day Testing Only (GEN-TD-1) [new]
Inadequate Test Prioritization (GEN-TD-2)
Test-Type Confusion (GEN-TD-3)
Functionality Testing Overemphasized (GEN-TD-4)
System Testing Overemphasized (GEN-TD-5)
System Testing Underemphasized (GEN-TD-6)
Test Preconditions Ignored (GEN-TD-7) [new]
Inadequate Test Data (GEN-TD-8)
Test Oracles Ignore Nondeterministic Behavior (GEN-TD-9)
[new]
General Pitfalls – Pitfall-Related [new]
Overly Ambitious Process Improvement (GEN-PRP-1) [new]
→ Inadequate Pitfall Prioritization (GEN-PRP-2) [new]
General Pitfalls – Test Tools and Environments
Over-Reliance on Testing Tools (GEN-TTE-1)
Target Platform Difficult to Access (GEN-TTE-2)
→ Inadequate Test Environments (GEN-TTE-3)
Poor Fidelity of Test Environments (GEN-TTE-4)
Inadequate Test Environment Quality (GEN-TTE-5)
Test Environments Inadequately Tested (GEN-TTE-6) [new]
Inadequate Test Configuration Management (GEN-TTE-7)
Inadequate Testing in a Staging Environment (GEN-TTE-8)
General Pitfalls – Automated Testing - 1 [new]
Insufficient Automated Testing (GEN-AUTO-1) [new]
Automated Testing Replaces Manual Testing (GEN-AUTO-2)
[new]
Excessive Automated Testing (GEN-AUTO-3) [new]
Inappropriate Distribution of Automated Tests (GEN-AUTO-4)
[new]
Inadequate Automated Test Quality (GEN-AUTO-5) [new]
Excessively Complex Automated Tests (GEN-AUTO-6) [new]
→ Automated Tests Not Maintained (GEN-AUTO-7) [new]
Insufficient Resources Invested (GEN-AUTO-8) [new]
Inappropriate Automation Tools (GEN-AUTO-9) [new]
General Pitfalls – Automated Testing - 2 [new]
Unclear Responsibilities for Automated Testing
(GEN-AUTO-10) [new]
Postponing Automated Testing Until Stable (GEN-AUTO-11)
[new]
Automated Testing as Silver Bullet (GEN-AUTO-12) [new]
General Pitfalls – Test Communication
Inadequate Architecture or Design Documentation
(GEN-COM-1)
→ Inadequate Discrepancy Reports (GEN-COM-2)
Inadequate Test Documentation (GEN-COM-3)
Source Documents Not Maintained (GEN-COM-4)
Inadequate Communication Concerning Testing (GEN-COM-
5)
Inconsistent Testing Terminology (GEN-COM-6) [new]
Redundant Test Documents (GEN-COM-7) [new]
General Pitfalls – Testing-as-a-Service (TaaS)
[new]
→ Cost-Driven Provider Selection (GEN-TaaS-1) [new]
Inadequate Oversight (GEN-TaaS-2) [new]
Lack of Outsourcing Expertise (GEN-TaaS-3) [new]
Inappropriate TaaS Contract (GEN-TaaS-4) [new]
TaaS Provider Improperly Chosen (GEN-TaaS-5) [new]
General Pitfalls – Requirements - 1
→ Tests as Requirements (GEN-REQ-1) [new]
Ambiguous Requirements (GEN-REQ-2)
Obsolete Requirements (GEN-REQ-3)
Missing Requirements (GEN-REQ-4)
Incomplete Requirements (GEN-REQ-5)
Incorrect Requirements (GEN-REQ-6)
Requirements Churn (GEN-REQ-7)
Improperly Derived Requirements (GEN-REQ-8)
Verification Methods Not Adequately Specified (GEN-REQ-9)
Lack of Requirements Trace (GEN-REQ-10)
General Pitfalls – Requirements - 2
Titanic Effect of Deferred Requirements (GEN-REQ-11) [new]
Implicit Requirements Ignored (GEN-REQ-12) [new]
Categories of Testing Pitfalls –
Test-Type-Specific Pitfalls
1 Executable Model Testing [new]
2 Unit Testing
3 Integration Testing
4 Specialty Engineering Testing
5 System Testing
6 User Testing [new]
7 A/B Testing [new]
8 Acceptance Testing [new]
9 Operational Testing (OT) [new]
10 System of Systems (SoS) Testing
11 Regression Testing
Test Type Specific Pitfalls –
Executable Model Testing [new]
Inadequate Executable Models (TTS-MOD-1) [new]
→ Executable Models Not Tested (TTS-MOD-2) [new]
Test Type Specific Pitfalls – Unit Testing
Testing Does Not Drive Design and Implementation
(TTS-UNT-1)
→ Conflict of Interest (TTS-UNT-2)
Untestable Units (TTS-UNT-3) [new]
Brittle Test Cases (TTS-UNT-4) [new]
No Unit Testing (TTS-UNT-5) [new]
Unit Testing of Automatically Generated Units (TTS-UNT-6)
[new]
Test Type Specific Pitfalls –
Integration Testing
Integration Decreases Testability Ignored (TTS-INT-1)
Inadequate Self-Testing (TTS-INT-2)
Unavailable Components (TTS-INT-3)
→ System Testing as Integration Testing (TTS-INT-4)
Test Type Specific Pitfalls –
Specialty Engineering Testing - 1
Inadequate Capacity Testing (TTS-SPC-1)
Inadequate Concurrency Testing (TTS-SPC-2)
Inadequate Interface Standards Compliance Testing
(TTS-SPC-3) [new]
Inadequate Internationalization Testing (TTS-SPC-4)
Inadequate Interoperability Testing (TTS-SPC-5)
Inadequate Performance Testing (TTS-SPC-6)
→ Inadequate Portability Testing (TTS-SPC-7) [new]
Inadequate Reliability Testing (TTS-SPC-8)
Inadequate Robustness Testing (TTS-SPC-9)
Inadequate Safety Testing (TTS-SPC-10)
Test Type Specific Pitfalls –
Specialty Engineering Testing - 2
Inadequate Security Testing (TTS-SPC-11)
Inadequate Usability Testing (TTS-SPC-12)
Test Type Specific Pitfalls – System Testing
Test Hooks Remain (TTS-SYS-1)
Lack of Test Hooks (TTS-SYS-2)
→ Inadequate End-to-End Testing (TTS-SYS-3)
Test Type Specific Pitfalls – User Testing [new]
Inadequate User Involvement (TTS-UT-1) [new]
Unprepared User Representatives (TTS-UT-2) [new]
User Testing Merely Repeats System Testing (TTS-UT-3)
[new]
User Testing is Mistaken for Acceptance Testing (TTS-UT-4)
[new]
→ Assume Knowledgeable and Careful User (TTS-UT-5)
[new]
User Testing Too Late to Fix Defects (TTS-UT-6) [new]
Test Type Specific Pitfalls – A/B Testing [new]
Poor Key Performance Indicators (TTS-ABT-1) [new]
Misuse of Probability and Statistics (TTS-ABT-2) [new]
→ Confusing Statistical Significance with Business
Significance (TTS-ABT-3) [new]
Source(s) of Error Not Controlled (TTS-ABT-4) [new]
System Variant(s) Changed During Test (TTS-ABT-5) [new]
Test Type Specific Pitfalls –
Acceptance Testing [new]
→ No Clear System Acceptance Criteria (TTS-AT-1)
[new]
Acceptance Testing Only Tests Functionality (TTS-AT-2)
[new]
Developers Determine Acceptance Tests (TTS-AT-3) [new]
Test Type Specific Pitfalls –
Operational Testing (OT) [new]
→ No On-Site Software Developers (TTS-OT-1) [new]
No Operational Testing [new]
Test Type Specific Pitfalls –
System of Systems (SoS) Testing
Inadequate SoS Test Planning (TTS-SoS-1)
Unclear SoS Testing Responsibilities (TTS-SoS-2)
Inadequate Resources for SoS Testing (TTS-SoS-3)
SoS Testing not Coordinated Across Projects (TTS-SoS-4)
Inadequate SoS Requirements (TTS-SoS-5)
→ Inadequate Support from Individual System Projects
(TTS-SoS-6)
Inadequate Defect Tracking Across Projects (TTS-SoS-7)
Finger-Pointing (TTS-SoS-8)
General Pitfalls – Regression Testing
Inadequate Regression Test Automation (GEN-REG-1)
Regression Testing Not Performed (GEN-REG-2)
Inadequate Scope of Regression Testing (GEN-REG-3)
Only Low-Level Regression Tests (GEN-REG-4)
Only Functional Regression Testing (GEN-REG-5)
Inadequate Retesting of Reused Software (TTS-REG-6)
[new]
Remaining Limitation
Current Taxonomy is Experience Based:
• Based on experience testing and assessing testing
programs
(author, SEI ITAs, technical reviewers)
• Not the result of documentation study or formal academic
research
Remaining Questions
Which pitfalls occur most often? With what frequency?
Which pitfalls cause the most harm?
Which pitfalls have the highest risk (expected harm = frequency x harm; see the sketch below)?
What factors (e.g., system/software size and complexity,
application domain, process) influence frequency, harm,
and risk?
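As a purely hypothetical sketch of the risk formula above (the pitfall IDs are from the taxonomy, but the frequency and harm values are invented, not survey results), ranking pitfalls by expected harm might look like this:

# Hypothetical illustration of risk = frequency x harm; the numbers are invented.
pitfall_estimates = {
    "GEN-TPS-6": {"frequency": 0.6, "harm": 8},  # Testing at the End
    "GEN-STF-8": {"frequency": 0.4, "harm": 6},  # Inadequate Testing Expertise
    "GEN-REQ-2": {"frequency": 0.7, "harm": 5},  # Ambiguous Requirements
}

def risk(estimate: dict) -> float:
    """Expected harm of a pitfall: how often it occurs times how much it hurts."""
    return estimate["frequency"] * estimate["harm"]

for pitfall_id, estimate in sorted(pitfall_estimates.items(),
                                   key=lambda item: risk(item[1]),
                                   reverse=True):
    print(f"{pitfall_id}: risk = {risk(estimate):.1f}")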
Future Work - 1
Second Edition of Book
Extensive Technical Review:
• New testing pitfall categories
• New and modified testing pitfalls
Future Work – Proper Industry Survey
How likely are the different testing pitfalls? What are the 10
most common?
What pitfalls have the worst consequences? What are the 10
worst pitfalls?
What pitfalls have the highest risk? What are the 10 highest
risk pitfalls?
Do the answers to these questions vary by:
• System (size, complexity, criticality, application domain, software only vs. HW/SW/people/documentation/facilities/procedures, system vs. SoS vs. PL)?
• Project (type, formality, lifecycle scope, schedule, funding, commercial vs. government/military, etc.)?
• Organization (number, size, type, governance, management/engineering culture, etc.)?
Save 35%* at informit.com
Discount code:
FIRESMITH550
• informit.com - search on Firesmith
• Available as book & eBook
• FREE shipping in the U.S.
*Offer expires Dec 31, 2015
http://sites.google.com/a/firesmith.net/donald-firesmith/home/common-testing-pitfalls
Contact Information
Donald G. Firesmith
Principal Engineer
Software Solutions Division
Telephone: +1 412-268-6874
Email: dgf@sei.cmu.edu
U.S. Mail
Software Engineering Institute
Customer Relations
4500 Fifth Avenue
Pittsburgh, PA 15213-2612
USA
Web
www.sei.cmu.edu
www.sei.cmu.edu/contact.cfm
Customer Relations
Email: info@sei.cmu.edu
Telephone: +1 412-268-5800
SEI Phone: +1 412-268-5800
SEI Fax: +1 412-268-6257
Backup Slides
What is Testing?
Testing
The execution of an Object Under Test (OUT) under specific preconditions
(for example, pretest mode, states, stored data, or external conditions) with
specific inputs so that its actual behavior (outputs and postconditions) can be
compared with its expected or required behavior
Notes:
• The OUT can be:
– An executable requirements, architecture, or design model
– A system or subsystem (including hardware, data, personnel, etc.)
– A software application, component, or unit
• Not just software
• Requires execution (do not confuse testing with QE or with other means of verification such as analysis, certification, demonstration, inspection, etc.)
• Requires controllability to set up preconditions and provide inputs
• Requires observability to determine outputs and postconditions
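A minimal, hypothetical unit-test sketch (the object under test and all names are invented) illustrating the definition above: a precondition is established, a specific input is supplied, and the actual behavior (output and postcondition) is compared with the expected behavior.

# Hypothetical OUT used only to illustrate the definition of testing above.
class Account:
    def __init__(self, balance: int):
        self.balance = balance

    def withdraw(self, amount: int) -> int:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

def test_withdraw_reduces_balance():
    account = Account(balance=100)   # precondition: pretest stored data (controllability)
    actual = account.withdraw(30)    # specific input to the OUT
    assert actual == 70              # expected output (observability)
    assert account.balance == 70     # expected postcondition

if __name__ == "__main__":
    test_withdraw_reduces_balance()
    print("actual behavior matches expected behavior")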
Why Test?
In roughly decreasing order of importance, the goals of testing are to:
• Uncover Significant Defects in the object under test (OUT) by causing it to
behave incorrectly (e.g., to fail or enter a faulty state) so that these underlying
defects can be identified and fixed and the OUT can thereby be improved.
• Provide Evidence that can be used to determine the OUT’s:
– Quality
– Fitness for purpose
– Readiness for shipping, deployment, or being placed into operation
• Support Process Improvement by helping to identify:
– Development processes that introduce defects
– Testing processes that fail to uncover defects
• Prevent Defects by:
– Testing executable requirements, architecture, and design models so that
defects in the models are fixed before they can result in defects in the
system/software.
– Using Test-Driven Development (TDD) to develop test cases and then using these test cases to drive development (design and implementation).
Limitations of Testing
Cannot be exhaustive (or even close)
Cannot uncover all defects:
• Different levels of testing have different defect removal efficiencies (DREs)
• Different types of testing uncover different types of defects
May provide false positive and false negative results
Cannot prove the OUT works properly under all inputs and conditions
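As a rough, assumed illustration of the defect removal efficiencies (DREs) mentioned above (standard formulation with invented counts, not data from these slides):

# Illustrative only: DRE of a testing level = defects it finds / defects present,
# where defects present = defects found at that level + defects that escape it.
def defect_removal_efficiency(found: int, escaped: int) -> float:
    return found / (found + escaped)

# e.g., if unit testing finds 40 defects while 10 slip through to later levels:
print(f"DRE = {defect_removal_efficiency(40, 10):.0%}")  # DRE = 80%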
More Related Content

What's hot

Lessons Learned in Software Quality 1
Lessons Learned in Software Quality 1Lessons Learned in Software Quality 1
Lessons Learned in Software Quality 1
Belal Raslan
 
ISTQB CTAL - Test Analyst
ISTQB CTAL - Test AnalystISTQB CTAL - Test Analyst
ISTQB CTAL - Test Analyst
Samer Desouky
 
User Experiments in Human-Computer Interaction
User Experiments in Human-Computer InteractionUser Experiments in Human-Computer Interaction
User Experiments in Human-Computer Interaction
Dr. Arindam Dey
 
5WCSQ(CFP) - Quality Improvement by the Real-Time Detection of the Problems
5WCSQ(CFP) - Quality Improvement by the Real-Time Detection of the Problems5WCSQ(CFP) - Quality Improvement by the Real-Time Detection of the Problems
5WCSQ(CFP) - Quality Improvement by the Real-Time Detection of the ProblemsTakanori Suzuki
 
Test Axioms – An Introduction
Test Axioms – An IntroductionTest Axioms – An Introduction
Test Axioms – An Introduction
Paul Gerrard
 
Software Testing-Dynamic testing technique-Mazenet solution
Software Testing-Dynamic testing technique-Mazenet solutionSoftware Testing-Dynamic testing technique-Mazenet solution
Software Testing-Dynamic testing technique-Mazenet solution
Mazenetsolution
 
Software Testing
Software TestingSoftware Testing
Software Testing
Kiran Kumar
 
Bab 1 Fundamentals Of Testing
Bab 1 Fundamentals Of TestingBab 1 Fundamentals Of Testing
Bab 1 Fundamentals Of Testing
lolayoriva
 
Evaluation of eLearning
Evaluation of eLearningEvaluation of eLearning
Evaluation of eLearning
Michael M Grant
 
ISTQB / ISEB Foundation Exam Practice - 4
ISTQB / ISEB Foundation Exam Practice - 4ISTQB / ISEB Foundation Exam Practice - 4
ISTQB / ISEB Foundation Exam Practice - 4
Yogindernath Gupta
 
Research Activities: past, present, and future.
Research Activities: past, present, and future.Research Activities: past, present, and future.
Research Activities: past, present, and future.
Marco Torchiano
 
Evaluation in hci
Evaluation in hciEvaluation in hci
Evaluation in hci
sajid rao
 
Tool support for testing
Tool support for testingTool support for testing
Tool support for testing
Amelia Septia Roza
 
Bottom-up Adoption of Continuous Delivery in a Stage-gate Managed Software Or...
Bottom-up Adoption of Continuous Delivery in a Stage-gate Managed Software Or...Bottom-up Adoption of Continuous Delivery in a Stage-gate Managed Software Or...
Bottom-up Adoption of Continuous Delivery in a Stage-gate Managed Software Or...
Eero Laukkanen
 
Human Computer Interaction Evaluation
Human Computer Interaction EvaluationHuman Computer Interaction Evaluation
Human Computer Interaction Evaluation
LGS, GBHS&IC, University Of South-Asia, TARA-Technologies
 
Chapter3-evaluation techniques HCI
Chapter3-evaluation techniques HCIChapter3-evaluation techniques HCI
Chapter3-evaluation techniques HCIShafy Fify
 
A Regression Analysis Approach for Building a Prediction Model for System Tes...
A Regression Analysis Approach for Building a Prediction Model for System Tes...A Regression Analysis Approach for Building a Prediction Model for System Tes...
A Regression Analysis Approach for Building a Prediction Model for System Tes...
MIMOS Berhad/Open University Malaysia/Universiti Teknologi Malaysia
 
Introduction of software engineering
Introduction of software engineeringIntroduction of software engineering
Introduction of software engineering
BhagyashriMore10
 

What's hot (20)

Lessons Learned in Software Quality 1
Lessons Learned in Software Quality 1Lessons Learned in Software Quality 1
Lessons Learned in Software Quality 1
 
ISTQB CTAL - Test Analyst
ISTQB CTAL - Test AnalystISTQB CTAL - Test Analyst
ISTQB CTAL - Test Analyst
 
User Experiments in Human-Computer Interaction
User Experiments in Human-Computer InteractionUser Experiments in Human-Computer Interaction
User Experiments in Human-Computer Interaction
 
5WCSQ(CFP) - Quality Improvement by the Real-Time Detection of the Problems
5WCSQ(CFP) - Quality Improvement by the Real-Time Detection of the Problems5WCSQ(CFP) - Quality Improvement by the Real-Time Detection of the Problems
5WCSQ(CFP) - Quality Improvement by the Real-Time Detection of the Problems
 
Test Axioms – An Introduction
Test Axioms – An IntroductionTest Axioms – An Introduction
Test Axioms – An Introduction
 
Qa Faqs
Qa FaqsQa Faqs
Qa Faqs
 
Software Testing-Dynamic testing technique-Mazenet solution
Software Testing-Dynamic testing technique-Mazenet solutionSoftware Testing-Dynamic testing technique-Mazenet solution
Software Testing-Dynamic testing technique-Mazenet solution
 
Software Testing
Software TestingSoftware Testing
Software Testing
 
Bab 1 Fundamentals Of Testing
Bab 1 Fundamentals Of TestingBab 1 Fundamentals Of Testing
Bab 1 Fundamentals Of Testing
 
Evaluation of eLearning
Evaluation of eLearningEvaluation of eLearning
Evaluation of eLearning
 
ISTQB / ISEB Foundation Exam Practice - 4
ISTQB / ISEB Foundation Exam Practice - 4ISTQB / ISEB Foundation Exam Practice - 4
ISTQB / ISEB Foundation Exam Practice - 4
 
Research Activities: past, present, and future.
Research Activities: past, present, and future.Research Activities: past, present, and future.
Research Activities: past, present, and future.
 
Evaluation in hci
Evaluation in hciEvaluation in hci
Evaluation in hci
 
01
0101
01
 
Tool support for testing
Tool support for testingTool support for testing
Tool support for testing
 
Bottom-up Adoption of Continuous Delivery in a Stage-gate Managed Software Or...
Bottom-up Adoption of Continuous Delivery in a Stage-gate Managed Software Or...Bottom-up Adoption of Continuous Delivery in a Stage-gate Managed Software Or...
Bottom-up Adoption of Continuous Delivery in a Stage-gate Managed Software Or...
 
Human Computer Interaction Evaluation
Human Computer Interaction EvaluationHuman Computer Interaction Evaluation
Human Computer Interaction Evaluation
 
Chapter3-evaluation techniques HCI
Chapter3-evaluation techniques HCIChapter3-evaluation techniques HCI
Chapter3-evaluation techniques HCI
 
A Regression Analysis Approach for Building a Prediction Model for System Tes...
A Regression Analysis Approach for Building a Prediction Model for System Tes...A Regression Analysis Approach for Building a Prediction Model for System Tes...
A Regression Analysis Approach for Building a Prediction Model for System Tes...
 
Introduction of software engineering
Introduction of software engineeringIntroduction of software engineering
Introduction of software engineering
 

Viewers also liked

DevBrasil Open Day 2013 - Desenvolvimento para Windows Phone 8
DevBrasil Open Day 2013 - Desenvolvimento para Windows Phone 8DevBrasil Open Day 2013 - Desenvolvimento para Windows Phone 8
DevBrasil Open Day 2013 - Desenvolvimento para Windows Phone 8
Thiago Lunardi
 
4gvze.pdf
4gvze.pdf4gvze.pdf
4gvze.pdf
Jeff Smith
 
Контекстная реклама по минимальной цене
Контекстная реклама по минимальной ценеКонтекстная реклама по минимальной цене
Контекстная реклама по минимальной цене
Astra Media Group, Russia
 
Petrobras
PetrobrasPetrobras
Petrobras
Jailson Lima
 
VERTIGO@MINIGHTPOSTERFINAL
VERTIGO@MINIGHTPOSTERFINALVERTIGO@MINIGHTPOSTERFINAL
VERTIGO@MINIGHTPOSTERFINALValorie Thomas
 
Upstage Video Letter of Recommendation
Upstage Video Letter of RecommendationUpstage Video Letter of Recommendation
Upstage Video Letter of RecommendationVincent Kershner
 
Emad Experiance Certificate Global
Emad Experiance Certificate GlobalEmad Experiance Certificate Global
Emad Experiance Certificate GlobalEmad Mittias
 
Контекстная реклама по минимальной цене
Контекстная реклама по минимальной ценеКонтекстная реклама по минимальной цене
Контекстная реклама по минимальной цене
Astra Media Group, Russia
 
Procedimentos argumentativos
Procedimentos argumentativosProcedimentos argumentativos
Procedimentos argumentativos
Gedalias .
 

Viewers also liked (20)

DevBrasil Open Day 2013 - Desenvolvimento para Windows Phone 8
DevBrasil Open Day 2013 - Desenvolvimento para Windows Phone 8DevBrasil Open Day 2013 - Desenvolvimento para Windows Phone 8
DevBrasil Open Day 2013 - Desenvolvimento para Windows Phone 8
 
4gvze.pdf
4gvze.pdf4gvze.pdf
4gvze.pdf
 
Контекстная реклама по минимальной цене
Контекстная реклама по минимальной ценеКонтекстная реклама по минимальной цене
Контекстная реклама по минимальной цене
 
Petrobras
PetrobrasPetrobras
Petrobras
 
VERTIGO@MINIGHTPOSTERFINAL
VERTIGO@MINIGHTPOSTERFINALVERTIGO@MINIGHTPOSTERFINAL
VERTIGO@MINIGHTPOSTERFINAL
 
NZ Diploma in Business
NZ Diploma in BusinessNZ Diploma in Business
NZ Diploma in Business
 
Benedix Certificate
Benedix CertificateBenedix Certificate
Benedix Certificate
 
Degree
DegreeDegree
Degree
 
Upstage Video Letter of Recommendation
Upstage Video Letter of RecommendationUpstage Video Letter of Recommendation
Upstage Video Letter of Recommendation
 
Emad Experiance Certificate Global
Emad Experiance Certificate GlobalEmad Experiance Certificate Global
Emad Experiance Certificate Global
 
Bachelor degree
Bachelor degreeBachelor degree
Bachelor degree
 
RefLetter_Erlangga
RefLetter_ErlanggaRefLetter_Erlangga
RefLetter_Erlangga
 
77 guarda o contacto
77   guarda o contacto77   guarda o contacto
77 guarda o contacto
 
56 tudo em cristo
56   tudo em cristo56   tudo em cristo
56 tudo em cristo
 
71 santo és tu senhor
71   santo és tu senhor71   santo és tu senhor
71 santo és tu senhor
 
Контекстная реклама по минимальной цене
Контекстная реклама по минимальной ценеКонтекстная реклама по минимальной цене
Контекстная реклама по минимальной цене
 
42 saudai, jesus
42   saudai, jesus42   saudai, jesus
42 saudai, jesus
 
Procedimentos argumentativos
Procedimentos argumentativosProcedimentos argumentativos
Procedimentos argumentativos
 
32 meu cristo
32   meu cristo32   meu cristo
32 meu cristo
 
50 sempre fiéis
50   sempre fiéis50   sempre fiéis
50 sempre fiéis
 

Similar to Common System and Software Testing Pitfalls

Testing Types and Paradigms - 2015-07-13 - V11
Testing Types and Paradigms - 2015-07-13 - V11Testing Types and Paradigms - 2015-07-13 - V11
Testing Types and Paradigms - 2015-07-13 - V11Donald Firesmith
 
Fundamentals of Testing Section 1/6
Fundamentals of Testing   Section 1/6Fundamentals of Testing   Section 1/6
Fundamentals of Testing Section 1/6
International Personal Finance Plc
 
powerpoint template for testing training
powerpoint template for testing trainingpowerpoint template for testing training
powerpoint template for testing trainingJohn Roddy
 
Unit 1 part 2
Unit 1 part 2Unit 1 part 2
Unit 1 part 2
Roselin Mary S
 
IT8076 - SOFTWARE TESTING
IT8076 - SOFTWARE TESTINGIT8076 - SOFTWARE TESTING
IT8076 - SOFTWARE TESTING
Sathya R
 
CTFL chapter 05
CTFL chapter 05CTFL chapter 05
CTFL chapter 05
Davis Thomas
 
Test Management
Test ManagementTest Management
Test Management
Suci Ayu Mawarni
 
chapter-no-4-test-management fudhg ddh j
chapter-no-4-test-management fudhg ddh jchapter-no-4-test-management fudhg ddh j
chapter-no-4-test-management fudhg ddh j
AmitDeshai
 
Fundamentals of testing
Fundamentals of testingFundamentals of testing
Fundamentals of testing
Muhammad Khairil
 
Schiable
SchiableSchiable
SchiableNASAPMC
 
Fundamentals_of_Software_testing.pptx
Fundamentals_of_Software_testing.pptxFundamentals_of_Software_testing.pptx
Fundamentals_of_Software_testing.pptx
MusaBashir9
 
Fundamentals of Testing (2013)
Fundamentals of Testing (2013)Fundamentals of Testing (2013)
Fundamentals of Testing (2013)
Jana Gierloff
 
Testing 3 test design techniques
Testing 3 test design techniquesTesting 3 test design techniques
Testing 3 test design techniques
Mini Marsiah
 
Istqb lesson1
Istqb lesson1Istqb lesson1
Istqb lesson1
Sunday Ayandele
 
Online testing strategy
Online testing strategyOnline testing strategy
Online testing strategy
Cloud9 Consulting
 
ISTQB - What's testing
ISTQB - What's testingISTQB - What's testing
ISTQB - What's testing
HoangThiHien1
 
01. foundamentals of testing
01. foundamentals of testing01. foundamentals of testing
01. foundamentals of testing
Tricia Karina
 

Similar to Common System and Software Testing Pitfalls (20)

Testing Types and Paradigms - 2015-07-13 - V11
Testing Types and Paradigms - 2015-07-13 - V11Testing Types and Paradigms - 2015-07-13 - V11
Testing Types and Paradigms - 2015-07-13 - V11
 
Fundamentals of Testing Section 1/6
Fundamentals of Testing   Section 1/6Fundamentals of Testing   Section 1/6
Fundamentals of Testing Section 1/6
 
powerpoint template for testing training
powerpoint template for testing trainingpowerpoint template for testing training
powerpoint template for testing training
 
Unit 1 part 2
Unit 1 part 2Unit 1 part 2
Unit 1 part 2
 
IT8076 - SOFTWARE TESTING
IT8076 - SOFTWARE TESTINGIT8076 - SOFTWARE TESTING
IT8076 - SOFTWARE TESTING
 
CTFL chapter 05
CTFL chapter 05CTFL chapter 05
CTFL chapter 05
 
Test Management
Test ManagementTest Management
Test Management
 
chapter-no-4-test-management fudhg ddh j
chapter-no-4-test-management fudhg ddh jchapter-no-4-test-management fudhg ddh j
chapter-no-4-test-management fudhg ddh j
 
Fundamentals of testing
Fundamentals of testingFundamentals of testing
Fundamentals of testing
 
Schiable
SchiableSchiable
Schiable
 
Check upload1
Check upload1Check upload1
Check upload1
 
Prvt file test
Prvt file testPrvt file test
Prvt file test
 
Fundamentals_of_Software_testing.pptx
Fundamentals_of_Software_testing.pptxFundamentals_of_Software_testing.pptx
Fundamentals_of_Software_testing.pptx
 
Fundamentals of Testing (2013)
Fundamentals of Testing (2013)Fundamentals of Testing (2013)
Fundamentals of Testing (2013)
 
Testing 3 test design techniques
Testing 3 test design techniquesTesting 3 test design techniques
Testing 3 test design techniques
 
Istqb lesson1
Istqb lesson1Istqb lesson1
Istqb lesson1
 
Software Testing 2/5
Software Testing 2/5Software Testing 2/5
Software Testing 2/5
 
Online testing strategy
Online testing strategyOnline testing strategy
Online testing strategy
 
ISTQB - What's testing
ISTQB - What's testingISTQB - What's testing
ISTQB - What's testing
 
01. foundamentals of testing
01. foundamentals of testing01. foundamentals of testing
01. foundamentals of testing
 

More from TechWell

Failing and Recovering
Failing and RecoveringFailing and Recovering
Failing and Recovering
TechWell
 
Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization
TechWell
 
Test Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTest Design for Fully Automated Build Architecture
Test Design for Fully Automated Build Architecture
TechWell
 
System-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartSystem-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good Start
TechWell
 
Build Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyBuild Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test Strategy
TechWell
 
Testing Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTesting Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for Success
TechWell
 
Implement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowImplement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlow
TechWell
 
Develop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityDevelop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your Sanity
TechWell
 
Ma 15
Ma 15Ma 15
Ma 15
TechWell
 
Eliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyEliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps Strategy
TechWell
 
Transform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTransform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOps
TechWell
 
The Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipThe Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—Leadership
TechWell
 
Resolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsResolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile Teams
TechWell
 
Pin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GamePin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile Game
TechWell
 
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsAgile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
TechWell
 
A Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationA Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps Implementation
TechWell
 
Databases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessDatabases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery Process
TechWell
 
Mobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateMobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to Automate
TechWell
 
Cultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessCultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for Success
TechWell
 
Turn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTurn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile Transformation
TechWell
 

More from TechWell (20)

Failing and Recovering
Failing and RecoveringFailing and Recovering
Failing and Recovering
 
Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization
 
Test Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTest Design for Fully Automated Build Architecture
Test Design for Fully Automated Build Architecture
 
System-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartSystem-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good Start
 
Build Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyBuild Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test Strategy
 
Testing Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTesting Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for Success
 
Implement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowImplement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlow
 
Develop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityDevelop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your Sanity
 
Ma 15
Ma 15Ma 15
Ma 15
 
Eliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyEliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps Strategy
 
Transform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTransform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOps
 
The Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipThe Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—Leadership
 
Resolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsResolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile Teams
 
Pin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GamePin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile Game
 
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsAgile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
 
A Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationA Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps Implementation
 
Databases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessDatabases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery Process
 
Mobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateMobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to Automate
 
Cultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessCultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for Success
 
Turn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTurn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile Transformation
 

Common System and Software Testing Pitfalls

  • 1. Author Program 4/27/2015 © 2014 Carnegie Mellon University 1 © 2014 Carnegie Mellon University Common System and Software Testing Pitfalls STAR East Conference Orlando, Florida Software Engineering Institute Carnegie Mellon University Pittsburgh, PA 15213 Donald G. Firesmith, Principle Engineer 6 May 2015 2 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Copyright 2015 Carnegie Mellon University This material is based upon work funded and supported by the Department of Defense under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Department of Defense. NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT. This material has been approved for public release and unlimited distribution except as restricted below. This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu. DM-0001886.
  • 2. Author Program 4/27/2015 © 2014 Carnegie Mellon University 2 3 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Topics Testing Pitfalls: • Software vs. System Testing • Taxonomy / Ontology • Testing Challenges • Addressing these Challenges • Goals and Potential Uses • Example Pitfall Taxonomy of Common Testing Pitfalls (lists of pitfalls by category): • General Pitfalls • Test-Type-Specific Pitfalls Remaining Limitations and Questions Future Work 4 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Testing Pitfalls Testing Pitfall • A human mistake that unnecessarily and unexpectedly causes testing to be: – Less effective at uncovering defects – Less efficient in terms of time and effort expended – More frustrating to perform • A bad decision, an incorrect mindset, a wrong action, or failure to act • A failure to adequately: – Meet a testing challenge – Address a testing problem • A way to screw up testing Common Testing Pitfall • Observed numerous times on different projects • Having sufficient frequency (and consequences ) to be a significant risk
  • 3. Author Program 4/27/2015 © 2014 Carnegie Mellon University 3 5 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Software vs. System vs. Model Testing - 1 These testing pitfalls occur when testing: • Software (applications, components, units) • Systems (subsystems, hardware, software, data, personnel, facilities, equipment, documentation, etc.) • Executable models (requirements, architecture, design) Most pitfalls apply to both software and system testing. 6 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Software vs. System vs. Model Testing - 2 The vast majority of software testers must address system testing issues: • Software executes on hardware, and how well it executes depends on: –That hardware –Other software running on the same hardware • Software communicates over: –“External” networks (Internet, NIPRNet, SIPRNet, WAN, LAN, MAN, etc.) –Data-center-internal networks connecting servers and data libraries (e.g., SAN) –Busses within systems (embedded software) • Software must meet quality requirements (thresholds of relevant quality characteristics and attributes) that are actually system-level, not software-level.
  • 4. Author Program 4/27/2015 © 2014 Carnegie Mellon University 4 7 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Taxonomy and Ontology Testing Pitfall Taxonomy A hierarchical classification of testing pitfalls into categories Testing Pitfall Ontology A hierarchy of concepts concerning testing pitfalls, using a shared vocabulary to denote the types, properties and interrelationships of these concepts 8 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Testing Pitfall Taxonomy and Ontology
  • 5. Author Program 4/27/2015 © 2014 Carnegie Mellon University 5 9 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Testing Challenges A great many different ways exist to screw up testing. Multiple testing pitfalls are observed on just about every project. Different projects often exhibit different testing pitfalls. In spite of many excellent how-to testing books, we see projects falling into these same testing pitfalls over and over again. 10 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Addressing these Challenges Testing Pitfalls Taxonomy and Ontology Anti-Pattern Language of how not to do testing Common System and Software Testing Pitfalls (Addison-Wesley, 2014) (Note: 35% conference discount): • 92 pitfalls classified into 14 categories • Technically reviewed by 47 international testing SMEs Current taxonomy/repository with new pitfalls and pitfall categories: • 167 pitfalls classified into 23 categories • http://sites.google.com/a/firesmith.net/donald-firesmith/home/common-testing-pitfalls [Work in progress - new content is draft and often incomplete] Self-Survey and Risk Prediction Tool Eventual 2nd Edition of Pitfalls Book or Supplement to 1st Edition
  • 6. Author Program 4/27/2015 © 2014 Carnegie Mellon University 6 11 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Goals and Potential Uses Goals: • To become the de facto industry-standard taxonomy of testing pitfalls • To reduce the incidence of testing pitfalls and thereby improve testing effectiveness and efficiency • To improve the quality of the objects under test (OUTs) Potential Uses: • Training materials for testers and testing stakeholders • Standard terminology regarding commonly occurring testing pitfalls • Checklists for use when: – Producing test strategies/plans and related documentation – Evaluating contractor proposals – Evaluating test strategies/plans and related documentation (quality control) – Evaluating as-performed test process (quality assurance) – Identifying test-related risks and their mitigation approaches • Test metrics collection, analysis, and reporting 12 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Potential Uses Training materials for testers and testing stakeholders Standard terminology regarding commonly occurring testing pitfalls Checklists for use when: • Producing test strategies/plans and related documentation • Evaluating contractor proposals • Evaluating test strategies/plans and related documentation (quality control) • Evaluating as-performed test process (quality assurance) • Identifying test-related risks and their mitigation approaches Categorization of pitfalls for test metrics collection, analysis, and reporting
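One concrete way to use the taxonomy for test metrics collection and reporting is to tally observed pitfall occurrences by category. The sketch below is illustrative only: the pitfall IDs follow the taxonomy, but the record fields, project names, and report format are assumptions, not part of the presentation.

```python
# Minimal sketch of pitfall-based test metrics collection (illustrative only).
from collections import Counter
from dataclasses import dataclass

@dataclass
class PitfallOccurrence:
    pitfall_id: str      # e.g., "GEN-PRO-6" from the taxonomy
    project: str
    severity: str        # e.g., "low", "medium", "high" (assumed scale)

def report_by_category(occurrences):
    """Tally occurrences by pitfall category (the middle token of the ID)."""
    return dict(Counter(o.pitfall_id.split("-")[1] for o in occurrences))

occurrences = [
    PitfallOccurrence("GEN-PRO-6", "Project A", "high"),
    PitfallOccurrence("GEN-TPS-6", "Project A", "medium"),
    PitfallOccurrence("GEN-PRO-3", "Project B", "high"),
]
print(report_by_category(occurrences))  # {'PRO': 2, 'TPS': 1}
```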
  • 7. Author Program 4/27/2015 © 2014 Carnegie Mellon University 7 13 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) Description The testing process is not adequately integrated into the overall system engineering process. Potential Applicability This pitfall is potentially applicable anytime that engineering and testing processes both exist. Characteristic Symptoms • There is little or no discussion of testing in the system engineering documentation: System Engineering Management Plan (SEMP), Software Development Plan (SDP), Work Breakdown Structure (WBS), Project Master Schedule (PMS), or System Development Cycle (SDC). • All or most of the testing is done as a completely independent activity performed by staff members who are not part of the project engineering team. • Testing is treated as a separate specialty engineering activity with only limited interfaces with the primary engineering activities. • Test scheduling is independent of the scheduling of other development activities. • Testers are not included in the requirements teams, architecture teams, or any cross-functional engineering teams. 14 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) Potential Negative Consequences • There is inadequate communication between testers and other system or software engineers (for example, requirements engineers, architects, designers, and implementers). • Few nontesters understand the scope, complexity, and importance of testing. • Testers do not understand the work being performed by other engineers. • Testers can produce test cases and automated testing scripts before the requirements, architecture, and design have stabilized, thereby forcing the testers to modify their test cases and test scripts as the system or software changes and incorrect hidden assumptions are uncovered. • Testing is less effective and takes longer than necessary.
  • 8. Author Program 4/27/2015 © 2014 Carnegie Mellon University 8 15 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) Potential Causes • Testers were not involved in determining and documenting the overall engineering process. • The people determining and documenting the overall engineering process did not have significant testing expertise, training, or experience. • The testing schedule has not been integrated into the overall project schedule. • Testing was outsourced. 16 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) Recommendations • Prepare: – Include testers in the initial staffing of the project. • Enable: – Provide a top-level briefing or training in testing to the chief system engineer, system architect, and process engineer. • Perform: – Subject-matter experts and project testers collaborate closely with the project chief engineer or technical lead and process engineer when they develop the engineering process descriptions and associated process documents. – Provide high-level overviews of testing in the SEMP(s) and SDP(s). – Document how testing is integrated into the system development or life cycle, regardless of whether it is traditional waterfall, evolutionary (iterative, incremental, and parallel), or anything in between. – For example, document handover points in the development cycle when testing input and output work products are delivered from one project organization or group to another. – Incorporate testing into the Project Master Schedule. – Incorporate testing into the project’s Work Breakdown Structure (WBS). • Verify: – Determine whether testers were involved in planning the project’s system or software development process. – Determine whether testing is incorporated into the project’s System engineering process, System development cycle, System Engineering Master Plan and System Development Plan, Work Breakdown Structure, Master Schedule
  • 9. Author Program 4/27/2015 © 2014 Carnegie Mellon University 9 17 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) Related Pitfalls • Testing at the End (GEN-TPS-6) If the testing and engineering processes are not properly integrated, then testing is more likely to be delayed to the end of development. • Independent Test Schedule (GEN-TPS-7) If the testing and engineering processes are not integrated, then the test schedule will be independent of and therefore probably incompatible with the overall project master schedule. • Testers Responsible for All Testing (GEN-STF-4) If the testing and engineering processes are not properly integrated, then the developers are more likely to believe that the testers are responsible for all testing. • Adversarial Relationship (GEN-STF-9) If the testing and engineering processes are not integrated, then the developers and the testers are more likely to develop an adversarial rather than cooperative relationship. • Testing as a Phase (GEN-PRO-18) If the testing and engineering processes are not properly integrated, then testing is more likely to be viewed as a phase that is separate from the rest of development. • Testers Not Involved Early (GEN-PRO-19) If the testing and engineering processes are not properly integrated, then testers are less likely to be involved early in the development process (such as during initial planning, requirements engineering, and architecture engineering). • Testing in Quality (GEN-PRO-23) If the testing and engineering processes are not properly integrated, then the developers will be more likely to believe that quality is the testers' responsibility and that quality can be tested into the system or software. • Developers Ignore Testability (GEN-PRO-24) If the testing and engineering processes are not properly integrated, then the developers are more likely to ignore testability. • Inadequate Communication Concerning Testing (GEN-COM-5) If the testing and engineering processes are not properly integrated, then it is more likely that there will be inadequate communication between the testers and the rest of the engineering staff. 18 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Categories of Testing Pitfalls – General 1 Test Planning and Scheduling 2 Stakeholder Involvement and Commitment 3 Management 4 Staffing 5 Testing Process 6 Test Design [new] 7 Pitfall-Related [new] 8 Test Tools and Environments 9 Automated Testing [new] 10 Test Communication 11 Testing-as-a-Service (TaaS) [new] 12 Requirements
  • 10. Author Program 4/27/2015 © 2014 Carnegie Mellon University 10 19 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Test Planning and Scheduling No Separate Test Planning Documentation (GEN-TPS-1) Incomplete Test Planning (GEN-TPS-2) Test Plans Ignored (GEN-TPS-3) Test-Case Documents as Test Plans (GEN-TPS-4) Inadequate Test Schedule (GEN-TPS-5) → Testing at the End (GEN-TPS-6) Independent Test Schedule (GEN-TPS-7) [new] 20 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Stakeholder Involvement & Commitment Wrong Testing Mindset (GEN-SIC-1) → Unrealistic Testing Expectations (GEN-SIC-2) Assuming Testing Only Verification Method Needed (GEN-SIC-3) Mistaking Demonstration for Testing (GEN-SIC-4) Lack of Stakeholder Commitment to Testing (GEN-SIC-5)
  • 11. Author Program 4/27/2015 © 2014 Carnegie Mellon University 11 21 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Management Inadequate Test Resources (GEN-MGMT-1) Inappropriate External Pressures (GEN-MGMT-2) Inadequate Test-Related Risk Management (GEN-MGMT-3) Inadequate Test Metrics (GEN-MGMT-4) → Inconvenient Test Results Ignored (GEN-MGMT-5) Test Lessons Learned Ignored (GEN-MGMT-6) 22 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Staffing - 1 Lack of Independence (GEN-STF-1) Unclear Testing Responsibilities (GEN-STF-2) Developers Responsible for All Testing (GEN-STF-3) Testers Responsible for All Testing (GEN-STF-4) Testers Responsible for Ensuring Quality (GEN-STF-5) [new] Testers Fix Defects (GEN-STF-6) [new] Users Responsible for Testing (GEN-STF-7) [new] → Inadequate Testing Expertise (GEN-STF-8) Inadequate Domain Expertise (GEN-STF-9) [new] Adversarial Relationship (GEN-STF-10) [new]
  • 12. Author Program 4/27/2015 © 2014 Carnegie Mellon University 12 23 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Staffing - 2 Too Few Testers (GEN-STF-11) [new] Allowing Developers to Close Discrepancy Reports (GEN-STF-12) [new] Testing Death March (GEN-STF-13) [new] All Testers Assumed Equal (GEN-STF-14) [new] 24 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Testing Process 1 No Planned Testing Process (GEN-PRO-1) [new pitfall] Essentially No Testing (GEN-PRO-2) [new pitfall] Inadequate Testing (GEN-PRO-3) Testing Process Ignored (GEN-PRO-4) [new pitfall] → One-Size-Fits-All Testing (GEN-PRO-5) Testing and Engineering Processes Not Integrated (GEN-PRO-6) Too Immature for Testing (GEN-PRO-7) Inadequate Evaluations of Test Assets (GEN-PRO-8) → Inadequate Maintenance of Test Assets (GEN-PRO-9)
  • 13. Author Program 4/27/2015 © 2014 Carnegie Mellon University 13 25 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Testing Process 2 Testing Assets Not Delivered (GEN-PRO-10) [combined 2 existing] Testing as a Phase (GEN-PRO-11) Testers Not Involved Early (GEN-PRO-12) Developmental Testing During Production (GEN-PRO-13) [new] No Operational Testing (GEN-PRO-14) → Testing in Quality (GEN-PRO-15) [new] Developers Ignore Testability (GEN-PRO-16) [moved from other category] Failure to Address the Testing BackBlob (GEN-PRO-17) [new] 26 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Testing Process 3 Failure to Analyze Why Defects Escaped Detection (GEN-PRO-18) [new] Official Test Standards are Ignored (GEN-PRO-19) [new] Official Test Standards are Slavishly Followed (GEN-PRO-20) [new] Developing New When Old Fails Tests (GEN-PRO-21) [new] Integrating New or Updates When Fails Tests (GEN-PRO-22) [new]
  • 14. Author Program 4/27/2015 © 2014 Carnegie Mellon University 14 27 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Test Design Sunny Day Testing Only (GEN-TD-1) [new] Inadequate Test Prioritization (GEN-TD-2) Test-Type Confusion (GEN-TD-3) Functionality Testing Overemphasized (GEN-TD-4) System Testing Overemphasized (GEN-TD-5) System Testing Underemphasized (GEN-TD-6) Test Preconditions Ignored (GEN-TD-7) [new] Inadequate Test Data (GEN-TD-8) Test Oracles Ignore Nondeterministic Behavior (GEN-TD-9) [new] 28 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Pitfall-Related [new] Overly Ambitious Process Improvement (GEN-PRP-1) [new] → Inadequate Pitfall Prioritization (GEN-PRP-2) [new]
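Two of the test-design pitfalls above lend themselves to a small illustration: Sunny Day Testing Only (GEN-TD-1) and Test Oracles Ignore Nondeterministic Behavior (GEN-TD-9). The sketch below is not from the presentation; the retry_fetch function and its behavior are invented purely to show the shape of a rainy-day test and of an oracle that checks bounds rather than one exact, nondeterministic value.

```python
# Illustrative sketch only: retry_fetch is a hypothetical, nondeterministic operation.
import random
import unittest

def retry_fetch(attempts):
    """Hypothetical operation whose exact result varies from run to run."""
    if attempts < 1:
        raise ValueError("attempts must be >= 1")
    return {"status": "ok", "attempts_used": random.randint(1, attempts)}

class TestRetryFetch(unittest.TestCase):
    def test_rainy_day_rejects_invalid_input(self):
        # Rainy-day case (GEN-TD-1): error handling deserves tests, not just the happy path.
        with self.assertRaises(ValueError):
            retry_fetch(0)

    def test_oracle_tolerates_nondeterminism(self):
        # Oracle checks properties and bounds (GEN-TD-9) instead of one exact value.
        result = retry_fetch(attempts=3)
        self.assertEqual(result["status"], "ok")
        self.assertTrue(1 <= result["attempts_used"] <= 3)

if __name__ == "__main__":
    unittest.main()
```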
  • 15. Author Program 4/27/2015 © 2014 Carnegie Mellon University 15 29 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Test Tools and Environments Over-Reliance on Testing Tools (GEN-TTE-1) Target Platform Difficult to Access (GEN-TTE-2) → Inadequate Test Environments (GEN-TTE-3) Poor Fidelity of Test Environments (GEN-TTE-4) Inadequate Test Environment Quality (GEN-TTE-5) Test Environments Inadequately Tested (GEN-TTE-6) [new] Inadequate Test Configuration Management (GEN-TTE-7) Inadequate Testing in a Staging Environment (GEN-TTE-8) 30 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Automated Testing - 1 [new] Insufficient Automated Testing (GEN-AUTO-1) [new] Automated Testing Replaces Manual Testing (GEN-AUTO-2) [new] Excessive Automated Testing (GEN-AUTO-3) [new] Inappropriate Distribution of Automated Tests (GEN-AUTO-4) [new] Inadequate Automated Test Quality (GEN-AUTO-5) [new] Excessively Complex Automated Tests (GEN-AUTO-6) [new] → Automated Tests Not Maintained (GEN-AUTO-7) [new] Insufficient Resources Invested (GEN-AUTO-8) [new] Inappropriate Automation Tools (GEN-AUTO-9) [new]
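As a small illustration of Excessively Complex Automated Tests (GEN-AUTO-6), the sketch below contrasts focused automated tests with the pitfall shape. The shopping-cart class is invented for illustration and is not part of the presentation.

```python
# Sketch contrasting focused automated tests with GEN-AUTO-6 (illustrative only).
import unittest

class Cart:
    def __init__(self):
        self.items = []
    def add(self, name, price):
        self.items.append((name, price))
    def total(self):
        return sum(price for _, price in self.items)

class FocusedCartTests(unittest.TestCase):
    # Preferred: each test checks one behavior, so a failure points at one cause.
    def test_total_of_empty_cart_is_zero(self):
        self.assertEqual(Cart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = Cart()
        cart.add("book", 10.0)
        cart.add("pen", 2.5)
        self.assertEqual(cart.total(), 12.5)

# The pitfall shape (not shown in full) is one sprawling test that exercises many
# unrelated features and asserts on everything at once; when it fails, it is hard
# to tell which behavior broke, and it is expensive to maintain as the code changes.

if __name__ == "__main__":
    unittest.main()
```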
  • 16. Author Program 4/27/2015 © 2014 Carnegie Mellon University 16 31 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Automated Testing - 2 [new] Unclear Responsibilities for Automated Testing (GEN-AUTO-10) [new] Postponing Automated Testing Until Stable (GEN-AUTO-11) [new] Automated Testing as Silver Bullet (GEN-AUTO-12) [new] 32 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Test Communication Inadequate Architecture or Design Documentation (GEN-COM-1) → Inadequate Discrepancy Reports (GEN-COM-2) Inadequate Test Documentation (GEN-COM-3) Source Documents Not Maintained (GEN-COM-4) Inadequate Communication Concerning Testing (GEN-COM-5) Inconsistent Testing Terminology (GEN-COM-6) [new] Redundant Test Documents (GEN-COM-7) [new]
  • 17. Author Program 4/27/2015 © 2014 Carnegie Mellon University 17 33 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Testing-as-a-Service (TaaS) [new] → Cost-Driven Provider Selection (GEN-TaaS-1) [new] Inadequate Oversight (GEN-TaaS-2) [new] Lack of Outsourcing Expertise (GEN-TaaS-3) [new] Inappropriate TaaS Contract (GEN-TaaS-4) [new] TaaS Provider Improperly Chosen (GEN-TaaS-5) [new] 34 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Requirements - 1 → Tests as Requirements (GEN-REQ-1) [new] Ambiguous Requirements (GEN-REQ-2) Obsolete Requirements (GEN-REQ-3) Missing Requirements (GEN-REQ-4) Incomplete Requirements (GEN-REQ-5) Incorrect Requirements (GEN-REQ-6) Requirements Churn (GEN-REQ-7) Improperly Derived Requirements (GEN-REQ-8) Verification Methods Not Adequately Specified (GEN-REQ-9) Lack of Requirements Trace (GEN-REQ-10)
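Lack of Requirements Trace (GEN-REQ-10) is one of the more mechanically checkable pitfalls in the list above. A minimal sketch of such a check follows; the requirement and test-case IDs are invented for illustration.

```python
# Minimal sketch of a requirements-to-test trace check (illustrating GEN-REQ-10).
requirements = {"REQ-001", "REQ-002", "REQ-003"}

trace = {                     # test case -> requirements it verifies (assumed mapping)
    "TC-101": {"REQ-001"},
    "TC-102": {"REQ-001", "REQ-003"},
}

covered = set().union(*trace.values()) if trace else set()
untraced = sorted(requirements - covered)

print("Requirements with no tracing test case:", untraced)  # ['REQ-002']
```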
  • 18. Author Program 4/27/2015 © 2014 Carnegie Mellon University 18 35 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Requirements - 2 Titanic Effect of Deferred Requirements (GEN-REQ-11) [new] Implicit Requirements Ignored (GEN-REQ-12) [new] 36 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Categories of Testing Pitfalls – Test-Type-Specific Pitfalls 1 Executable Model Testing [new] 2 Unit Testing 3 Integration Testing 4 Specialty Engineering Testing 5 System Testing 6 User Testing [new] 7 A/B Testing [new] 8 Acceptance Testing [new] 9 Operational Testing (OT) [new] 10 System of Systems (SoS) Testing 11 Regression Testing
  • 19. Author Program 4/27/2015 © 2014 Carnegie Mellon University 19 37 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – Executable Model Testing [new] Inadequate Executable Models (TTS-MOD-1) [new] → Executable Models Not Tested (TTS-MOD-2) [new] 38 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – Unit Testing Testing Does Not Drive Design and Implementation (TTS-UNT-1) → Conflict of Interest (TTS-UNT-2) Untestable Units (TTS-UNT-3) [new] Brittle Test Cases (TTS-UNT-4) [new] No Unit Testing (TTS-UNT-5) [new] Unit Testing of Automatically Generated Units (TTS-UNT-6) [new]
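Brittle Test Cases (TTS-UNT-4) can be shown with a tiny example. The greet function below is invented for illustration; both tests pass today, but the first breaks on any harmless copy edit because it couples the test to incidental detail rather than to the required behavior.

```python
# Sketch of TTS-UNT-4 ("Brittle Test Cases"); runnable under pytest (illustrative only).
def greet(name):
    return f"Hello, {name}! Welcome back."

# Brittle: asserts the entire message text, so any wording change breaks the test.
def test_greet_brittle():
    assert greet("Ada") == "Hello, Ada! Welcome back."

# More robust: checks the property the requirement actually cares about.
def test_greet_mentions_user():
    assert "Ada" in greet("Ada")
```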
  • 20. Author Program 4/27/2015 © 2014 Carnegie Mellon University 20 39 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – Integration Testing Integration Decreases Testability Ignored (TTS-INT-1) Inadequate Self-Testing (TTS-INT-2) Unavailable Components (TTS-INT-3) → System Testing as Integration Testing (TTS-INT-4) 40 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – Specialty Engineering Testing - 1 Inadequate Capacity Testing (TTS-SPC-1) Inadequate Concurrency Testing (TTS-SPC-2) Inadequate Interface Standards Compliance Testing (TTS-SPC-3) [new] Inadequate Internationalization Testing (TTS-SPC-4) Inadequate Interoperability Testing (TTS-SPC-5) Inadequate Performance Testing (TTS-SPC-6) → Inadequate Portability Testing (TTS-SPC-7) [new] Inadequate Reliability Testing (TTS-SPC-8) Inadequate Robustness Testing (TTS-SPC-9) Inadequate Safety Testing (TTS-SPC-10)
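Inadequate Performance Testing (TTS-SPC-6) often means no automated check of response time at all. The sketch below shows the minimal shape of such a check; the operation under test, the repetition count, and the 50 ms budget are assumptions for illustration, not figures from the presentation.

```python
# Minimal sketch of an automated latency check (illustrating what TTS-SPC-6 says is often missing).
import time

def operation_under_test():
    return sum(i * i for i in range(10_000))  # stand-in for real work

def measure_latency_ms(fn, repetitions=100):
    """Average wall-clock latency of fn over several repetitions, in milliseconds."""
    start = time.perf_counter()
    for _ in range(repetitions):
        fn()
    return (time.perf_counter() - start) * 1000 / repetitions

latency = measure_latency_ms(operation_under_test)
assert latency < 50.0, f"average latency {latency:.2f} ms exceeds the 50 ms budget"
print(f"average latency: {latency:.2f} ms")
```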
  • 21. Author Program 4/27/2015 © 2014 Carnegie Mellon University 21 41 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – Specialty Engineering Testing - 2 Inadequate Security Testing (TTS-SPC-11) Inadequate Usability Testing (TTS-SPC-12) 42 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – System Testing Test Hooks Remain (TTS-SYS-1) Lack of Test Hooks (TTS-SYS-2) → Inadequate End-to-End Testing (TTS-SYS-3)
  • 22. Author Program 4/27/2015 © 2014 Carnegie Mellon University 22 43 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – User Testing [new] Inadequate User Involvement (TTS-UT-1) [new] Unprepared User Representatives (TTS-UT-2) [new] User Testing Merely Repeats System Testing (TTS-UT-3) [new] User Testing is Mistaken for Acceptance Testing (TTS-UT-4) [new] → Assume Knowledgeable and Careful User (TTS-UT-5) [new] User Testing Too Late to Fix Defects (TTS-UT-6) [new] 44 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – A/B Testing [new] Poor Key Performance Indicators (TTS-ABT-1) [new] Misuse of Probability and Statistics (TTS-ABT-2) [new] → Confusing Statistical Significance with Business Significance (TTS-ABT-3) [new] Source(s) of Error Not Controlled (TTS-ABT-4) [new] System Variant(s) Changed During Test (TTS-ABT-5) [new]
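Confusing Statistical Significance with Business Significance (TTS-ABT-3) is easy to demonstrate with numbers. The sketch below uses invented conversion counts: with a million users per variant, a 0.1 percentage-point lift is highly statistically significant, yet whether it matters commercially is a separate business judgment.

```python
# Sketch of a two-proportion z-test for an A/B experiment (illustrating TTS-ABT-3;
# the sample sizes and conversion counts are invented).
from math import sqrt, erf

def two_proportion_z(x_a, n_a, x_b, n_b):
    p_a, p_b = x_a / n_a, x_b / n_b
    p_pool = (x_a + x_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return p_a, p_b, z, p_value

p_a, p_b, z, p = two_proportion_z(x_a=50_000, n_a=1_000_000,
                                  x_b=51_000, n_b=1_000_000)
print(f"lift = {p_b - p_a:.4%}, z = {z:.2f}, p = {p:.4f}")
# A p-value below 0.05 only says the difference is unlikely to be chance;
# it does not say a 0.1 percentage-point lift is worth acting on.
```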
  • 23. Author Program 4/27/2015 © 2014 Carnegie Mellon University 23 45 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – Acceptance Testing [new] → No Clear System Acceptance Criteria (TTS-AT-1) [new] Acceptance Testing Only Tests Functionality (TTS-AT-2) [new] Developers Determine Acceptance Tests (TTS-AT-3) [new] 46 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – Operational Testing (OT) [new] → No On-Site Software Developers (TTS-OT-1) [new] No Operational Testing [new]
  • 24. Author Program 4/27/2015 © 2014 Carnegie Mellon University 24 47 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Test Type Specific Pitfalls – System of Systems (SoS) Testing Inadequate SoS Test Planning (TTS-SoS-1) Unclear SoS Testing Responsibilities (TTS-SoS-2) Inadequate Resources for SoS Testing (TTS-SoS-3) SoS Testing not Coordinated Across Projects (TTS-SoS-4) Inadequate SoS Requirements (TTS-SoS-5) → Inadequate Support from Individual System Projects (TTS-SoS-6) Inadequate Defect Tracking Across Projects (TTS-SoS-7) Finger-Pointing (TTS-SoS-8) 48 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 General Pitfalls – Regression Testing Inadequate Regression Test Automation (GEN-REG-1) Regression Testing Not Performed (GEN-REG-2) Inadequate Scope of Regression Testing (GEN-REG-3) Only Low-Level Regression Tests (GEN-REG-4) Only Functional Regression Testing (GEN-REG-5) Inadequate Retesting of Reused Software (TTS-REG-6) [new]
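Inadequate Scope of Regression Testing (GEN-REG-3) often amounts to rerunning a fixed happy-path subset regardless of what changed. One common mitigation is to derive the regression scope from the change set, sketched below; the module names and the module-to-test map are invented for illustration.

```python
# Sketch of change-driven regression test selection (one way to avoid GEN-REG-3).
changed_modules = {"billing", "auth"}

test_map = {                      # module -> regression tests that exercise it (assumed)
    "auth":    ["test_login", "test_password_reset"],
    "billing": ["test_invoice_total", "test_refund"],
    "search":  ["test_query_ranking"],
}

selected = sorted(t for module in changed_modules for t in test_map.get(module, []))
print("Regression tests to run:", selected)
```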
  • 25. Author Program 4/27/2015 © 2014 Carnegie Mellon University 25 49 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Remaining Limitation Current Taxonomy is Experience Based: • Based on experience testing and assessing testing programs (author, SEI ITAs, technical reviewers) • Not the result of documentation study or formal academic research 50 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Remaining Questions Which pitfalls occur most often? With what frequency? Which pitfalls cause the most harm? Which pitfalls have the highest risk (expected harm = harm frequency x harm)? What factors (e.g., system/software size and complexity, application domain, process) influence frequency, harm, and risk?
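The risk question above already names its arithmetic: expected harm is frequency times harm. A minimal worked sketch, using invented numeric scales (a 0 to 1 frequency and a 1 to 5 harm rating), shows why a common, moderate pitfall can outrank a rare, severe one:

```latex
\mathrm{risk}(p) = \mathrm{frequency}(p) \times \mathrm{harm}(p),
\qquad 0.4 \times 3 = 1.2 \;>\; 0.1 \times 5 = 0.5
```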
  • 26. Author Program 4/27/2015 © 2014 Carnegie Mellon University 26 51 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Future Work - 1 Second Edition of Book Extensive Technical Review: • New testing pitfall categories • New and modified testing pitfalls 52 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Future Work – Proper Industry Survey How likely are the different testing pitfalls? What are the 10 most common? What pitfalls have the worst consequences? What are the 10 worst pitfalls? What pitfalls have the highest risk? What are the 10 highest risk pitfalls? Do the answers to these questions vary by: • System (size, complexity, criticality, application domain, software only vs. HW/SW/people/documentation/facilities/procedures, system vs. SoS vs. PL)? • Project (type, formality, lifecycle scope, schedule, funding, commercial vs. government/military, etc.) • Organization (number, size, type, governance, management/engineering culture, etc.)
  • 27. Author Program 4/27/2015 © 2014 Carnegie Mellon University 27 Save 35%* at informit.com Discount code: FIRESMITH550 • informit.com - search on Firesmith • Available as book & eBook • FREE shipping in the U.S. *Offer expires Dec 31, 2015 http://sites.google.com/a/firesmith.net/donald-firesmith/home/common-testing-pitfalls 54 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Contact Information Donald G. Firesmith Principal Engineer Software Solutions Division Telephone: +1 412-268-6874 Email: dgf@sei.cmu.edu U.S. Mail Software Engineering Institute Customer Relations 4500 Fifth Avenue Pittsburgh, PA 15213-2612 USA Web www.sei.cmu.edu www.sei.cmu.edu/contact.cfm Customer Relations Email: info@sei.cmu.edu Telephone: +1 412-268-5800 SEI Phone: +1 412-268-5800 SEI Fax: +1 412-268-6257
  • 28. Author Program 4/27/2015 © 2014 Carnegie Mellon University 28 55 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Backup Slides 56 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 What is Testing? Testing The execution of an Object Under Test (OUT) under specific preconditions (for example, pretest mode, states, stored data, or external conditions) with specific inputs so that its actual behavior (outputs and postconditions) can be compared with its expected or required behavior Notes: • The OUT can be: – An executable requirements, architecture, or design model – A system or subsystem (including hardware, data, personnel, etc.) – A software application, component, or unit • Not just software • Requires execution (do not confuse testing with QE or with other means of verification such as analysis, certification, demonstration, inspection, etc.) • Requires controllability to set up preconditions and provide inputs • Requires observability to determine outputs and postconditions
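The definition above maps directly onto the anatomy of a single test: establish a precondition (controllability), apply a specific input, and compare actual behavior with expected behavior (observability). A minimal sketch follows; the thermostat object and its API are invented for illustration.

```python
# Minimal sketch mapping the definition of testing onto a concrete test (illustrative only).
class Thermostat:
    def __init__(self, mode="off"):
        self.mode = mode            # part of the pretest state
    def report(self, reading_c):
        if self.mode == "off":
            return "inactive"
        return "heat_on" if reading_c < 20.0 else "idle"

# Precondition (controllability): put the OUT into a known pretest state.
out = Thermostat(mode="auto")

# Input: a specific stimulus.
actual = out.report(reading_c=18.5)

# Oracle (observability): compare actual behavior with expected behavior.
expected = "heat_on"
assert actual == expected, f"expected {expected!r}, got {actual!r}"
print("behavior matched expectation:", actual)
```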
  • 29. Author Program 4/27/2015 © 2014 Carnegie Mellon University 29 57 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Why Test? In roughly decreasing order of importance, the goals of testing are to: • Uncover Significant Defects in the object under test (OUT) by causing it to behave incorrectly (e.g., to fail or enter a faulty state) so that these underlying defects can be identified and fixed and the OUT can thereby be improved. • Provide Evidence that can be used to determine the OUT’s: – Quality – Fitness for purpose – Readiness for shipping, deployment, or being placed into operation • Support Process Improvement by helping to identify: – Development processes that introduce defects – Testing processes that fail to uncover defects • Prevent Defects by: – Testing executable requirements, architecture, and design models so that defects in the models are fixed before they can result in defects in the system/software. – Using Test Driven Development (TDD) to develop test cases and then using these test cases to drive development (design and implementation) 58 Common System/SW Testing Pitfalls Donald G. Firesmith, 6 May 2015 Limitations of Testing Cannot be exhaustive (or even close) Cannot uncover all defects: • Different levels of testing have different defect removal efficiencies (DREs) • Different types of testing uncover different types of defects May provide false positive and false negative results Cannot prove the OUT works properly under all inputs and conditions
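For reference, defect removal efficiency (DRE) is conventionally computed as the fraction of the defects present that a given testing level actually removes; the worked numbers below are an invented illustration, not data from the presentation:

```latex
\mathrm{DRE} =
\frac{\text{defects found and removed at this testing level}}
     {\text{defects found at this level} + \text{defects that escape to later levels}},
\qquad \frac{40}{40 + 10} = 0.8
```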