© 2014 Carnegie Mellon University 
Common System and Software Testing Pitfalls
ER-2014 Conference, Atlanta, Georgia
Software Engineering Institute 
Carnegie Mellon University 
Pittsburgh, PA 15213 
Donald G. Firesmith, Principal Engineer
27 October 2014
NO WARRANTY 
THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT. 
Use of any trademarks in this presentation is not intended in any way to infringe on the rights of the trademark holder. 
This Presentation may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu. 
This work was created in the performance of Federal Government Contract Number FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 252.227-7013.
Topics 
• Extremely brief introduction to Common System and Software Testing Pitfalls
• Current status of the Testing Pitfalls Repository since the book was published by Addison-Wesley in January 2014
• Future Work
Terminology 
Testing Pitfall
• A human mistake that unnecessarily and unexpectedly causes testing to be:
  – Less effective at uncovering defects
  – Less efficient in terms of time and effort expended
  – More frustrating to perform
• A bad decision, an incorrect mindset, a wrong action, or a failure to act
• A way to screw up testing

Common Testing Pitfall
• Observed numerous times on different projects
• Having sufficient frequency (and consequences) to be a significant risk

Testing Pitfalls Taxonomy and Ontology
• An anti-pattern language of how not to do testing
Past and Present 
Common System and Software Testing Pitfalls (Addison-Wesley, 2014) (Note: 35% conference discount):
• 92 pitfalls classified into 14 categories
• Technically reviewed by 47 international testing SMEs

Current taxonomy/repository with new pitfalls and pitfall categories:
• 139 pitfalls classified into 20 categories
• http://sites.google.com/a/firesmith.net/donald-firesmith/home/common-testing-pitfalls [Work in progress: new content is draft and may be incomplete]
Goals and Potential Uses 
Goals:
• To become the de facto industry-standard taxonomy of testing pitfalls
• To reduce the incidence of testing pitfalls and thereby improve testing effectiveness and efficiency
• To improve the quality of the objects under test (OUTs)

Potential Uses:
• Training materials for testers and testing stakeholders
• Standard terminology regarding commonly occurring testing pitfalls
• Checklists for use when:
  – Producing test strategies/plans and related documentation
  – Evaluating contractor proposals
  – Evaluating test strategies/plans and related documentation (quality control)
  – Evaluating the as-performed test process (quality assurance)
  – Identifying test-related risks and their mitigation approaches
• Categorization of pitfalls for test metrics collection, analysis, and reporting (a minimal sketch follows below)
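To illustrate that last use, here is a minimal sketch of tallying pitfall findings by category for metrics reporting. It assumes only the GEN-&lt;category&gt;-&lt;number&gt; ID scheme visible in this deck; the findings list and function are invented for illustration, not taken from the book.

```python
from collections import Counter

def category_of(pitfall_id: str) -> str:
    """Extract the category prefix from a pitfall ID such as 'GEN-PRO-7'.

    IDs in the taxonomy follow a <scope>-<category>-<number> scheme.
    """
    scope, category, _number = pitfall_id.split("-")
    return f"{scope}-{category}"

# Hypothetical findings from project test-process assessments.
findings = ["GEN-PRO-7", "GEN-TPS-6", "GEN-PRO-18", "GEN-STF-4", "GEN-PRO-23"]

# Count occurrences per category for metrics analysis and reporting.
counts = Counter(category_of(pid) for pid in findings)
for category, count in counts.most_common():
    print(category, count)  # e.g., GEN-PRO 3, GEN-TPS 1, GEN-STF 1
```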
Testing Pitfall Taxonomy and Ontology
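[The body of this slide is a figure that does not survive text extraction.] As a rough sketch of the record structure the ontology implies, assuming only the fields that appear in the GEN-PRO-7 example on the following slides, each pitfall entry might be modeled like this (the class and field names are illustrative, not the deck's notation):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestingPitfall:
    """One pitfall record, mirroring the fields shown in the GEN-PRO-7 example."""
    pitfall_id: str                      # e.g., "GEN-PRO-7"
    name: str                            # e.g., "Testing and Engineering Processes Not Integrated"
    description: str
    potential_applicability: str
    characteristic_symptoms: List[str] = field(default_factory=list)
    potential_negative_consequences: List[str] = field(default_factory=list)
    potential_causes: List[str] = field(default_factory=list)
    recommendations: List[str] = field(default_factory=list)   # Prepare/Enable/Perform/Verify steps
    related_pitfalls: List[str] = field(default_factory=list)  # IDs of related pitfalls
```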
Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) 
Description
The testing process is not adequately integrated into the overall system engineering process.

Potential Applicability
This pitfall is potentially applicable anytime that engineering and testing processes both exist.

Characteristic Symptoms
• There is little or no discussion of testing in the system engineering documentation: System Engineering Management Plan (SEMP), Software Development Plan (SDP), Work Breakdown Structure (WBS), Project Master Schedule (PMS), or System Development Cycle (SDC).
• All or most of the testing is done as a completely independent activity performed by staff members who are not part of the project engineering team.
• Testing is treated as a separate specialty engineering activity with only limited interfaces with the primary engineering activities.
• Test scheduling is independent of the scheduling of other development activities.
• Testers are not included in the requirements teams, architecture teams, or any cross-functional engineering teams.
Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) 
Potential Negative Consequences
• There is inadequate communication between testers and other system or software engineers (for example, requirements engineers, architects, designers, and implementers).
• Few nontesters understand the scope, complexity, and importance of testing.
• Testers do not understand the work being performed by other engineers.
• Testers can produce test cases and automated testing scripts before the requirements, architecture, and design have stabilized, thereby forcing the testers to modify their test cases and test scripts as the system or software changes and incorrect hidden assumptions are uncovered.
• Testing is less effective and takes longer than necessary.
Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) 
Potential Causes
• Testers were not involved in determining and documenting the overall engineering process.
• The people determining and documenting the overall engineering process did not have significant testing expertise, training, or experience.
• The testing schedule has not been integrated into the overall project schedule.
• Testing was outsourced.
Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) 
Recommendations
• Prepare:
  – Include testers in the initial staffing of the project.
• Enable:
  – Provide a top-level briefing or training in testing to the chief system engineer, system architect, and process engineer.
• Perform:
  – Have subject-matter experts and project testers collaborate closely with the project chief engineer or technical lead and the process engineer when they develop the engineering process descriptions and associated process documents.
  – Provide high-level overviews of testing in the SEMP(s) and SDP(s).
  – Document how testing is integrated into the system development or life cycle, regardless of whether it is traditional waterfall, evolutionary (iterative, incremental, and parallel), or anything in between. For example, document handover points in the development cycle when testing input and output work products are delivered from one project organization or group to another.
  – Incorporate testing into the Project Master Schedule.
  – Incorporate testing into the project's Work Breakdown Structure (WBS).
• Verify:
  – Determine whether testers were involved in planning the project's system or software development process.
  – Determine whether testing is incorporated into the project's system engineering process, system development cycle, System Engineering Management Plan (SEMP), Software Development Plan (SDP), Work Breakdown Structure, and Project Master Schedule.
Example – Testing and Engineering Processes Not Integrated (GEN-PRO-7) 
Related Pitfalls
• Testing at the End (GEN-TPS-6): If the testing and engineering processes are not properly integrated, then testing is more likely to be delayed to the end of development.
• Independent Test Schedule (GEN-TPS-7): If the testing and engineering processes are not integrated, then the test schedule will be independent of, and therefore probably incompatible with, the overall project master schedule.
• Testers Responsible for All Testing (GEN-STF-4): If the testing and engineering processes are not properly integrated, then the developers are more likely to believe that the testers are responsible for all testing.
• Adversarial Relationship (GEN-STF-9): If the testing and engineering processes are not integrated, then the developers and the testers are more likely to develop an adversarial rather than cooperative relationship.
• Testing as a Phase (GEN-PRO-18): If the testing and engineering processes are not properly integrated, then testing is more likely to be viewed as a phase that is separate from the rest of development.
• Testers Not Involved Early (GEN-PRO-19): If the testing and engineering processes are not properly integrated, then testers are less likely to be involved early in the development process (such as during initial planning, requirements engineering, and architecture engineering).
• Testing in Quality (GEN-PRO-23): If the testing and engineering processes are not properly integrated, then the developers are more likely to believe that quality is the testers' responsibility and that quality can be tested into the system or software.
• Developers Ignore Testability (GEN-PRO-24): If the testing and engineering processes are not properly integrated, then the developers are more likely to ignore testability.
• Inadequate Communication Concerning Testing (GEN-COM-5): If the testing and engineering processes are not properly integrated, then it is more likely that there will be inadequate communication between the testers and the rest of the engineering staff.
Categories of Testing Pitfalls – General 
1. Test Planning and Scheduling
2. Stakeholder Involvement and Commitment
3. Management
4. Staffing
5. Testing Process
6. Pitfall-Related [new pitfall category since book]
7. Test Tools and Environments
8. Automated Testing [new pitfall category since book]
9. Test Communication
10. Testing-as-a-Service (TaaS) [new pitfall category since book]
11. Requirements
Categories of Testing Pitfalls – Test-Type-Specific Pitfalls 
1. Executable Model Testing [new pitfall category]
2. Unit Testing
3. Integration Testing
4. Specialty Engineering Testing
5. System Testing
6. User Testing [new pitfall category]
7. Acceptance Testing [new pitfall category]
8. System of Systems (SoS) Testing
9. Regression Testing
Remaining Limitation and Questions 
Current Taxonomy is Experience-Based:
• Based on experience testing and assessing testing programs (author, SEI ITAs, technical reviewers)
• Not the result of documentation study or formal academic research

Remaining Questions:
• Which pitfalls occur most often? With what frequency?
• Which pitfalls cause the most harm?
• Which pitfalls have the highest risk (expected harm = harm frequency × harm)? (A small worked example follows this list.)
• What factors (e.g., system/software size and complexity, application domain, process) influence frequency, harm, and risk?
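As a minimal worked example of that risk definition, the sketch below ranks a few pitfalls by expected harm (frequency × harm). The frequency and harm numbers are invented placeholders; real values would have to come from the industry survey proposed under Future Work.

```python
# Hypothetical (pitfall_id, frequency per project, harm on a 1-10 scale) triples;
# the numbers are invented placeholders, not survey results.
pitfalls = [
    ("GEN-PRO-7", 0.60, 7.0),   # Testing and Engineering Processes Not Integrated
    ("GEN-TPS-6", 0.45, 8.0),   # Testing at the End
    ("GEN-STF-9", 0.30, 5.0),   # Adversarial Relationship
]

# Risk as expected harm: frequency x harm, highest risk first.
ranked = sorted(pitfalls, key=lambda p: p[1] * p[2], reverse=True)
for pitfall_id, frequency, harm in ranked:
    print(f"{pitfall_id}: risk = {frequency * harm:.2f}")
```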
Future Work 
Second Edition of Book or Supplement to First Book

Extensive Technical Review:
• New testing pitfall categories
• New and modified testing pitfalls

Proper Industry Survey:
• How likely are the different testing pitfalls? What are the 10 most common?
• What pitfalls have the worst consequences? What are the 10 worst pitfalls?
• What pitfalls have the highest risk? What are the 10 highest-risk pitfalls?
• Do the answers to these questions vary by:
  – System (size, complexity, criticality, application domain, software only vs. HW/SW/people/documentation/facilities/procedures…, system vs. SoS vs. PL)?
  – Project (type, formality, lifecycle scope, schedule, funding, commercial vs. government/military, …)?
  – Organization (number, size, type, governance, management/engineering culture, …)?

Wiki
Save 35%* at informit.com
Discount code: FIRESMITH550
• informit.com - search on Firesmith
• Available as book & eBook
• FREE shipping in the U.S.
*Offer expires Dec 31, 2014
http://sites.google.com/a/firesmith.net/donald-firesmith/home/common-testing-pitfalls
Contact Information
Donald G. Firesmith 
Principal Engineer 
Software Solutions Division 
Telephone: +1 412-268-6874 
Email: dgf@sei.cmu.edu 
U.S. Mail 
Software Engineering Institute 
Customer Relations 
4500 Fifth Avenue 
Pittsburgh, PA 15213-2612 
USA 
Web 
www.sei.cmu.edu 
www.sei.cmu.edu/contact.cfm 
Customer Relations 
Email: info@sei.cmu.edu 
Telephone: +1 412-268-5800 
SEI Phone: +1 412-268-5800 
SEI Fax: +1 412-268-6257
