Testing enterprise software requires effective resource management to prevent costly delays, budget overruns, and failed projects. In many software projects, more than 50 percent of development costs are attributed to software testing activities. With testing accounting for such a large portion of development efforts, it is critical for software engineering teams to avoid and eliminate wasteful testing tasks. Tariq King applies lean and agile principles to test management as a way of reducing waste in the testing process. Join Tariq as he describes an integrated approach for identifying and eliminating waste in test planning, automation, and execution. He demonstrates how to combine tools and techniques for test case management, exploratory testing, test automation, continuous integration, and impact analysis to keep testing activities lean and lightweight. Learn how to avoid common sources of waste such as over-documentation, redundant tests, brittle automation, and testing unimportant or unaffected parts of the system.
Lean Test Management: Reduce Waste in Planning, Automation, and Execution
QA Organization Meeting
May 15, 2015
LEAN TEST MANAGEMENT: REDUCE WASTE IN PLANNING, AUTOMATION, AND EXECUTION
OCTOBER 1, 2015, ANAHEIM, CALIFORNIA, DISNEYLAND HOTEL
TARIQ KING
Lean Test Management: Reduce Waste in Planning, Automation and Execution
Tariq King
INCREASING PRODUCTIVITY WITHOUT REDUCING QUALITY
Consider: Improve Quality → Lower Costs → Less Rework, Fewer Mistakes → Productivity Rises
However: Force Productivity Up (Move Fast, Break Things) → Higher Costs, Quality Suffers
CHALLENGE: Measuring Productivity and Measuring Quality
MOVING FASTER WITHOUT COMPROMISING QUALITY
DESIGN FOR TESTABILITY
LIGHTWEIGHT PLANNING
TEST IMPACT ANALYSIS
RISK-BASED TESTING
WASTE!
Brittle Automation
Over-Testing
Test Redundancy
Untestability
Over-Documentation
DESIGN FOR TESTABILITY
COVERED CLOCKS
"You are not allowed to come near them, nor poke them, nor probe them."
How do covered clocks affect testing?
DESIGN FOR TESTABILITY
Testability refers to the degree to which a system
facilitates the establishment of test criteria and
the performance of tests to determine whether
those criteria have been met.
Design for Testability (DFT) is an approach in which
testability is engineered into the product at the
design stage to support validation and verification.
DFT should be performed at all levels of the design,
and may involve modifying existing designs.
TESTABILITY ATTRIBUTES
Controllability: Must be able to set up preconditions and apply input
Observability: Ability to recognize and interpret the results
Availability: To test it, we have to be able to get at it
Simplicity: The less complicated it is, the easier it is to test
Stability: The fewer the disruptions, the faster the testing
Operability: The better it works, the more efficient the testing
Information: The more information we have about the system, the smarter we can test
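Controllability and observability are easiest to see in code. The sketch below is illustrative Python, not from the talk, and `SessionToken` is a made-up class: injecting the clock as a dependency lets a test set up any time-based precondition and observe the outcome directly, instead of waiting on the real system clock.

```python
from datetime import datetime, timedelta

class SessionToken:
    """A token that expires after a fixed lifetime. The clock is injected,
    so tests control time (controllability) and can inspect expiry (observability)."""
    def __init__(self, lifetime, clock=datetime.now):
        self._clock = clock
        self.expires_at = clock() + lifetime   # observable state, not hidden

    def is_expired(self):
        return self._clock() >= self.expires_at

# A fake clock makes expiry deterministic: no sleeping, no waiting.
now = [datetime(2015, 5, 15, 12, 0)]
token = SessionToken(timedelta(minutes=30), clock=lambda: now[0])
assert not token.is_expired()      # precondition set up exactly as needed
now[0] += timedelta(minutes=31)    # advance the fake clock
assert token.is_expired()          # result is directly observable
```

In other words, this is the opposite of a covered clock: a clock the tests are allowed to poke and probe.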
EXECUTIVE BUY-IN
Testability is usually not a technical
issue. It is often a people issue.
It has a difficult time competing
with functionality, performance,
and other aspects of software
development.
Management must allow time for
testability to be implemented so it
can provide long-term benefits.
BENEFITS FOR TESTING EFFORT
A = Design done with testability in mind
B = Design made without testability in mind, but good fault coverage due to a large test effort
C = Design that is very difficult to test
DEVELOPER BUY-IN
Developers usually adopt design for
testability techniques readily if they
can see the immediate return.
Return in the form of fewer defects,
less time dealing with those defects,
better code and higher productivity.
If they see these techniques as a
way of helping them write better
code, they’ll do it.
Design for testability is not going to radically change the way you code, but it may radically change the way you think about it…
How do my coding practices affect testing?
Polymorphism, KISS, TDD, YAGNI, SOC, Encapsulation, SOLID, DRY
SOLID PRINCIPLES
Single Responsibility
Each class has only one responsibility.
Open/Closed
Modules should be open for extension but closed for modification.
Liskov Substitution
Subclasses should be substitutable for their parent class.
Interface Segregation
Split large interfaces into smaller, more client-specific ones.
Dependency Inversion
High-level modules should not depend on low-level modules; both should depend on abstractions.
SOLID PRINCIPLES: TEST PERSPECTIVE
Single Responsibility
Makes the class definition easier to test.
Open/Closed
Reduces test maintenance since tests for existing objects still work.
Liskov Substitution
Polymorphism cleans up conditionals and promotes mocking.
Interface Segregation
Results in small, well-defined interfaces that are easier to test.
Dependency Inversion
Facilitates the use of stubs and mocks through loose coupling.
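From the test perspective, Dependency Inversion is the principle that most directly enables stubs and mocks. A minimal sketch (hypothetical Python names; the talk's own example code is C#): the high-level module depends only on an abstraction, so a test substitutes a stub for the real gateway.

```python
from abc import ABC, abstractmethod

class RateGateway(ABC):
    """Abstraction that both layers depend on (Dependency Inversion)."""
    @abstractmethod
    def current_rate(self, currency: str) -> float: ...

class PriceReport:
    """High-level module: knows only the RateGateway abstraction."""
    def __init__(self, gateway: RateGateway):
        self._gateway = gateway

    def price_in(self, usd: float, currency: str) -> float:
        return round(usd * self._gateway.current_rate(currency), 2)

class StubGateway(RateGateway):
    """Test double: no network call, fully controllable rates."""
    def current_rate(self, currency: str) -> float:
        return {"EUR": 0.5}[currency]

report = PriceReport(StubGateway())
assert report.price_in(10.0, "EUR") == 5.0   # loose coupling makes stubbing trivial
```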
FOSTERING DESIGN FOR TESTABILITY
Choice of Technologies: Libraries, frameworks, repositories, and services should support testability.
Design Conventions: Proper abstraction and design principles. Good test automation practices.
Isolation Frameworks: Tools and approaches for creating stubs, mocks, and spies promote unit, component, and integration testing.
Logging and Dumps: In large systems, system-level tests often fail while unit tests pass. Mechanisms for logging errors and events, and for creating memory dumps, support testing and debugging.
Flexible Configuration: Support for specifying the desired test environment, data sources, and mock objects through a configuration file makes testing more convenient.
LIGHTWEIGHT PLANNING
AGILE TESTING
Adaptive, Rapid, Responsive, Flexible, Evolutionary, Continuous
Do we need to plan?
Types of Testing: Functional, Usability, Security, Performance
Automation vs. Manual Testing: Continuous Integration, Exploratory Testing
Levels of Testing: Unit, Integration, System (Pyramid Goals)
Test Infrastructure: Environments, Hardware and Software
Estimation of Testing Effort
At release planning, test plan documentation is developed to describe the testing strategy. During each sprint of a release, focus on defining test cases to validate stories and features.
LIGHTWEIGHT TEST PLANNING
Conduct test planning breadth-first
Write one-line descriptions of each test case indicating its purpose.
Review these with relevant stakeholders before filling in the details.
Favor self-documenting test automation over comprehensive
detailed manual test documentation.
Leverage recorders for capturing test documentation during
exploratory testing sessions.
Store all test information on a central test management server that
is accessible to stakeholders.
TEST MANAGEMENT INFRASTRUCTURE
Test Management
Server
Test Cases
Exploratory Testing
Sessions
Self-Documenting
Test Automation
Test Management Tools
Check-In
Attachments
Test Planning, Exploratory Testing
Sync
(Nightly)
Code Repository
CAPTURING EXPLORATORY SESSIONS
Rapid Reporter
Session Tester
Microsoft Test Manager
SELF-DOCUMENTING AUTOMATION
Builder Pattern Fluent APIs Domain-Specific Languages
[TestMethod]
public void GoogleSearchStory()
{
new Story("Google Search").Tag("Sprint 1")
.InOrderTo("find public information")
.AsA("user")
.IWant("to search the Web for documents")
.WithScenario("simple text search")
.Given(IOpenGoogleSearch)
.When(IEnterSearchCriteria, "Pi")
.And(ISubmitTheRequest)
.Then(TheResultsPageContains, "3.1415")
.ExecuteWithReport(GetCurrentMethod());
}
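The C# snippet above uses a fluent, builder-style API so the test reads as its own documentation. A stripped-down Python analogue of the same pattern (the class and its output format are illustrative, not a real framework): each step returns the builder, so the chain both drives the test and records a readable report.

```python
class Story:
    """Minimal fluent builder: each step returns self and records a readable line."""
    def __init__(self, title):
        self.lines = [f"Story: {title}"]
    def _step(self, keyword, text):
        self.lines.append(f"  {keyword} {text}")
        return self                      # returning self enables chaining
    def in_order_to(self, goal):  return self._step("In order to", goal)
    def as_a(self, role):         return self._step("As a", role)
    def i_want(self, wish):       return self._step("I want", wish)
    def given(self, context):     return self._step("Given", context)
    def when(self, action):       return self._step("When", action)
    def then(self, outcome):      return self._step("Then", outcome)
    def report(self):
        return "\n".join(self.lines)

doc = (Story("Google Search")
       .in_order_to("find public information")
       .as_a("user")
       .i_want("to search the Web for documents")
       .given("I open Google Search")
       .when("I enter 'Pi' and submit")
       .then("the results page contains '3.1415'")
       .report())
print(doc)
```

A real framework would also execute each step; the point here is only that the chain itself is the documentation.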
DSLs FOR FUNCTIONAL TESTING
getgauge.io
specflow.org
cucumber.io
docs.behat.org
storyq.codeplex.com
jbehave.org
STABILIZING TEST AUTOMATION
TEST IMPACT ANALYSIS
REGRESSION TESTING
Process of validating modified software to detect whether new
defects have been introduced into previously tested code
Provides confidence that modifications are correct
Occurs at different levels:
– Unit
– Integration
– System
– System Integration
Regression testing is an expensive process
REGRESSION TESTING STRATEGIES
Several test selection strategies have been proposed to reduce this expense.
Testing Objectives: Retest Changed Components; Retest Affected Components; Retest Integration (Re-Integration)
Testing Challenges: Identifying Changed Parts; Identifying Impacted Parts; Selecting/Reducing Test Suites; Achieving Adequate Coverage
MODULE FIREWALL CONCEPT
A module firewall in a program refers to a changed software module and the transitive closure of all possibly affected modules and related integration links, based on a control flow graph.
FIREWALL REGRESSION TESTING
With this firewall concept we can reduce regression testing to a smaller scope: retest all modules and integration links within the firewall.
This implies retesting all the changed modules themselves and re-integrating all affected modules.
The firewall concept applies to several different test models, e.g., Class, Feature, Data, State.
FIREWALL TEST EXAMPLE
Module hierarchy: Main calls M1, M2, M3, which in turn call M4 through M7; the changed module M5 calls M8.
Changed module: M5
A module firewall: M5, M1, Main
Unit-level re-testing: M5
Re-integration: (M1, M5), (Main, M1), (M5, M8)
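The firewall in this example can be computed mechanically: the changed module, plus the transitive closure of its callers, plus the integration links crossing that set. A Python sketch (the specific call-graph edges, e.g. Main to M1 and M5 to M8, are assumptions read off the slide's diagram):

```python
from collections import deque

def module_firewall(calls, changed):
    """Return (firewall modules, re-integration links) for a changed module.
    `calls` is a list of (caller, callee) edges in the call graph."""
    callers = {}
    for parent, child in calls:
        callers.setdefault(child, set()).add(parent)
    firewall, links, queue = {changed}, set(), deque([changed])
    while queue:
        module = queue.popleft()
        for parent in callers.get(module, ()):
            links.add((parent, module))      # link must be re-integrated
            if parent not in firewall:
                firewall.add(parent)
                queue.append(parent)
    # Links immediately below the changed module are also re-integrated:
    links.update(edge for edge in calls if edge[0] == changed)
    return firewall, links

# Call graph assumed from the slide's diagram:
calls = [("Main", "M1"), ("Main", "M2"), ("Main", "M3"),
         ("M1", "M4"), ("M1", "M5"), ("M2", "M6"), ("M3", "M7"),
         ("M5", "M8")]
firewall, links = module_firewall(calls, "M5")
assert firewall == {"M5", "M1", "Main"}
assert links == {("M1", "M5"), ("Main", "M1"), ("M5", "M8")}
```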
Testing Smarter, Not Harder
IntelliTest
BUILD INTEGRATION
On check-in:
1. Analyze code changes: periodically analyze changesets to determine which classes have changed since the last build.
2. Generate a list of impacted classes: dependency analysis identifies all classes impacted by the changes.
3. Identify tests for impacted classes: find all of the tests that verify the behavior of the affected classes.
4. Execute tests and generate results: run the test cases associated with the code changes and report the results.
VISUAL STUDIO EXTENSION
Key Features
Visualize Source Code
Multi-Product Support
Firewall Impact Analysis
Custom Dependencies
Integration Test Ordering
FEATURE-LEVEL TEST IMPACTS
RISK-BASED TESTING
RISK-BASED TESTING
A technique for prioritizing testing activities so that we
can get the most out of our testing efforts.
Test most heavily the parts of the software that pose the highest threat to project success.
Allows us to optimize testing efforts by:
– Testing error-prone features and modules.
– Testing critical features and modules.
– Searching for the most harmful defects.
– Limited or no testing on low-impact areas.
RISK-BASED TESTING (cont.)
How to identify risk areas?
How to calculate risk?
How to test based on risk calculations?
RISK HEURISTICS
Risks associated with the project, staff, management, or the
software itself can be used to guide testing.
Categories of heuristics for identifying testing-related risks include:
General: Complexity, Changes, Bad Quality, Scheduling, Resources, Budget
Business Facing: Requirements, Popularity (Frequency), Criticality, Market, Bad Publicity, Liability
Technology Facing: Untestability, Integration, New Technology, Programming Language, Weak Testing Tools, Unfixability
RISK HEURISTICS: CHEATSHEETS
RISK CALCULATION
1. Choose factors for functional, technical, or other quality risks.
2. Assign weights to the chosen factors (1 = Low, 3 = Medium, 10 = High).
3. Assign points to the factors in every area (1 - 2 - 3 - 4 - 5).
4. Calculate the weighted sums:
Risk = Cost * Probability
Cost = (Weight for Impact Factor 1 * Value for Factor) + (Weight for Impact Factor 2 * Value for Factor) + ... + (Weight for Impact Factor n * Value for Factor)
Probability = (Weight for Probability Factor 1 * Value for Factor) + (Weight for Probability Factor 2 * Value for Factor) + ... + (Weight for Probability Factor n * Value for Factor)
RISK CALCULATION: EXAMPLE

Risk Area | Criticality (Weight: 10) | Frequency (Weight: 3) | Complexity (Weight: 3) | Schedule (Weight: 10) | Risk
Feature A | 4 | 4 | 4 | 4 | (4*10 + 4*3) * (4*3 + 4*10) = 2704
Feature B | 3 | 1 | 2 | 5 | (3*10 + 1*3) * (2*3 + 5*10) = 1848
Feature C | 3 | 3 | 2 | 1 | (3*10 + 3*3) * (2*3 + 1*10) = 624

Criticality: 1 = Unimportant; 5 = Business Critical
Complexity: 1 = Simple; 5 = Highly Complex
Popularity/Frequency: 1 = Rarely Used; 5 = Always Used
Schedule: 1 = No Time Pressure; 5 = Very Aggressive Schedule
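The table's arithmetic can be reproduced in a few lines of Python, transcribing Risk = Cost * Probability with the slide's weights (Criticality and Frequency as impact factors, Complexity and Schedule as probability factors):

```python
def risk_score(values, impact_weights, probability_weights):
    """Risk = Cost * Probability, each a weighted sum of 1-5 factor scores."""
    cost = sum(w * values[f] for f, w in impact_weights.items())
    probability = sum(w * values[f] for f, w in probability_weights.items())
    return cost * probability

impact = {"Criticality": 10, "Frequency": 3}      # cost/impact factors
likelihood = {"Complexity": 3, "Schedule": 10}    # probability factors

features = {
    "Feature A": {"Criticality": 4, "Frequency": 4, "Complexity": 4, "Schedule": 4},
    "Feature B": {"Criticality": 3, "Frequency": 1, "Complexity": 2, "Schedule": 5},
    "Feature C": {"Criticality": 3, "Frequency": 3, "Complexity": 2, "Schedule": 1},
}
scores = {name: risk_score(v, impact, likelihood) for name, v in features.items()}
assert scores == {"Feature A": 2704, "Feature B": 1848, "Feature C": 624}
```

Sorting by score reproduces the slide's priority order: Feature A, then B, then C.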
RISK CALCULATION: SCHEMA
https://www.dropbox.com/s/xbtmy2wra9zcrev/RiskBasedTestingCalculationSchema.xlsx?dl=0
PRIORITIZING TESTING EFFORTS
Plot each area on a risk matrix with Cost (Impact) on one axis and Probability (Likelihood) on the other, each rated 1-5.
High Risk: Thorough Testing
Medium Risk: Moderate Testing
Low Risk: Light Testing
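One way to turn the matrix into an automated gate is to bucket each area by the product of its cost and probability ratings on the 1-5 grid. The thresholds below are illustrative assumptions, not values from the talk:

```python
def assess(cost, probability, high=15, medium=6):
    """Map a (cost, probability) cell of the 5x5 risk matrix to a testing depth.
    Threshold values on the product are assumed for illustration."""
    exposure = cost * probability
    if exposure >= high:
        return "High Risk: Thorough Testing"
    if exposure >= medium:
        return "Medium Risk: Moderate Testing"
    return "Low Risk: Light Testing"

assert assess(5, 4) == "High Risk: Thorough Testing"
assert assess(3, 2) == "Medium Risk: Moderate Testing"
assert assess(1, 2) == "Low Risk: Light Testing"
```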
PRIORITIZING TESTING EFFORTS (cont.)
Area | Risk | Testing Criteria and Techniques
Component A | Low | At least 1 Positive and 2 Negative Tests; Statement Coverage
Component B | Medium | At least 3 Positive and 5 Negative Tests; Branch Coverage
Subsystem | High | Basis Path for All Components
Feature | High | State Transition Testing; Robust Boundary
Story | Low | GUI Interaction Testing; Basic Boundary
PRIORITIZING TESTING EFFORTS (cont.)
An even higher level of prioritization can be achieved by determining the relative importance of other quality areas.

Risk Area | Importance (%)
Functionality | 50
Performance | 20
Security | 15
Usability | 10
Accessibility | 5
Portability | 0
CONCLUSION
Common sources of waste in testing activities include untestability, test
redundancy, over-documentation, brittle automation and over-testing.
Focusing on added value and considering the relative importance of
testing tasks can help us to avoid and eliminate such waste.
Design for testability, lightweight test planning, test impact analysis, and risk-based testing are some of the ways you can optimize testing.
By optimizing your testing strategy, you can reduce your time spent
testing while maintaining test coverage and product quality.
THANK YOU!
Acknowledgments
Robert Vanderwall, Dionny Santiago, Gabriel Nunez, Denise Krentz
RISK-BASED TESTING HEURISTICS
Business Facing
Complexity
□ Feature or requirement may contain many complicated input, processing, and output steps.
Changes
□ New things – newer features may be the source of failures.
□ Changed things – modifications may introduce errors into previously tested features.
□ Dependencies – failed features/components may trigger other failures (↑Dependencies, ↑Risk).
Bad Quality
□ Lack of System Testing – Defects can hide in untested features (↓Test Coverage, ↑Risk).
□ Domain Knowledge – mistakes can be made by analysts, developers, and testers due to lack of
understanding of the problem area.
□ Bugginess – features with many known bugs may also have many unknown bugs.
□ Construction History – previous development and/or testing strategy was narrow and inadequate.
Requirements
□ Ambiguous Requirements – unclear or imprecise story descriptions and acceptance criteria
□ Conflicting Requirements – contradictions in story descriptions and/or acceptance criteria
□ Unknown Requirements – incomplete stories and/or missing acceptance criteria
□ Evolving Requirements – product vision changes during the course of development
Customer-Driven
□ Popularity/Frequency – feature is or will be heavily used by customers
□ Criticality – feature is very important to the customer
□ Market – feature is a key differentiator that separates our product from that of our competitors
□ Bad Publicity – bug could appear in the media (CNN, PC Week)
□ Liability – bug could cause us or our customers to get sued (e.g., Compliance)
Schedule, Resources, and Budget
□ Rushed work – moving fast to catch up after falling behind schedule can lead to reduced quality.
□ Scope creep – growth in requirements without increasing the schedule, resources, and/or budget.
□ Tired staff members – long overtime over several weeks or months causes inefficiencies and errors.
□ Late changes – introducing changes late in the development cycle leads to work being done poorly.
□ Other staff issues – alcoholism, a death in the family, staff member rivalry, turnover …
RISK-BASED TESTING HEURISTICS
Technology Facing
Complexity
□ Subsystem or class may have high measurements for cyclomatic complexity, lines of code, Halstead.
Changes
□ New things – newer code modules may be the source of failures.
□ Changed things – modifications may introduce errors into previously tested code.
□ Dependencies – failed features/components may trigger other failures (↑Dependencies, ↑Risk).
Bad Quality
□ Lack of Unit Testing – Defects can hide in untested code (↓Code Coverage, ↑Risk).
□ Domain Knowledge – mistakes can be made by analysts, developers, and testers due to lack of
understanding of the problem area.
□ Bugginess – code with many known bugs may also have many unknown bugs.
□ Construction History – previous development and/or testing strategy was narrow and inadequate.
Design
□ Untestability – systems designed without testability in mind run risk of slow, inefficient testing.
□ Integration – interconnections with other components (especially third-party) can cause issues.
Implementation
□ New Technology – implementing new concepts and constructs can lead to programming mistakes.
□ Programming Language – some errors are language or paradigm specific (e.g., wild pointers in C,
lambda expressions in C# vs. Java).
□ Weak Testing Tools – if tools don’t exist to help identify certain types of errors, such errors are likely
to survive testing.
□ Unfixability – a decision may be made to not go back to fix this area after it is developed (one-shot).
Schedule, Resources, and Budget
□ Rushed work – moving fast to catch up after falling behind schedule can lead to reduced quality.
□ Scope creep – growth in requirements without increasing the schedule, resources, and/or budget.
□ Tired staff members – long overtime over several weeks or months causes inefficiencies and errors.
□ Late changes – introducing changes late in the development cycle leads to work being done poorly.
□ Other staff issues – alcoholism, a death in the family, staff member rivalry, turnover …