Tool support for testing (CAST)
Software Testing
ISTQB Foundation Certificate Course
1 Principles  2 Lifecycle  3 Static testing  4 Dynamic test techniques  5 Management  6 Tools
Contents
Types of CAST tool
Tool selection and implementation
Testing tool classification
Requirements testing tools
Static analysis tools
Test design tools
Test data preparation tools
Test running tools - character-based, GUI
Comparison tools
Test harnesses and drivers
Performance test tools
Dynamic analysis tools
Debugging tools
Test management tools
Coverage measurement
Where tools fit
[Diagram: tool types mapped onto the life cycle stages Req Anal, Function, Design, Code, Comp. Test, Int Test, Sys Test and Acc Test. Tools shown: requirements testing, test design, test data preparation, static analysis, test management, coverage measures, test running, dynamic analysis, debug, performance measurement, comparison, and test harnesses & drivers.]
Requirements testing tools

 Automated support for verification and validation of requirements models
  - consistency checking
  - animation

 Tool information available from:
  - Ovum Evaluates Software Testing Tools (subscription service)
  - CAST Report, 1999
  - World Wide Web
Static analysis tools

 Provide information about the quality of software

 Code is examined, not executed

 Objective measures
  - cyclomatic complexity
  - others: nesting levels, size
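As a minimal sketch of the static-analysis idea, the snippet below computes McCabe cyclomatic complexity (decision points + 1) by parsing source with Python's `ast` module. The code is examined, never executed, exactly as the slide says; the set of decision-node types counted is a simplification of what a real tool would use.

```python
import ast

# Cyclomatic complexity = number of decision points + 1.
# Static analysis: the source is parsed, never executed.
DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    count = 1
    for node in ast.walk(tree):
        if isinstance(node, DECISIONS):
            count += 1
        elif isinstance(node, ast.BoolOp):   # 'and'/'or' add hidden branches
            count += len(node.values) - 1
    return count

code = """
def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # 3: two 'if' decisions + 1
```

A real tool would also report the nesting levels and size measures mentioned above from the same parse tree.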
Test design tools

 Generate test inputs
  - from a formal specification or CASE repository
  - from code (e.g. code not covered yet)
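A tiny sketch of generating test inputs from a specification: here the "formal specification" is reduced to an invented min/max field record, from which classic boundary values are derived. Real test design tools work from much richer models, but the derivation step looks like this.

```python
# Derive boundary-value test inputs from a simple field specification.
# The spec format (a dict with "min"/"max") is invented for illustration.
def boundary_values(spec):
    lo, hi = spec["min"], spec["max"]
    # boundary value analysis: each boundary and its immediate neighbours
    return sorted({lo - 1, lo, lo + 1, hi - 1, hi, hi + 1})

age_spec = {"name": "age", "min": 0, "max": 120}
print(boundary_values(age_spec))  # [-1, 0, 1, 119, 120, 121]
```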
Test data preparation tools

 Data manipulation
  - selected from existing databases or files
  - created according to some rules
  - edited from other sources
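The three bullets above can be sketched in a few lines. The data and field names are invented; a real tool would read from databases or files rather than in-memory lists.

```python
import random

# Test-data preparation: select, create by rule, edit from another source.
existing = [{"id": "1", "country": "UK"}, {"id": "2", "country": "DE"},
            {"id": "3", "country": "UK"}]

# 1. selected from an existing database or file
selected = [row for row in existing if row["country"] == "UK"]

# 2. created according to some rules (fixed seed: repeatable test data)
rng = random.Random(42)
created = [{"id": str(100 + i), "country": rng.choice(["UK", "DE", "FR"])}
           for i in range(3)]

# 3. edited from other sources (e.g. anonymised before use as test data)
edited = [{**row, "id": "X" + row["id"]} for row in selected]

print(len(selected), len(created), edited[0]["id"])  # 2 3 X1
```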
Test running tools 1

 Interface to the software being tested

 Run tests as though run by a human tester

 Test scripts in a programmable language

 Data, test inputs and expected results held in test repositories

 Most often used to automate regression testing
Test running tools 2

 Character-based
  - simulates user interaction from dumb terminals
  - capture keystrokes and screen responses

 GUI (Graphical User Interface)
  - simulates user interaction for WIMP applications (Windows, Icons, Mouse, Pointer)
  - capture mouse movement, button clicks, and keyboard inputs
  - capture screens, bitmaps, characters, object states
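The core of a test running tool, stripped of its UI-capture machinery, is a data-driven loop: inputs and expected results held in a repository, replayed against the software under test. A minimal sketch (the "software under test" and the repository contents are invented stand-ins):

```python
# Data-driven test runner: cases held in a "repository" (here a list),
# replayed against the software under test - the core of automated
# regression testing.
def software_under_test(x, y):      # stand-in for the real application
    return x + y

repository = [
    {"name": "zero",     "inputs": (0, 0),  "expected": 0},
    {"name": "positive", "inputs": (2, 3),  "expected": 5},
    {"name": "negative", "inputs": (-1, 1), "expected": 0},
]

def run_suite(repo):
    results = {}
    for case in repo:
        actual = software_under_test(*case["inputs"])
        results[case["name"]] = "PASS" if actual == case["expected"] else "FAIL"
    return results

print(run_suite(repository))
```

Because the cases live in data rather than code, the same suite can be rerun unchanged after every software change, which is why these tools are most used for regression testing.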
Comparison tools

 Detect differences between actual test results and expected results
  - screens, characters, bitmaps
  - masking and filtering

 Test running tools normally include comparison capability

 Stand-alone comparison tools for files or databases
Test harnesses and drivers

 Used to exercise software which does not have a user interface (yet)

 Used to run groups of automated tests or comparisons

 Often custom-built

 Simulators (where testing in real environment would be too costly or dangerous)
Performance testing tools

 Load generation
  - drive application via user interface or test harness
  - simulates realistic load on the system & logs the number of transactions

 Transaction measurement
  - response times for selected transactions via user interface

 Reports based on logs, graphs of load versus response times
Dynamic analysis tools

 Provide run-time information on software (while tests are run)
  - allocation, use and de-allocation of resources, e.g. memory leaks
  - flag unassigned pointers or pointer arithmetic faults
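A small demonstration of the memory-leak case using Python's standard `tracemalloc`: allocation is measured while the code runs, and memory still held after repeated calls exposes the leak. The leaky function is deliberately contrived.

```python
import tracemalloc

# The deliberate "leak": allocations accumulate here and are never released.
_cache = []

def leaky_operation():
    _cache.append(bytearray(10_000))

# Dynamic analysis: measure allocation while the code executes.
tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
for _ in range(100):
    leaky_operation()
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

growth = after - before            # ~1 MB still allocated after 100 calls
print(growth > 500_000)            # True
```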
Debugging tools

 Used by programmers when investigating, fixing and testing faults

 Used to reproduce faults and examine program execution in detail
  - single-stepping
  - breakpoints or watchpoints at any statement
  - examine contents of variables and other data
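The mechanics behind single-stepping and variable inspection can be shown with Python's trace hook, the same facility debuggers like pdb are built on: a trace function fires on every executed line and can read the frame's local variables.

```python
import sys

# Log of (line number, local variables) - what a single-stepping
# debugger would show at each step of the traced function.
executed_lines = []

def tracer(frame, event, arg):
    if event == "line" and frame.f_code.co_name == "buggy_sum":
        executed_lines.append((frame.f_lineno, dict(frame.f_locals)))
    return tracer

def buggy_sum(values):
    total = 0
    for v in values:
        total += v
    return total

sys.settrace(tracer)
result = buggy_sum([1, 2, 3])
sys.settrace(None)

print(result)                    # 6
print(len(executed_lines) > 0)   # True: every executed line was observed
```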
Test management tools

 Management of testware: test plans, specifications, results

 Project management of the test process, e.g. estimation, scheduling tests, logging results

 Incident management tools (may include workflow facilities to track allocation, correction and retesting)

 Traceability (of tests to requirements, designs)
Coverage measurement tools

 Objective measure of what parts of the software structure were executed by tests

 Code is instrumented in a static analysis pass

 Tests are run through the instrumented code

 Tool reports what has and has not been covered by those tests, line by line, plus summary statistics

 Different types of coverage: statement, branch, condition, LCSAJ, etc.
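A toy statement-coverage measurer, using a trace hook in place of real instrumentation: run a test through the function, record which lines executed, and report what was missed. Real tools instrument the code itself, but the report is the same idea.

```python
import sys

covered = set()   # line offsets (relative to the def line) that executed

def tracer(frame, event, arg):
    if event == "line" and frame.f_code.co_name == "grade":
        covered.add(frame.f_lineno - grade.__code__.co_firstlineno)
    return tracer

def grade(score):
    if score >= 50:
        return "pass"
    return "fail"          # never reached by the single test below

sys.settrace(tracer)
grade(80)                  # one test case only
sys.settrace(None)

print(sorted(covered))     # [1, 2]: the 'if' and the "pass" return
print(3 in covered)        # False: the "fail" line was not covered
```

The uncovered line is exactly what the tool's line-by-line report would flag, prompting a second test case with a failing score.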
Tool Selection and Implementation
[Diagram: the selection process narrows lots of tools down to one tool; the implementation process then rolls that one tool out to lots of people.]
The Tool Selection Process
[Diagram: analyse needs / define problem → consider tool support as a solution → make business case → define required features and constraints → long list → short list → evaluate (demo, [trial], evaluators) → decide.]
Where to start with tools?

 Do not start
  - with a vendor visit & tool demonstration
  - with a list of tool features and functions
  - while your testing process is chaotic (good testing is more important than tools) ("CAST readiness")

 Do start
  - by identifying your needs - which test activities have the worst problems; prioritise
  - by considering constraints, e.g. hardware, OS, integration with other tools (cosmetic only?)
Tool selection process

 After automation requirements are agreed:
  - create shortlist of candidate tools
  - arrange demos
  - evaluate selected tool(s)
  - review and select tool

 Don't underestimate "people issues", e.g. politics, resistance to change, territories
The Tool Implementation Process
[Diagram: management commitment → assemble team → publicity and internal market research → pilot → pilot evaluation → phased implementation → post-implementation review.]
Pilot project and implementation

 Objectives of the pilot project
  - gain experience in the use of the tool
  - identify changes in test process
  - set internal standards and conventions
  - assess costs and achievable benefits

 Implementation
  - based on successful pilot
  - needs strong commitment from tool users & managers (overcome resistance, overheads for learning curve)
Tool implementation iceberg
[Diagram: an iceberg - the visible tip is the cost to buy the tool; the larger hidden costs below the waterline are selling it internally, support, and infrastructure.]
Summary: Key Points

There are many different types of tool support for testing, covering all areas of the life cycle.
Selecting and implementing tools needs attention and effort if benefits are to be realised.
Which of the following pairs of test tools are likely to be most useful during the test analysis and design stage of the fundamental test process?

(i) Test execution tool

(ii) Test data preparation tool

(iii) Test management tool

(iv) Requirements management tool

a. (i) and (ii)

b. (i) and (iv)

c. (ii) and (iii)

d. (iii) and (iv)
Which of the following is most likely to cause failure in
the implementation of a test tool?

a. Underestimating the demand for a tool

b. The purchase price of the tool

c. No agreed requirements for the tool

d. The cost of resources to implement and maintain
the tool
What benefits do static analysis tools have over
test execution tools?

a. Static analysis tools find defects earlier in the life
cycle.

b. Static analysis tools can be used before code is
written.

c. Static analysis tools test that the delivered code
meets the business requirements.

d. Static analysis tools are particularly effective for
regression testing.
For which of the following activities in the fundamental test
process would an incident management tool be most useful?

a. Test planning and control.

b. Test analysis and design.

c. Test implementation and execution.

d. Evaluating exit criteria and reporting.
Which of the following principles should be followed when
introducing a test tool into an organization?

(i) Assessing organizational maturity to establish whether a
tool will provide expected benefits

(ii) Requiring a quick payback on the initial investment

(iii) Including a requirement for the tool to be easy to use
without having to train unskilled testers

(iv) Identifying and agreeing requirements before evaluating
test tools

a. (i) and (ii)

b. (i) and (iv)

c. (ii) and (iii)

d. (iii) and (iv)
Which of the following defects is most likely to be
found by a test harness?

a. Variance from programming standards.

b. A defect in middleware.

c. Memory leaks.

d. Regression defects.
How can test execution tools be of most benefit during
exploratory testing?

a. They can record user actions so that defects are
easier to recreate.

b. They can be used to perform the regression
aspects of exploratory testing.

c. They can help to mitigate the risk of low test
coverage.

d. They can use data-driven tests to increase the
amount of exploratory testing performed.
How can test execution tools be of most benefit during
exploratory testing?

(i) Performance monitoring tool

(ii) Requirements tool

(iii) Configuration management tool

(iv) Static analysis

a. (i) and (ii)

b. (i) and (iv)

c. (ii) and (iii)

d. (iii) and (iv)
A test management tool is most likely to integrate with
which of the following tools?

a. Performance management tool.

b. Test data preparation tool.

c. Static analysis tool.

d. Requirements management tool.
