You have just been assigned a new testing project. Where do you start? How do you develop a plan and begin testing? How will you report on your progress? Paul Holland shares new test project approaches that enable you to plan, test, and report effectively. Paul demonstrates ideas, based on the Heuristic Software Test Model from Rapid Software Testing, that can be directly applied or adapted to your environment. In this hands-on tutorial, you’ll be given a product to test. Start by creating three raw lists (Product Coverage Outline, Potential Risks, and Test Ideas) that help ensure comprehensive testing. Use these lists to create an initial set of test charters. We employ “advanced” test management tools (Excel and whiteboards with Sticky Notes) to create useful test reports without using “bad metrics” (counts of pass/fail test cases, % of test cases executed vs. plan). Look forward to your next testing project with these new ideas and your improved planning, testing, and reporting skills.
End-to-End Testing with the Heuristic Software Test Model
TM
PM Tutorial
10/14/2014 1:00:00 PM
"End-to-End Testing with the Heuristic Software Test Model"
Presented by:
Paul Holland
Testing Thoughts
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Paul Holland
Doran Jones, Inc.
Paul Holland is currently the Managing Director of the Testing Practice at Doran Jones. He has
more than nineteen years of hands-on testing and test management experience, primarily at
Alcatel-Lucent where he led a transformation of the testing approach for two product divisions,
making them more efficient and effective. As a test manager and tester, Paul focused on
exploratory testing, test automation, and improving testing techniques. For the past five years,
he has been consulting and delivering training within Alcatel-Lucent and externally to companies
such as Intel, Intuit, Progressive Insurance, HP, RIM, and General Dynamics. Paul is one of
three people who teach the Rapid Software Testing course. For more information visit
testingthoughts.com.
End-to-End Testing
with a Heuristic Software Test Model
Paul Holland
Managing Director –
Head of Testing Practice
at Doran Jones
My Background
Independent S/W Testing consultant since Apr 2012
19+ years testing telecommunications equipment and reworking test
methodologies at Alcatel-Lucent
12+ years as a test manager
Presenter at CAST, STAREast, STARWest, Let’s Test, STARCanada, and
EuroStar
Keynote at KWSQA conference & Ho Chi Minh City Software Testing
Conference
Facilitator of 50+ peer conferences and workshops
Teacher of S/W testing for the past 6 years
Teacher of Rapid Software Testing
Military helicopter pilot – Canadian Sea Kings
Attributions
Many of the concepts that I am presenting come from the Rapid Software
Testing class developed by James Bach and Michael Bolton, who are often
inspired by Jon Bach and Cem Kaner
Agenda
Test Strategy Development – lecture
Project Statement – incl. Bug Reporting
Hands-on Test Strategy Development
Reporting – lecture
Hands-on Testing and Reporting
Delivering Final Reports
Test Strategy Development
Each feature/product is different from other
features/products. Templates != Good.
Start with “raw” project specific information:
Detailed description of the elements of the
feature/product
Identify specific risks related to the project
Develop a list of “test ideas”
Product Coverage Outline
A Product Coverage Outline can be
described as:
A parts list
An inventory of elements (including data)
Components of a product
Essentially a list of items of a product that can
be examined in some way
Benefits of a PCO
Helps eliminate tunnel vision
Helps display your thoroughness to others
Helps you gain respect
Allows you to get “buy-in” at a high level on what you are going to
consider when testing
Helps prioritize and divide your testing
Helps in reporting
Product Coverage Outline
How to create a PCO
Product Coverage Outline
Consider the following aspects (SFDIPOT):
Structure of the program (smallest components)
Functionality (individual features)
Data (I/O, create, store, manipulate, backup, …)
Interface (User interfaces, APIs, …)
Platform (Computer, CPU, OS, browser, …)
Operations (How is it used by customers)
Timings (Race conds, time of day/wk/mth/yr, …)
(Taken from the Rapid Software Testing Course, Bach
and Bolton)
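As a sketch of how these headings can seed an outline, the snippet below keeps one list of elements per SFDIPOT category; the example product elements are hypothetical placeholders, not taken from any real product.

```python
# Seed a Product Coverage Outline with the SFDIPOT categories.
SFDIPOT = ["Structure", "Functionality", "Data", "Interface",
           "Platform", "Operations", "Timings"]

def new_pco():
    """One empty element list per SFDIPOT category."""
    return {category: [] for category in SFDIPOT}

def checklist(pco):
    """Flatten to 'Category: element' lines for sharing or review."""
    return [f"{cat}: {item}" for cat in SFDIPOT for item in pco[cat]]

pco = new_pco()
pco["Interface"].append("sticky-note pop-up")   # hypothetical elements
pco["Platform"].append("Chrome on Windows")
pco["Data"].append("saved sticky notes")
for line in checklist(pco):
    print(line)
```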
Product Coverage Outline
Consider touring the feature/product
Investigate every menu item
List sub-menus
Context specific items
Text boxes and/or Pop-up boxes
Radio buttons / Check boxes
Visual elements (pictures, background, logos)
Complexity (find the most complex thing)
Product Coverage Outline
Talk with designers, business people, customers,
customer service, etc.
Look at previous versions
Look at marketing information
Look at competing products
Read the functional specification document
(*although, I would do this later – seriously)
Product Coverage Outline
As you are investigating the feature/product
Create a list of ALL the elements that MAY need to be
examined in some way
Make a mind map (I use XMind)
Enter the elements into Excel (or a word processor)
Create a list on paper, flip chart, or whiteboard
Make a bunch of sticky notes
(I would encourage electronic versions for ease of sharing
and re-use – but it is not mandatory)
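For the electronic versions, a nested outline (the kind of tree a mind map holds) can be rendered as indented text for pasting into Excel, a wiki, or an email. A minimal sketch, with hypothetical Linkwok.com elements:

```python
# Render a (name, children) tree as an indented text outline.
def print_outline(node, depth=0, out=None):
    """Depth-first walk that returns one indented line per node."""
    out = [] if out is None else out
    name, children = node
    out.append("  " * depth + name)
    for child in children:
        print_outline(child, depth + 1, out)
    return out

# Hypothetical top of a Linkwok.com PCO.
pco = ("Linkwok.com", [
    ("Sticky notes", [("create", []), ("edit", []), ("delete", [])]),
    ("Accounts", [("sign up", []), ("log in", [])]),
])
print("\n".join(print_outline(pco)))
```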
Risks
Identify potential risks to the product that you
might be able to test
Create a separate list or mind map of risks or add to
the PCO if time is limited
Identify ways to test the product for the risks
and how to test PCO elements
Such a combination represents a Test Strategy
Prioritize test selection then start testing
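One simple way to prioritize is to score each risk's impact and likelihood and order the work by their product. The risks and scores below are illustrative, not a recommended scale:

```python
# Hypothetical risk list for the Linkwok.com mission; scores 1-3.
risks = [
    {"risk": "rewritten sticky-note code regresses", "impact": 3, "likelihood": 3},
    {"risk": "data loss on save", "impact": 3, "likelihood": 1},
    {"risk": "slow page load", "impact": 1, "likelihood": 2},
]

def prioritize(risks):
    """Order risks by impact x likelihood, highest first."""
    return sorted(risks, key=lambda r: r["impact"] * r["likelihood"],
                  reverse=True)

for r in prioritize(risks):
    print(r["impact"] * r["likelihood"], r["risk"])
```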
Our Mission
Our mission is to discover the quality status of the newly
reworked version of linkwok.com.
We must find critical bugs in Linkwok before this version is
deployed. We must advise if there is any good reason to consider
postponing the deployment of this release. We are also interested in
non-critical bugs, but your preference should be to find critical
problems QUICKLY and get them reported SOON.
RECENT CHANGES
The “sticky note” functionality was rewritten in this release. But, there
shouldn't be any problems because the functionality has not
changed.
(I will act as the customer representative if you have questions)
Exercise
Learn about Linkwok.com
Create a PCO for Linkwok.com
This should factor out the components that
MAY be of interest to some testing mission
You have 30 minutes
Please email me your PCO:
paul@testingthoughts.com
Documentation
One approach (can be followed or modified or ignored)
Create a spreadsheet that lists all charters with details
(see next slide)
Optionally create a Word document as well that lists
the details in a more readable format for customers
Visual tracking with whiteboard
Sample Charter w/Test Ideas
ARQ Verification under different RA Modes (approx. 2 sessions)
• Verifying ARQ in all RA modes
• Fixed rate
• Rate adaptive
• SRA -> not supported, should revert to rate adaptive
• Params in XXXX profile should override service profile
Check every parameter in service profile except SRA
Verify no alarms if service profile cannot be met
Does invalid service profile still raise CE alarm if it is overridden
• Does ARQ fixed rate profile follow same rules as old fixed rate service profile?
(i.e. follows planned bitrate)
• Test various rates (very low to very high) in both service profiles
• Will ARQ still work under rate limitations?
• Will ARQ profile with narrow range of min/max rates behave as similar service
profile would?
• Look for holes in successful inits with varying US/DS rates & fall-back rules.
• supported by CO and CPE -> ARQ ok
• supported by CO but not CPE -> falls back to IFEC for both DS + US
• supported by CPE but not CO -> falls back to IFEC for both DS + US
Test Tracking
An Excel Spreadsheet with:
List of Charters
Area
Estimated Effort
Expended Effort
Remaining Effort
Tester(s)
Start Date
Completed Date
Issues
Comments
Does NOT include pass/fail percentage or number of test
cases
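As a sketch, the same tracking data can live as plain records, which makes roll-ups for the weekly chart trivial. Effort is in person half-days; the field names mirror the spreadsheet columns, and the rows are illustrative:

```python
# Two charters from the sample spreadsheet, as plain records.
charters = [
    {"charter": "ARQ Verification under different RA Modes",
     "area": "ARQ", "estimated": 2, "expended": 2, "remaining": 0},
    {"charter": "INP vs. REIN",
     "area": "ARQ", "estimated": 6, "expended": 7, "remaining": 5},
]

def area_totals(charters):
    """Sum expended and remaining effort per area for the progress chart."""
    totals = {}
    for c in charters:
        t = totals.setdefault(c["area"], {"expended": 0, "remaining": 0})
        t["expended"] += c["expended"]
        t["remaining"] += c["remaining"]
    return totals

print(area_totals(charters))
```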
Sample Spreadsheet
Charter | Area | Estimated Effort | Expended Effort | Remaining Effort | Tester | Date Started | Date Completed | Issues Found | Comments
Investigation for high QLN spikes on EVLT | H/W Performance | 0 | 20 | 0 | acode | 12/10/2011 | 01/14/2012 | ALU01617032 | Lots of investigation. Problem was on 2-3 out of 48 ports, which just happened to be 2 of the 6 ports I tested.
ARQ Verification under different RA Modes | ARQ | 2 | 2 | 0 | ncowan | 12/14/2011 | 12/15/2011 | |
POTS interference | ARQ | 2 | 0 | 0 | --- | 01/08/2012 | 01/08/2012 | | Decided not to test as the H/W team already tested this functionality and time was tight.
Expected throughput testing | ARQ | 5 | 5 | 0 | acode | 01/10/2012 | 01/14/2012 | |
INP vs. SHINE | ARQ | 6 | 6 | 0 | ncowan | 12/01/2011 | 12/04/2011 | |
INP vs. REIN | ARQ | 6 | 7 | 5 | jbright | 01/06/2012 | 01/10/2012 | | To translate the files properly, had to install the Python solution from Antwerp. Some overhead to begin testing (installation, config test) but it was fairly quick to execute afterwards.
INP vs. REIN + SHINE | ARQ | 12 | 12 | | | | | |
Traffic delay and jitter from RTX | ARQ | 2 | 2 | 0 | ncowan | 12/05/2011 | 12/05/2011 | |
Attainable Throughput | ARQ | 1 | 4 | 0 | jbright | 01/05/2012 | 01/08/2012 | | Took longer because it was not behaving as expected and I had to make sure I was testing correctly. My expectations were wrong based on virtual noise not being exact.
Weekly Report
A PowerPoint slide indicating the important
issues (not a count but a list)
“Show stopping” bugs
New bugs found since last report
Important issues with testing (blocking bugs,
equipment issues, people issues, etc.)
Risks – both new and mitigated
Tester concerns (if different from above)
The slide on the next page indicating
progress
Sample Report
[Chart: "Awesome Product" Test Progress as of 02/01/2012. Y-axis: Effort (person half-days), 0-90; X-axis: Feature (ARQ, SRA, Vectoring, Regression, H/W Performance). Bars show Original Planned Effort, Expended Effort, and Total Expected Effort. Direction of lines indicates effort trend since last report. Solid centre bar = finished. Green: no concerns; Yellow: some concerns; Red: major concerns.]
Final Report
A complete story that will include:
Status of the product
Testing coverage (use PCO as baseline)
Risk coverage
“Show stopping” and important bugs
Other bugs – if warranted to be in report
Plus tell a compelling story about your
testing…
To test is to compose, edit, narrate, and justify
THREE stories.
A story about the status of the PRODUCT
…about how it failed, and how it might fail...
…in ways that matter to your various clients.
A story about HOW YOU TESTED it
…how you configured, operated and observed it…
…about what you haven’t tested, yet…
…and won’t test, at all, unless your client requests
otherwise…
A story about how GOOD that testing was
…what the risks and costs of testing are…
…what made testing harder or slower…
…how testable (or not) the product is…
…what you need and what
you recommend.
Whiteboard
Used for planning and tracking of test
execution
Suitable for use in waterfall or agile (as long as
you have control over your own team’s
process)
Use colors to track:
Features, or
Main Areas, or
Test styles (performance, robustness, system)
Whiteboard
Divide the board into four areas:
Work to be done (divided in two)
Work in Progress
Cancelled or Work not being done
Completed work
Red stickies indicate issues (not just bugs)
Create a sticky note for each half day of work (or mark #
of half days expected on the sticky note)
Prioritize stickies daily (or at least twice/wk)
Finish “on time” with low-priority work incomplete
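The four areas above can be modeled as a tiny board, which is handy if you later mirror the physical whiteboard electronically. Charter names here are hypothetical; each sticky is roughly a half day of work, as on the physical board:

```python
# The four whiteboard areas as lists of sticky notes (charters).
board = {
    "to do": ["charter A", "charter B"],
    "in progress": [],
    "cancelled": [],
    "done": [],
}

def move(board, charter, src, dst):
    """Move a sticky note from one area of the board to another."""
    board[src].remove(charter)
    board[dst].append(charter)

move(board, "charter A", "to do", "in progress")
move(board, "charter A", "in progress", "done")
print(board["done"])
```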
Whiteboard Example
End of week 1
Out of 7 weeks
Whiteboard Example
End of week 6
Out of 7 weeks
Exercise
Develop a quick test strategy to test Linkwok for the
next hour
I will be asking for brief (30-second to 1-minute) interim 3-level test
reports
Create a final 3-level test report (must include PCO
& bug list)
With 30 minutes to go in the class, we will have
maximum 5 minute presentations from each team