Effective Software Test Case Design Approach highlights typical wrong approaches to software test case design and presents an effective, collaborative methodology for designing test cases.
Using an example requirement/user story, this presentation walks through the interactions between the stakeholders, i.e. Product Owner, Developer, and Test Engineer, in developing user story acceptance criteria, details, test scope, and effective, consistent, and valid test cases.
2. I know the product, I can test off the top of
my head!
I don’t have time to write down test cases…
I have a few test cases just because. I don’t
need to worry about them ever, and only I
need to understand them.
My tests are just too “complicated” to
document…
3. Consistency/Structure
Proper test configuration
Understanding/No ambiguity
◦ Exactly how the test is run
Accurate representation of User
Story/Requirement traceability/validation
Consistent test run results/indication of software
quality/conformance
Better communication with stakeholders i.e.
Project owner, development, sales, customer etc.
in verification of software quality (test results)
4. What it is
◦ Input
◦ Action/Event
◦ Response
Why
◦ Validation
◦ Standardized approach to validation
◦ No “ad-hoc” approach
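The Input → Action/Event → Response shape above can be sketched as a small data structure. This is a hypothetical illustration of the structure, not code from the presentation; the field names and the example case are assumptions:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """A test case pairs an input and an action/event with an expected response."""
    case_id: str        # e.g. "TC-001" (hypothetical naming scheme)
    traces_to: str      # the User Story / requirement this case validates
    given: dict         # Input: preconditions and test data
    when: str           # Action/Event: the step performed
    then: str           # Response: the expected, verifiable result
    status: str = "Not Run"

# A case written against the example user story used later in the deck
tc = TestCase(
    case_id="TC-001",
    traces_to="US001",
    given={"network": "net-A", "critical_alerts_generated": 3},
    when="Open the System Alerts webpage for net-A",
    then="Critical alert count for net-A shows 3",
)
assert tc.status == "Not Run"
```

Capturing all three parts explicitly is what rules out the “ad-hoc” approach: a case with no recorded input or expected response cannot validate anything.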
5. Verify our software conforms to specification
◦ Detect non-conformance
◦ Communicate non-conformance
◦ Contain non-conformance
Test activity accountability
◦ Test Planning
◦ Test Design
◦ Test Execution/Reporting
◦ Metrics
7. Test cases spawn from a Requirement/User
Story/Use Case of some sort. But first there
is the User Story, as in this case: a short
description of something that the end user
wants.
Example User Story
“US001: As a System User, I want to view the number
of critical alerts at any given time per network so
that I can monitor critical alerts more readily from
a webpage.”
8. Just visually check it to “validate” the User
Story!
◦ We call this “Check the box testing”
No test case or just posting the User Story
No definitive steps
No logical path
No real verifiable results
Conclusion “Yep, works as designed…I guess?”
9. Yes, the User Story seems broad
◦ Yeah, I know this stuff (tribal knowledge)
◦ I can just ask the developer, he/she might know.
◦ Blah, blah, blah…
So what is really effective???
11. How do we leverage this…working together?
◦ The User Story
Acceptance criteria
The boundaries of the User Story
Confirm the story is complete
Confirm the story will work
Gathering information as detailed in the interactions
◦ The Product Owner/System Architect
Ask questions: the what and the why
◦ The Developer
The how
12. “US001: As a System User, I want to
view the number of critical alerts at
any given time per network so that I
can monitor critical alerts more
readily from a webpage.”
13. User Story acceptance criteria…from
interaction with the Product Owner/System
Architect/Developer
◦ What type of critical alerts can be viewed
◦ What is the minimum and maximum count of
critical alerts that are viewable
◦ How much time does it require for a critical alert to
update
◦ How are alerts generated, and what components are involved
◦ Are the critical alerts accurate e.g. alert counts
◦ Are non-critical alerts viewable
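Acceptance criteria like these translate directly into concrete, verifiable checks. As a sketch, the minimum/maximum-count criterion might become the following; the bounds and the `viewable_count` function are hypothetical stand-ins agreed with the Product Owner, not values from the deck:

```python
# Hypothetical bounds from the Product Owner conversation (assumptions)
MIN_VIEWABLE = 0
MAX_VIEWABLE = 9999

def viewable_count(raw_count: int) -> int:
    """Stub of the webpage logic: clamp a raw alert count to the viewable range."""
    return max(MIN_VIEWABLE, min(raw_count, MAX_VIEWABLE))

# Each criterion becomes an unambiguous check with a definitive expected result
assert viewable_count(0) == 0          # minimum count is viewable
assert viewable_count(9999) == 9999    # maximum count is viewable
assert viewable_count(10500) == 9999   # counts above the max are capped, not garbage
```

The point is not the clamping itself but that a criterion phrased as a question (“What is the minimum and maximum count?”) ends up as an assertion with one right answer.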
14. Conversation – Details behind the story come
out through conversations with the Product
Owner/System Architect/Developer
Confirmation – Acceptance tests confirm the
story is finished and working as intended
Confidence - We are “done” with the
Requirement/User Story
15. User story acceptance criteria
◦ Get the team to think through how a feature or
piece of functionality will work from the user’s
perspective
◦ Remove ambiguity from requirements
◦ For QA, acceptance criteria form the tests that will
confirm that a feature or piece of functionality is
working and complete
16. Scope of User Story
◦ First, developers break down the user stories into
discrete, development-focused tasks that are necessary
to achieve what the User Story is describing.
◦ Knowledge of the tasks’ functions/participation at the
code design phase
◦ Must be “testable”
◦ Put yourself in the “shoes” of the person using the
feature
17. US001: As a System User, I want to view the number of critical alerts at any given time
per network so that I can monitor critical alerts more readily from a webpage.

| Test Step | Expected Results | Status | User Story/SCR |
| Generate and verify that critical alerts are present in “system being monitored” log files. | Critical alerts are present in log file. | Pass | US001 |
| Verify log file counts on each system. | Log file counts should match number of physical alerts generated. | Pass | US001 |
| On the System Alerts page, verify “system being monitored” counts are present. | Monitored systems should be present on System Alerts page. Each system should accurately display respective counts. | Fail | US001/SCR3245 |
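The manual steps in the table above could be automated along these lines. This is a minimal sketch: the log format, `count_critical`, and the `page_count_for` stand-in for the System Alerts page are all assumptions, not part of the presented system:

```python
import re

# Hypothetical sample from a "system being monitored" log file
LOG = """\
2024-01-05 10:00:01 CRITICAL disk failure on node-1
2024-01-05 10:00:07 INFO heartbeat ok
2024-01-05 10:02:13 CRITICAL fan failure on node-1
"""

def count_critical(log_text: str) -> int:
    """Steps 1-2: verify critical alerts are present in the log and count them."""
    return len(re.findall(r"\bCRITICAL\b", log_text))

def page_count_for(system: str) -> int:
    """Step 3: stand-in for reading the System Alerts page count for a system."""
    return {"node-1": 2}.get(system, 0)

generated = count_critical(LOG)
assert generated == 2                         # alerts are present in the log file
assert page_count_for("node-1") == generated  # page count matches the log count
```

Note how each assertion maps to one row of the table, so a failure (like the SCR3245 row) points at a specific step rather than a vague “doesn’t work.”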
18. Back to the “short list.” We now have in our test cases:
◦ Consistency/Structure
◦ Proper test configuration
◦ Understanding/No ambiguity
Exactly how the test is run
◦ Accurate representation of User Story/Requirement
traceability/validation
◦ Consistent test run results/indication of software
quality/conformance
◦ Better communication with stakeholders i.e. Product
owner, development, sales, customer etc. in verification
of software quality (test results)