Are Your Test Cases Fit For Automation?

This presentation deck is part of the webinar conducted on Nov 27/Dec 4, 2013 by T Ashok, Founder & CEO of STAG Software.

Published in: Technology

  1. Are your test cases fit for automation? Powered by HBT.
     Webinar presentation, Nov 27, 2013.
     T Ashok, Founder & CEO, STAG Software Private Limited.
     in.linkedin.com/in/AshokSTAG | Ash_Thiru
  2. Good automation is more than frameworks, tools and scripting. The inherent structure of scenarios/cases plays a vital role in ensuring rapid development and optimised maintenance. An ill-formed structure makes the script complex, and therefore error-prone and costly to maintain. We will discuss an interesting idea, "fitness for automation", wherein we examine properties of test scenarios/cases to ascertain whether they are good enough to automate, and what needs to be done when they are "unfit". This is one of the applications of Hypothesis Based Testing (HBT) to ensure rapid automation and optimised maintenance.
     © 2013 STAG Software Private Limited. All rights reserved.
  3. Expectation and Challenges
     ๏ Complexity of scenario
     ๏ Develop scripts rapidly
     ๏ Adapt to newer functionality quickly
  4. Structure of a script
     ๏ Setup: prerequisites to be done before "excitation"
     ๏ Execute: "excite" the system with inputs
     ๏ Oracle: check outcomes to determine pass/fail
     ๏ Cleanup: inverse of 'setup'
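The four-part structure on this slide can be sketched as a single Python test script. This is a minimal illustration, not the deck's own code; `SystemUnderTest` and its methods are hypothetical stand-ins for whatever your automation actually drives.

```python
class SystemUnderTest:
    """Hypothetical stand-in for the real application under test."""
    def __init__(self):
        self.accounts = {}

    def create_account(self, name, balance):
        self.accounts[name] = balance

    def transfer(self, src, dst, amount):
        self.accounts[src] -= amount
        self.accounts[dst] += amount

    def reset(self):
        self.accounts.clear()


def test_transfer():
    sut = SystemUnderTest()
    # Setup: prerequisites to be done before "excitation"
    sut.create_account("alice", 100)
    sut.create_account("bob", 0)
    # Execute: "excite" the system with inputs
    sut.transfer("alice", "bob", 40)
    # Oracle: check outcomes to determine pass/fail
    assert sut.accounts["alice"] == 60
    assert sut.accounts["bob"] == 40
    # Cleanup: inverse of 'setup'
    sut.reset()
    return True
```

Each of the four parts contributes to complexity independently, which is why the next slide examines them one by one.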
  5. What contributes to complexity?
     ๏ Setup: prerequisites to be done before "excitation" (#steps to do this)
     ๏ Execute: "excite" the system with inputs (#inputs and the kind of inputs (+/-); their order, dependency, source)
     ๏ Oracle: check outcomes to determine pass/fail (the variety of behaviours to check, and #checks for each behaviour)
     ๏ Cleanup: inverse of 'setup' (#steps to do this)
  6. "Clarity of purpose"
     What "type of bug" are we looking for? Answering this enables focused and simpler checking across the script's four parts: Setup, Execute, Oracle and Cleanup.
  7. "Clarity of purpose" (continued)
     ...results in 'just-enough' steps in each of Setup, Execute, Oracle and Cleanup, and enables a smaller, crisper script.
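As a sketch of what "clarity of purpose" buys, the Python fragment below narrows one script's purpose to a single defect type (invalid input handling), so its oracle is one crisp check rather than a pile of unrelated assertions. `validate_amount` is a hypothetical function under test, not part of the deck.

```python
def validate_amount(raw):
    """Hypothetical input validator: accepts positive integers only."""
    try:
        value = int(raw)
    except ValueError:
        return None
    return value if value > 0 else None


def test_rejects_negative_amount():
    # Purpose: detect input-cleanliness defects, and nothing else.
    # That clarity yields 'just-enough' steps: no login, no data
    # files, no UI navigation -- just the one focused check.
    assert validate_amount("-5") is None
    return True
```

A script with no stated purpose would be tempted to also assert on formatting, persistence and UI state here, inflating every one of the four parts.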
  8. Characteristics of TS that affect 'automatability'
     1. Clarity of purpose: what types of defects are being targeted by the test cases?
     2. Distinctiveness of behaviours (single output): do TS contain multiple behaviours, or are they single-minded?
     3. Similarity of inputs: are the various inputs for the TS similar?
  9. Quality Levels & PDTs, powered by HBT (characteristic 1: clarity of purpose)
     ๏ L9 End user value: that user expectations are met. PDTs related to user flows, user experience.
     ๏ L8 Clean deployment: that it deploys well in the real environment. PDTs related to compatibility, migration.
     ๏ L7 Attributes met: that the stated attributes are met. PDTs related to performance, load, volume...
     ๏ L6 Environment cleanliness: that it does not mess up the environment. PDTs related to resource leaks, compatibility...
     ๏ L5 Flow correctness: that end-to-end flows work correctly. PDTs related to flow behaviour, interactions.
     ๏ L4 Behaviour correctness: that the functional behaviour is correct. PDTs related to functionality.
     ๏ L3 Structural integrity: that the internal structure is robust. PDTs related to structural aspects.
     ๏ L2 Input interface cleanliness: that the user interface is clean. PDTs related to UI.
     ๏ L1 Input cleanliness: that inputs are handled well. PDTs related to input data correctness.
  10. Functional automation covers...
      [Slide repeats the L1-L9 quality-level ladder from the previous slide, highlighting the levels that functional automation covers; the highlighting does not survive in this text capture.]
  11. Scripts automate scenarios, not cases (characteristics 2 and 3: distinctiveness of behaviours, similarity of inputs)
      [Diagram: for an entity under test, conditions govern behaviour, and combinations of conditions (C1..C4) yield the test scenarios; input data (i1, i2, i3) act as stimuli, and their outcomes are the test cases.]
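One way to read "scripts automate scenarios, not cases" in code: a single Python script encodes the condition combination (the scenario), while the data rows that drive it are the test cases. This is an illustrative sketch; `discount` is a hypothetical behaviour, not from the deck.

```python
def discount(member, total):
    """Hypothetical behaviour: members get 10% off orders over 100."""
    return total - total // 10 if member and total > 100 else total


def run_scenario(cases):
    """One script for one scenario: 'member with a qualifying total'.

    The conditions (member=True, total > 100) define the scenario;
    each (member, total, expected) row is a test case reusing it.
    """
    return [discount(member, total) == expected
            for member, total, expected in cases]


# Test cases: same scenario, different input data (the stimuli).
cases = [
    (True, 200, 180),
    (True, 150, 135),
    (True, 110, 99),
]
```

Writing one script per case instead of one per scenario multiplies maintenance cost without adding any new checking.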
  12. Fitness for automation
      1. Does the TS look for a variety of defects? What types of defects are being targeted by the test cases?
      2. Does the TS check for a specific behaviour? Is a single behaviour being checked?
  13. What if TS are "unfit"?
      1. The TS is looking for a variety of defects: simplify the scenario by decomposing it into the various quality levels ("levelize").
      2. The TS is checking a variety of behaviours: make them single-minded by splitting the scenarios so that each TS checks a specific combination of conditions.
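The second remedy, splitting a multi-behaviour TS into single-minded ones, can be sketched in Python. The combined test below mixes two unrelated behaviours, so a failure is ambiguous; the split versions each carry one crisp oracle. `login` and `audit_entries` are hypothetical functions invented for this illustration.

```python
def login(user, password):
    """Hypothetical: returns a session token on success, else None."""
    return "token" if password == "secret" else None


def audit_entries(events):
    """Hypothetical: count login events recorded in an audit log."""
    return sum(1 for e in events if e == "login")


# Unfit: one script checks two behaviours, so the oracle is compound
# and a failure does not say which behaviour broke.
def test_login_and_audit_combined():
    events = []
    token = login("alice", "secret")
    events.append("login")
    assert token is not None and audit_entries(events) == 1
    return True


# Fit: each split script targets exactly one behaviour.
def test_login_succeeds():
    assert login("alice", "secret") is not None
    return True


def test_audit_records_login():
    assert audit_entries(["login"]) == 1
    return True
```

The split scripts are also cheaper to maintain: a change to the audit log touches one small script instead of every combined test that happened to assert on it.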
  14. Hypothesis Based Testing (HBT)
      [Diagram: the System Under Test should satisfy Cleanliness Criteria, which are impeded by Potential Defect Types. Test cases carry requirements traceability ("what to test") and fault traceability ("test for what").]
  15. HBT Overview
      SIX staged purposeful activities, powered by EIGHT disciplines of thinking.
      SIX Stages of DOING: S1: Understand expectations; S2: Understand context; S3: Formulate hypothesis; S4: Devise proof; S5: Tooling support; S6: Assess & analyse.
      EIGHT Disciplines of Thinking: D1: Business value understanding; D2: Defect hypothesis; D3: Strategy & planning; D4: Test design; D5: Tooling; D6: Visibility; D7: Execution & reporting; D8: Analysis & management.
      Uses 32 core concepts for problem solving: techniques, principles, guidelines.
      To know more about HBT: http://stagsoftware.com/blog?p=570
  16. Are your test cases fit for automation? Powered by HBT.
      Thank you.
      Connect with us: @stagsoft | blog.stagsoftware.com | www.stagsoftware.com
      HBT is the intellectual property of STAG Software Private Limited. STEM™ is the trademark of STAG Software Private Limited.
      © 2013 STAG Software Private Limited. All rights reserved.
