Do you know the potency of your test cases?
This slide deck from STAG Software highlights that good testing is about uncovering the faults that have the potential to cause severe failures. The big question is: "how good are the test cases at detecting these faults?"

Presentation Transcript

    • Do you know the "Potency" of your test cases? T Ashok, ash@stagsoftware.com, in.linkedin.com/in/AshokSTAG, ash_thiru. Webinar: Jan 19, 2012, 1430-1530 IST.
    • Potency is... "the strength of the drug, as measured by the amount needed to produce a certain response" (dictionary definition). It is associated with the efficacy of an entity: the ability or capacity to achieve or bring about a particular result.
    • (Diagram) The drug analogy: drug(s), the ability to get to the target, the affected area, and the 'bug'.
    • (Diagram) The analogy raises questions: Where is the target? Who is the target? Is it strong enough? Have I developed immunity?
    • (Diagram) Mapping the analogy to testing. Where is the target? The requirements, features, modules, components and subsystems we are looking in. Who is the target? The types of defects we are looking for. Is it strong enough? Drugs ≈ test cases; how "strong" are the test cases? Have I developed immunity? Over time, some parts of the system "harden", implying that certain issues may no longer be there.
    • Potency is about: the least number of test cases with the ability to target specific types of defects in specific areas, to ensure "clean software". (Diagram) "Area" (where to target): use case, feature, entity, subsystem, module, screen, API. "Drug": test cases. "Bug" (who to target): PDT, Potential Defect Type. Immunity: resistant to bugs, i.e. hardened entities.
    • How "potent"? The typical approach. (Diagram) The "Area" is addressed through requirements traceability ("external area that I am covering") and code coverage ("internal area that I am covering"); the "Drug" is the test cases; the "Bug" (PDT, Potential Defect Type) and Immunity (resistance to bugs, i.e. hardened entities) are left as question marks.
    • How "potent"? The typical approach, continued. (Diagram) The least number of test cases with the ability to target specific types of defects in specific areas to ensure "clean software" is typically arrived at based on experience: "Trust me!" The "Bug" (PDT) and Immunity remain question marks.
    • Assessing potency. (Diagram) The "Area" is still covered by requirements traceability ("external area that I am covering") and code coverage ("internal area that I am covering"); the "Bug" (PDT, Potential Defect Type) is assessed through Fault Coverage ("what PDTs are uncovered by the test cases"), Levelization ("optimal targeting"), Countability ("proving sufficiency of test cases") and Conformance:Robustness ("distribution of +ve/-ve test cases"); Immunity (resistance to bugs, i.e. hardened entities) is assessed through Test Case Immunity ("no defect yield from test cases").
    • HBT: Hypothesis Based Testing, a quick introduction. A personal, scientific test methodology: a SIX-stage methodology powered by EIGHT disciplines of thinking (STEM™). (Diagram labels) Setup; Hypothesize; Cleanliness Criteria; Potential Defect Types; SUT; Nine-Stage Cleanliness Assessment; Defect Removal Filter. To know more about HBT: http://stagsoftware.com/blog?p=570
    • Targeting... (Diagram) The entity under test should satisfy the cleanliness criteria and is impeded by potential defect types; test cases trace to requirements via requirements traceability ("what to test") and to defects via fault traceability ("test for what"). It is necessary for test cases to be purposeful/goal-focused. Tracing test cases to requirements is a necessary condition for this, but not sufficient! Two-way tracing, i.e. requirements + FAULT, makes it sufficient.
    • Snapshot of defect hypothesis: aspects versus views (error injection, fault proneness, failure).
      Data: What kinds of erroneous data may be injected? What kind of issues could data cause? What kinds of bad data can be generated?
      Business logic: What conditions/values can be missed? How can conditions be messed up? What can be incorrect results when conditions are combined?
      Structure: How can we set up incorrect "structure"? How can structure mess up the behaviour? What kinds of structure can yield incorrect results?
      Environment: What is an incorrect environment setup? How can resources in the environment cause problems? How can the environment be messed up?
      Usage: In what ways can we use the entity interestingly? What kinds of usage may be inherently faulty? What can be a poor usage experience?
    • Fault coverage - "what PDTs are uncovered by the test cases". (Diagram) Fault traceability ("test for what"): requirements R1..Rm trace to test cases TC1..TCi, and the test cases trace to PDT1..PDTn; requirements traceability answers "what to test". Do the test cases for a given entity have the ability to detect specific types of defects?
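      To make the two-way tracing concrete, here is a minimal fault-coverage check in Python. It is a sketch, not from the deck: the requirement IDs, PDT names and test-case tags are all hypothetical. Each test case is traced both to a requirement ("what to test") and to the PDTs it targets ("test for what"), and the script reports, per requirement, the hypothesised defect types that no test case targets.

      from collections import defaultdict

      # PDTs hypothesised for each requirement (the defect hypothesis).
      # All identifiers below are illustrative.
      hypothesised_pdts = {
          "R1": {"missing-boundary-check", "bad-error-message"},
          "R2": {"race-condition", "resource-leak"},
      }

      # Test cases tagged with the requirement and the PDTs they target.
      test_cases = [
          {"id": "TC1", "req": "R1", "pdts": {"missing-boundary-check"}},
          {"id": "TC2", "req": "R2", "pdts": {"resource-leak"}},
      ]

      def uncovered_pdts(hypothesised, tests):
          """Return, per requirement, the PDTs that no test case targets."""
          covered = defaultdict(set)
          for tc in tests:
              covered[tc["req"]].update(tc["pdts"])
          return {req: pdts - covered[req] for req, pdts in hypothesised.items()}

      for req, gaps in uncovered_pdts(hypothesised_pdts, test_cases).items():
          print(req, "uncovered PDTs:", sorted(gaps) or "none")

      With this illustrative data the check would flag "bad-error-message" for R1 and "race-condition" for R2 as fault-coverage gaps.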
    • Levelization - "optimal targeting": staged "filtration".
      L9 End user value: that user expectations are met (user flows, experience).
      L8 Clean deployment: that it deploys well in the real environment (compatibility, migration).
      L7 Attributes met: that the stated attributes are met (performance, security, volume, load...).
      L6 Environment cleanliness: that it does not mess up the environment (resource leaks, compatibility...).
      L5 Flow correctness: that end-to-end flows work correctly (business flow conditions, linkages).
      L4 Behaviour correctness: that the functional behaviour is correct (functionality).
      L3 Structural integrity: that the internal structure is robust (internal structural issues).
      L2 Input interface cleanliness: that the user interface is clean (UI issues).
      L1 Input cleanliness: that inputs are handled well (input data handling).
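      The staged filtration can be read as: run the levels bottom-up and only move to the next level once the current one stops yielding defects. The sketch below assumes that reading; the run_level callable is a hypothetical stand-in for whatever executes the test cases of a level.

      # Staged-filtration sketch: run cleanliness levels L1..L9 in order and
      # stop at the first level that still yields defects.
      LEVELS = [
          ("L1", "Input cleanliness"),
          ("L2", "Input interface cleanliness"),
          ("L3", "Structural integrity"),
          ("L4", "Behaviour correctness"),
          ("L5", "Flow correctness"),
          ("L6", "Environment cleanliness"),
          ("L7", "Attributes met"),
          ("L8", "Clean deployment"),
          ("L9", "End user value"),
      ]

      def assess(run_level):
          """Run each level in order; stop at the first one that finds defects."""
          for level_id, name in LEVELS:
              defects = run_level(level_id)   # returns the defects found at this level
              print(f"{level_id} ({name}): {len(defects)} defect(s)")
              if defects:
                  return level_id             # filtration is blocked here
          return "clean"

      # Stubbed runner that finds defects only at L3.
      print(assess(lambda level_id: ["bug"] if level_id == "L3" else []))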
    • Countability - "proving sufficiency of test cases". (Diagram) Conditions govern the behaviour of the entity under test and input (data) is the stimuli; the outcome of the conditions is the test scenarios, and the outcome of the inputs is the test cases. If the number of conditions and the values each condition can take are known, then the number of scenarios can be proved to be sufficient. For each scenario, knowing the number of values for each input can prove the sufficiency of the test cases.
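      As a rough illustration of this counting argument (a sketch with assumed conditions, values and inputs, not the deck's worked example), the simplest sufficiency bound enumerates every combination of condition values to get the scenario count, and every combination of input equivalence classes to get the per-scenario test-case count:

      from itertools import product
      from math import prod

      # Conditions governing behaviour and the values each can take (illustrative).
      conditions = {
          "user_type": ["guest", "member"],
          "cart_state": ["empty", "has_items"],
          "payment": ["card", "wallet", "cod"],
      }

      # Every combination of condition values is one test scenario: 2 * 2 * 3 = 12.
      scenarios = list(product(*conditions.values()))
      print("scenarios needed:", len(scenarios))

      # For one scenario, the equivalence-class values chosen for each input
      # determine how many test cases that scenario needs: 4 * 3 = 12.
      inputs_for_scenario = {
          "quantity": ["0", "1", "max", "max+1"],   # includes +ve and -ve classes
          "coupon": ["valid", "expired", "none"],
      }
      print("test cases for this scenario:",
            prod(len(values) for values in inputs_for_scenario.values()))

      In practice the combinations are usually pruned (e.g. pairwise), but the counts above show why sufficiency becomes provable once conditions and values are enumerated.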
    • Conformance:Robustness - "distribution of +ve/-ve test cases". (Diagram) How do the test cases distribute across levels L1..L9? Note that at higher levels the test cases progressively become more conformance-oriented. What do you think the ratio of +ve to -ve test cases depends on?
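      One simple way to look at that distribution (a sketch with made-up level tags and polarities, not the deck's data) is to tally test cases per level by whether they are conformance (+ve) or robustness (-ve):

      from collections import Counter

      # Each test case tagged with its cleanliness level and polarity (illustrative).
      test_cases = [
          ("L1", "-ve"), ("L1", "-ve"), ("L1", "+ve"),
          ("L4", "+ve"), ("L4", "+ve"), ("L4", "-ve"),
          ("L9", "+ve"), ("L9", "+ve"),
      ]

      tally = Counter(test_cases)
      for level in sorted({level for level, _ in test_cases}):
          positive, negative = tally[(level, "+ve")], tally[(level, "-ve")]
          print(f"{level}: +ve={positive} -ve={negative}")

      With tags like these, the lower levels skew towards -ve (robustness) cases and the higher levels towards +ve (conformance) cases, which is the shift the slide points at.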
    • Test case immunity - "no defect yield from test cases". Analysis of 'defect type trends' over 'time' by 'entity under test' helps us understand what types of defects are no longer present in which entities. Example: (1) As the product matures, certain types of defects become harder to uncover. Say at Level-1, where the focus is on data validation issues, these kinds of issues no longer surface in later builds; this implies that the system has developed immunity to this type of defect. (2) Continuing from (1), it is plausible that one of the entities still has data validation issues; hence the phrase 'entity under test' in 'defect type trends' over 'time' by 'entity under test'.
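      A minimal sketch of that trend analysis (the defect log, build numbers and threshold are hypothetical, not from the deck): group the defect log by entity and defect type, and flag pairs that have stopped yielding defects for a few builds as candidate "immune", i.e. hardened, areas.

      from collections import defaultdict

      # (build, entity, defect type) records from past test cycles (illustrative).
      defect_log = [
          (1, "login", "data-validation"), (1, "cart", "data-validation"),
          (2, "cart", "data-validation"),  (2, "login", "flow"),
          (3, "cart", "data-validation"),  (3, "login", "flow"),
      ]

      def immune_pairs(log, latest_build, quiet_builds=2):
          """(entity, defect type) pairs with no defects in the last quiet_builds builds."""
          last_seen = defaultdict(int)
          for build, entity, defect_type in log:
              last_seen[(entity, defect_type)] = max(last_seen[(entity, defect_type)], build)
          return [pair for pair, build in last_seen.items()
                  if latest_build - build >= quiet_builds]

      print("candidate hardened areas:", immune_pairs(defect_log, latest_build=3))

      With this log, ("login", "data-validation") is flagged: the login entity has stopped yielding data-validation defects while the cart entity still does, mirroring the example above.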
    • Summarising... Potency is the least number of test cases with the ability to target specific types of defects in specific areas to ensure "clean software". (Diagram) The "Area" is covered by requirements traceability ("external area that I am covering") and code coverage ("internal area that I am covering"); the "Drug" is the test cases; the "Bug" (PDT, Potential Defect Type) is assessed through Fault Coverage ("what PDTs are uncovered by the test cases"), Levelization ("optimal targeting"), Countability ("proving sufficiency of test cases") and Conformance:Robustness ("distribution of +ve/-ve test cases"); Immunity (resistance to bugs, i.e. hardened entities) is assessed through Test Case Immunity ("no defect yield from test cases").
    • Results: discovery of missing test cases; the ability to assess the quality of test cases statically; improved confidence and the ability to logically convince the customer about effectiveness.
    • Thank you! Connect with us: @stagsoft, blog.stagsoftware.com, www.stagsoftware.com. Covered in detail in "Effective review of test cases", an HBT-series workshop. www.cleansoft.in, a division of STAG. HBT is the intellectual property of STAG Software Private Limited. STEM™ is the trademark of STAG Software Private Limited.