An introduction to Hypothesis Based Testing

A scientific personal test methodology to delivering clean software.

Transcript

  • 1. Introduction to HBT ("Hypothesis Based Testing"): a scientific personal test methodology for delivering clean software. Copyright STAG Software Private Limited, 2010-11, www.stagsoftware.com
  • 2. Test methodologies in vogue focus on activities, are driven by process, are powered by tools, and depend on experience. In other words, they focus on activities that are driven by a process and powered by tools, yet successful outcomes still depend a lot on experience.
  • 3. Testing: a fishing analogy. Assume that you are a fisherman (or fisherwoman). You fish in a lake and make Rs 100 per day. What do you have to do to get higher returns, say Rs 200 per day? Constraints: you cannot fish anywhere else, control the selling price, or increase the production of fish. What would you do to extract higher business value (or returns)?
  • 4. HBT in a nutshell: focus on the goal, and then on the activities.
    - What defects are you looking for? (What 'fishes' to catch)
    - When is each detectable earliest? Formulate a staged quality growth model. (When and where to catch)
    - Create a 'complete' set of test cases. (Big net, small holes)
    - Use appropriate tooling; measure the right stuff and course correct. (Cover more, move fast, course correct)
  • 5. Hypothesis-based Testing is a scientific personal test methodology, powered by a defect detection technology, that enables an individual to rapidly and effectively deliver "clean software". What is clean software? Identify the 'cleanliness criteria' (the goal), then: Hypothesize (PDTs), Devise Proof, Tooling Support, Assess & Analyze.
  • 6. How is HBT different from other methodologies? In a typical methodology, activities powered by experience hopefully result in the goal. In HBT, the goal drives the activities, which are powered by a defect detection technology (STEM).
  • 7. Comparing methodologies. Attributes of a methodology: engineering (effective, efficient) and management (consistent, scalable, visible, agile); the slide rates each methodology against these attributes.

    Methodology          | Characteristic
    Process driven       | Well laid out process
    Domain centered      | Experience based
    Ad-hoc / Exploratory | Individual creativity and analytical skills
    Automation driven    | Tool based
    Agile                | Frequent evaluation
    HBT                  | Scientific & goal-focussed
  • 8. Step 1: Be clear where you want to go (a clear goal). From the marketplace, environment, end users, and requirements, derive the expectations: business value, needs, features, attributes, usage. "How good?" is answered by the cleanliness criteria. Example: clean water implies (1) colourless, (2) no suspended particles, (3) no bacteria, (4) odourless. This accelerates understanding/ramp-up.
  • 9. Steps 2 and 3: Know the route clearly; use learnings from others. Hypothesize: what types of defects do I need to uncover? From the cleanliness criteria, derive the potential defect types. Example: data validation, timeouts, resource leakage, calculation, storage, presentation, transactional, and so on. This accelerates goal clarity.
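The hypothesizing step above can be sketched in code: a tiny potential-defect-type (PDT) catalog plus a "negative thinking" test aimed at one PDT. This is an illustrative sketch, not from the deck; the `validate_age` function and the exact PDT labels are invented for the example.

```python
# Hypothetical PDT catalog, in the spirit of the slide's examples.
PDT_CATALOG = {
    "PDT1": "Data validation",
    "PDT2": "Timeouts",
    "PDT3": "Resource leakage",
    "PDT4": "Calculation",
}

def validate_age(value):
    """Accept integer ages in [0, 130]; reject everything else (hypothetical)."""
    if not isinstance(value, int) or isinstance(value, bool):
        return False
    return 0 <= value <= 130

# Negative (robustness) cases hypothesized from PDT1 "Data validation":
negative_cases = [-1, 131, "42", None, True]
assert all(not validate_age(v) for v in negative_cases)

# Positive (conformance) cases:
assert all(validate_age(v) for v in (0, 65, 130))
```

The point is that each case is derived from a hypothesized defect type, not from the happy path alone.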
  • 10. Step 4: Find a shorter (optimal) route. Arrange the potential defect types (PDT1..PDT7) into staged quality levels (QL1..QL3) and map them to test types (TT1..TT5): staged and purposeful detection. This optimizes testing.
  • 11. Steps 5 and 6: Drive carefully; detour less. Trace requirements (R1..R3) through potential defect types (PDT1..PDT3) to test scenarios/cases (TS1: TC1,2,3; TS2: TC4,5,6,7). Requirements and fault traceability yield staged, purposeful detection and complete test cases. This covers more ground.
  • 12. Steps 7 and 8: Buy a faster vehicle; use a good vehicle and fuel (good technology). Tooling and scripts give staged, purposeful detection and better ROI: sensible automation. Move fast.
  • 13. Steps 9 and 10: Keep track of where you are; negotiate trouble quickly. A quality index over the quality levels (QL1..QL3), built on complete test cases and sensible automation, shows quality, progress and risk: goal directed measures. Course correct quickly.
  • 14. Summary: cleanliness criteria accelerate understanding and goal clarity; potential defect types optimize testing; staged and purposeful detection flows from expectations; complete test cases cover more ground; sensible automation moves fast; goal directed measures help course correct quickly.
  • 15. HBT pictorial (diagram): expectations and needs feed the cleanliness criteria (functional and non-functional aspects), which yield potential defect types (PDTs); PDTs are staged into quality levels (QL1..QL3) and mapped to test types (TT1..TT5); requirements trace through PDTs to test scenarios/cases (R1..R3, TS1/TS2, TC1..7), supported by tooling and scripts and measured by a quality index.
  • 16. Hypothesis Based Testing (HBT): a goal focused methodology for validation, consisting of SIX stages of "doing": S1 Understand Expectations, S2 Understand Context, S3 Formulate Hypothesis, S4 Devise Proof, S5 Tooling Support, S6 Assess & Analyze. The central theme of HBT is "hypothesize potential defects that can cause loss of expectations, and prove that they will not exist". The focus is on the goal and how we shall achieve it, rather than on the various activities, i.e. goal-centric vs. activity-based.
  • 17. The SIX stages of "doing" are powered by EIGHT thinking disciplines (D1..D8), toward the goal 'deliver clean software quickly and cost-effectively'. HBT is the "methodology", a system of ways of doing: a goal centered scientific approach to validation. STEM is the "method", a particular way of doing something: the defect detection technology from STAG.
  • 18. STEM 2.0 (STAG Test Engineering Method) consists of EIGHT disciplines and THIRTY-TWO scientific concepts (the STEM Core): D1 Business value understanding, D2 Defect hypothesis, D3 Strategy & planning, D4 Test design, D5 Tooling, D6 Visibility, D7 Execution & defect reporting, D8 Analysis & management. A discipline consists of steps, each of which is aided by scientific concept(s).
  • 19. STEM Core provides the scientific basis: 32 core concepts.
    - D1 Business value understanding: Landscaping, Viewpoints, Reductionist principle, Interaction matrix, Operational profiling, Attribute analysis, GQM
    - D2 Defect hypothesis: EFF model (Error-Fault-Failure), Defect centricity principle, Negative thinking, Orthogonality principle, Defect typing
    - D3 Test strategy & planning: Tooling needs assessment, Defect centered AB, Quality growth principle, Techniques landscape, Process landscape
    - D4 Test design: Reductionist principle, Orthogonality principle, Input granularity principle, Box model, Behavior-Stimuli approach, Techniques landscape, Complexity assessment, Operational profiling, Test coverage evaluation
  • 20. STEM Core (continued).
    - D5 Tooling: Automation complexity assessment, Minimal babysitting principle, Separation of concerns, Tooling needs analysis
    - D6 Visibility: GQM, Quality quantification model
    - D7 Execution & reporting: Contextual awareness, Defect rating principle
    - D8 Analysis & management: Gating principle, Cycle scoping
  • 21. Needs & expectations. Needs are construction oriented: they result in features in the software, implemented by developers using technology(ies), against user requirements/technical specifications. Expectations state how well the needs should be met; they are normally not as clear or purposeful, and they are the focus of the test staff. HBT enables extraction of cleanliness criteria from expectations to set up a clear goal.
  • 22. HBT overview. Inputs: needs, expectations, user types, requirements, features, attributes, usage profile, business logic, data. Working artifacts: risk assessment, cleanliness criteria, potential defect types, quality levels, test types, test techniques, test scenarios/cases, requirement traceability, fault traceability. Outputs: quality index, test outcome, cycle scoping, test scripts, tooling architecture, metrics.
  • 23. A pictorial on HBT: the same diagram as slide 15, from expectations through cleanliness criteria, potential defect types, quality levels, test types and test scenarios/cases, with tooling, scripts and a quality index.
  • 24. S1: Understand Expectations. Understand the marketplace for the software; understand the deployment environment; understand the technology(ies) used; identify end user types and the number of users of each type; identify business requirements for each user type.
  • 25. S1: Understand Expectations (activities as on slide 24). STEM Core concepts applied: Landscapes, Viewpoints.
  • 26. S1: Understand Expectations. Outcomes: overview document/feature map, requirement list, user type list.
  • 27. S2: Understand Context. Identify technical dependencies; understand features and baseline them; understand the profile of usage; identify critical success factors; prioritize the value of end user(s) and features; ensure attributes are testable; set up cleanliness criteria.
  • 28. S2: Understand Context (activities as on slide 27). STEM Core concepts applied: Reductionist principle, Interaction matrix, Operational profiling, Attribute analysis, GQM (Goal-Question-Metric).
  • 29. S2: Understand Context. Outcomes: feature prioritization matrix, value list, usage profiles, key attributes list, cleanliness assessment criteria.
  • 30. S3: Formulate Hypothesis. Identify potential defects (PD) due to technology; identify PD due to data and logic; identify potential faults based on usage and structure; identify potential failures and therefore PD; identify error injection opportunities and therefore PD; group PDs to form potential defect types (PDTs); map PDTs to requirements/features. (PD = potential defects; PDT = potential defect types.)
  • 31. S3: Formulate Hypothesis (activities as on slide 30). STEM Core concepts applied: EFF model (Error-Fault-Failure), Defect centricity principle, Negative thinking, Orthogonality principle, Defect typing.
  • 32. S3: Formulate Hypothesis. Outcomes: potential defects catalog, fault propagation chart, fault traceability matrix.
  • 33. S4: Devise Proof, part 1 of 3 (test strategy and plan). Understand the scope; formulate quality levels; identify the types of test to be performed; identify test techniques; identify risks; identify the defect detection process; estimate effort using defect based activity breakdown; formulate cycles and their scope; identify tooling needs.
  • 34. S4: Devise Proof, part 1 of 3 (activities as on slide 33). STEM Core concepts applied: Orthogonality principle, Tooling needs assessment, Defect centered AB, Quality growth principle, Techniques landscape, Process landscape.
  • 35. S4: Devise Proof, part 1 of 3. Outcomes: test strategy, test plan.
  • 36. S4: Devise Proof, part 2 of 3 (test design). Identify the test level to design for and identify entities; partition each entity, considering and understanding its business logic/data; model the intended behavior semi-formally; generate the test scenarios; for each scenario, generate test cases; trace the scenarios to the PDTs (requirement tracing is built in); assess test adequacy by fault coverage analysis; refine scenarios/cases using structural properties.
  • 37. S4: Devise Proof, part 2 of 3 (activities as on slide 36). STEM Core concepts applied: Reductionist principle, Input granularity principle, Box model, Behavior-Stimuli approach, Techniques landscape, Complexity assessment, Operational profiling, Test coverage evaluation.
  • 38. S4: Devise Proof, part 2 of 3. Outcomes: test scenarios and test cases (conforming to the HBT test case architecture), fault traceability matrix, requirements traceability matrix.
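The "partition each entity, then generate scenarios and cases" step of test design can be sketched as equivalence partitioning over entity attributes, with one candidate case per combination. This is a minimal illustrative sketch; the partition names and values are invented, not taken from the deck.

```python
from itertools import product

# Hypothetical partitions for one entity under test; each value is an
# equivalence class (boundary-derived for "amount").
partitions = {
    "user_type": ["admin", "guest"],
    "amount": ["zero", "typical", "over_limit"],
}

# Each combination of partition classes becomes one candidate test case.
cases = [dict(zip(partitions, combo)) for combo in product(*partitions.values())]

assert len(cases) == 6  # 2 user types x 3 amount classes
assert {"user_type": "guest", "amount": "over_limit"} in cases
```

In practice the combinations would then be pruned and refined using the structural properties the slide mentions, rather than all being kept.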
  • 39. S4: Devise Proof, part 3 of 3 (metrics design). Identify progress aspects and adequacy (coverage) aspects; for each aspect, identify the intended goal to meet; for each of these goals, identify questions to ask; to answer these questions, identify metrics; identify when you want to measure and how to measure.
  • 40. S4: Devise Proof, part 3 of 3 (activities as on slide 39). STEM Core concepts applied: GQM, Quality quantification model.
  • 41. S4: Devise Proof, part 3 of 3. Outcomes: measurements chart.
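The Goal-Question-Metric chain in the metrics-design slides can be made concrete with a toy example: one goal, two questions, and the metrics that answer them. All numbers here are hypothetical, chosen only to illustrate the derivation.

```python
# Hypothetical execution counts.
executed, planned, passed = 120, 150, 108

# Goal: track test progress.
# Question: how far along is execution?  Metric: execution progress %.
progress_pct = 100 * executed / planned

# Question: of what ran, how much met expectations?  Metric: pass rate %.
pass_rate_pct = 100 * passed / executed

assert round(progress_pct) == 80
assert round(pass_rate_pct) == 90
```

The design choice is that metrics are derived last: they exist only to answer a question that serves a stated goal, never the other way around.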
  • 42. S4: Devise Proof: quality levels and test types (diagram). Quality levels QL1..QL4 stack the potential defect types PDT1..PDT10 and map them to test types TT1..TT8; cleanliness grows stage by stage.
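The staged quality-growth idea behind this diagram can be sketched as a gating check: a quality level counts as reached only when every PDT assigned to it has been cleared. The QL-to-PDT assignment below is illustrative, not the deck's exact mapping.

```python
# Hypothetical staging of potential defect types into quality levels.
quality_levels = {
    "QL1": ["PDT1", "PDT2", "PDT3"],
    "QL2": ["PDT4", "PDT5", "PDT6"],
    "QL3": ["PDT7", "PDT8"],
}
cleared = {"PDT1", "PDT2", "PDT3", "PDT4"}  # PDTs proven absent so far

def highest_level_met(levels, cleared):
    """Return the highest consecutive level whose PDTs are all cleared."""
    met = None
    for level, pdts in levels.items():  # dicts preserve insertion order
        if not all(p in cleared for p in pdts):
            break
        met = level
    return met

assert highest_level_met(quality_levels, cleared) == "QL1"
```

Clearing PDT4 alone does not advance the level: QL2 is gated until PDT5 and PDT6 are also cleared, which is the "staged and purposeful detection" point.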
  • 43. S4: Devise Proof: requirements traceability. Every test case is mapped to a requirement (R1-TC1, R2-TC2, R3-TC3, ..., Rm-TCi); equivalently, every requirement does indeed have a test case. The intention is to ensure that each requirement can indeed be validated. It is seen as an indicator of "test adequacy".
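The requirements traceability check is simple to express in code: every requirement must map to at least one test case. A minimal sketch, with invented requirement and test-case IDs:

```python
# Hypothetical requirements traceability matrix (RTM): requirement -> test cases.
rtm = {"R1": ["TC1"], "R2": ["TC2", "TC4"], "R3": []}

# Requirements with no test case cannot be validated.
uncovered = [r for r, tcs in rtm.items() if not tcs]
assert uncovered == ["R3"]
```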
  • 44. S4: Devise Proof: fault traceability. Map potential defects to requirements (PD1-R1, ..., PDn-Rm) and map test cases to the potential defects they can detect (TC1-PD1, ..., TCi-PDn). Tracing the potential defects to the requirements and test cases is fault traceability; it allows us to understand that the intended potential defects can indeed be uncovered.
  • 45. Requirement and fault traceability: requirements traceability is "necessary but not sufficient". Assume that each requirement had just one test case. This implies that we have a good RTM, i.e. each requirement has been covered. What we do not know is whether there could be additional test cases needed for some of the requirements. So the RTM is a necessary condition but NOT a sufficient one. What does it take to be sufficient? If we had a clear notion of the types of defects that could affect the customer experience, and then mapped these to test cases, we would have a Fault Traceability Matrix (FTM, as proposed by HBT). This allows us to be sure that our test cases can indeed detect those defects that will impact customer experience.
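The "necessary but not sufficient" argument can be demonstrated with a sketch of the fault traceability check: beyond "each requirement has a test case", verify that every hypothesized potential defect is detectable by some test. IDs below are invented for illustration.

```python
# Hypothetical fault traceability data.
pd_to_requirement = {"PD1": "R1", "PD2": "R2", "PD3": "R3"}
testcase_detects = {"TC1": ["PD1"], "TC2": ["PD2"], "TC3": []}

# Which potential defects can any test case actually catch?
detectable = {pd for pds in testcase_detects.values() for pd in pds}
undetectable = sorted(set(pd_to_requirement) - detectable)

# Here every requirement has a test case (RTM passes), yet PD3 would slip
# through: the FTM view exposes the gap the RTM view cannot.
assert undetectable == ["PD3"]
```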
  • 46. Key concepts: focused defect identification. Each PDT can be detected by a specific test technique, and a test is a collection of evaluation scenarios (ES), each focussed on uncovering certain PDTs. Example: ES1 and ES2, enabled by technique TT1, will uncover PDT1; ES3 and ES4 uncover PDT2. Quality (cleanliness) grows with stage (time) as the PDTs are detected.
  • 47. STEM test case architecture. Test cases are categorized by quality level (QL1..QL3) and then by test type, against the potential defect types. This results in: excellent clarity; purposefulness (i.e. defect oriented); clear insight into quality; higher coverage.
  • 48. STEM test case architecture: organized by quality levels, sub-ordered by items (features/modules, ...), segregated by type, ranked by importance/priority, sub-divided into conformance (+) and robustness (-), classified by early (smoke)/late-stage evaluation, tagged by evaluation frequency, linked by optimal execution order, and classified by execution mode (manual/automated). A well architected set of test cases is like effective bait that can 'attract defects' in the system. It is equally important to ensure that the cases are well organized to enable execution optimization, and that they carry the right set of information to ensure easy automation.
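The "organized by level, segregated by type, ranked by priority" scheme amounts to a sort key over test-case records. A minimal sketch, with invented record fields and IDs:

```python
# Hypothetical lightweight test-case records.
cases = [
    {"id": "TC3", "ql": 2, "type": "robustness",  "priority": 1},
    {"id": "TC1", "ql": 1, "type": "conformance", "priority": 1},
    {"id": "TC2", "ql": 1, "type": "robustness",  "priority": 2},
]

# Order by quality level first, then type, then priority.
ordered = sorted(cases, key=lambda c: (c["ql"], c["type"], c["priority"]))
assert [c["id"] for c in ordered] == ["TC1", "TC2", "TC3"]
```

The same record fields (mode, frequency, smoke/late-stage tags) would also drive the execution-order and automation decisions the slide lists.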
  • 49. Test adequacy analysis. Breadth: the types of tests. Depth: the quality levels (QL1..QL4). Porosity: test case "fine-ness", including the split between conformance and robustness cases.
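The three adequacy views can be computed mechanically from a tagged test-case list: breadth from distinct test types, depth from distinct quality levels, and a porosity proxy from the conformance/robustness split. Data below is invented for illustration.

```python
# Hypothetical cases tagged as (test type, quality level, +/-).
cases = [
    ("TT1", "QL1", "+"), ("TT1", "QL1", "-"),
    ("TT2", "QL2", "+"), ("TT3", "QL2", "-"),
]

breadth = len({tt for tt, _, _ in cases})          # distinct test types
depth = len({ql for _, ql, _ in cases})            # distinct quality levels
robustness_share = sum(1 for c in cases if c[2] == "-") / len(cases)

assert (breadth, depth) == (3, 2)
assert robustness_share == 0.5  # half the cases probe robustness
```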
  • 50. Clear assessment (better visibility). Clear assessment implies that we are able to objectively state that an element under test is able to meet the intended cleanliness criteria. The quality report is a matrix of elements (E1..E5) against cleanliness criteria (CC1..CC4), with each cell marked met, partially met, or not met.
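The quality report matrix can be modelled as a nested mapping, with "clean" meaning every criterion is fully met for that element. A sketch with invented element/criterion values:

```python
# Hypothetical quality report: element -> {cleanliness criterion: status}.
report = {
    "E1": {"CC1": "met", "CC2": "met",     "CC3": "not met"},
    "E2": {"CC1": "met", "CC2": "partial", "CC3": "met"},
}

def clean(element):
    """An element is clean only when every criterion is fully met."""
    return all(status == "met" for status in report[element].values())

assert not clean("E1") and not clean("E2")
assert [e for e in report if clean(e)] == []  # nothing fully clean yet
```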
  • 51. S5: Tooling Support. Perform tooling benefit analysis; identify the automation scope; assess automation complexity; identify the order in which scenarios need to be automated; evaluate tools; design the automation architecture; develop scripts; debug and baseline scripts.
  • 52. S5: Tooling Support (activities as on slide 51). STEM Core concepts applied: Automation complexity assessment, Minimal babysitting principle, Clear separation of concerns principle.
  • 53. S5: Tooling Support. Outcomes: needs and benefits document, complexity assessment report, automation architecture, tool requirements and scope, automation phasing, automation scripts.
  • 54. S6: Assess and Analyze. Identify the test cases/scripts to be executed; execute test cases and record outcomes; record defects; record learnings from the activity and the context; record the status of execution; analyze execution progress; quantify quality and identify risk to delivery; update the strategy, plan, scenarios, and cases/scripts.
  • 55. S6: Assess and Analyze (activities as on slide 54). STEM Core concepts applied: Contextual awareness, Defect rating principle, Gating principle.
  • 56. S6: Assess and Analyze. Outcomes: execution status report, defect report, progress report, cleanliness report, updated test strategy/plan and scenarios/cases, key observations/learnings.
  • 57. HBT case study: a web based product (version 2.x); one month pilot; 5-day orientation on HBT/STEM 2.0 for the team; two teams were involved, an HBT team and a non-HBT team.
  • 58. HBT case study: test case details.

    Module | STEM method | Normal method | Increase
    M1     | 100         | 28            | 257%
    M2     | 85          | 52            | 63%
    M3     | 95          | 66            | 44%
    M4     | 132         | 72            | 83%
    M5     | 127         | 28            | 354%
    M6     | 855         | 116           | 637%
    TOTAL  | 1394        | 362           | 285%

    Test case details (STEM method):

    Module | Total | Positive | Negative
    M1     | 100   | 59       | 41
    M2     | 85    | 68       | 17
    M3     | 95    | 67       | 28
    M4     | 132   | 112      | 20
    M5     | 127   | 85       | 42
    M6     | 855   | 749      | 106
    TOTAL  | 1394  | 1140     | 254

    Nearly 3x improvement in test cases, increasing the probability of a higher defect yield; 2x improvement in negative cases, increasing the probability of defect yield.
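The "Increase" column above follows from (STEM - Normal) / Normal, rounded to a whole percent; a quick sketch confirms the arithmetic for a few rows:

```python
# Selected (STEM, Normal) pairs from the case-study table.
rows = {"M1": (100, 28), "M5": (127, 28), "M6": (855, 116), "TOTAL": (1394, 362)}

def increase_pct(stem, normal):
    """Percentage increase of STEM over Normal, rounded to whole percent."""
    return round(100 * (stem - normal) / normal)

assert increase_pct(*rows["M1"]) == 257
assert increase_pct(*rows["M5"]) == 354
assert increase_pct(*rows["M6"]) == 637
assert increase_pct(*rows["TOTAL"]) == 285
```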
  • 59. HBT case study: defect and effort details. Defects found: STEM method 32, normal method 16, a 100% increase; the STEM method yielded 2x the defects (20 major, 12 minor). Effort for test analysis & design: STEM method 30 hours*, normal method 20 hours. Note that of the 32 defects found, a few were residual defects; one of them was critical, corrupting the entire data. *Observations: (1) the STEM team found key defects, lowering the cost of support; (2) with the normal method, the team would have spent higher effort post-release.
  • 60. Results: STEM found important residual defects; the STEM team was confident of its "guarantee"; there was a slight increase in initial effort, but it was well compensated by the quality of defects; the customer was happy with the results and the approach. Benefits: PDT-propelled test design; the quality of defects yielded during the pilot was high; robust test design, a good input for test automation.
  • 61. HBT results: 50% to 5x reduction in post-release defects; re-architecting test assets increases test coverage by 250%; 30% defect leakage reduction from the early stage; test assessment accelerates integration; terse requirements: holes found and fixed at Stage #1; smart automation: 3x reduction in time.
  • 62. STAG solutions & services based on HBT & STEM 2.0 (slide catalogue, partially recoverable): product optimization (design asset re-engineering, maintenance optimization); quality enhancement (LSPS solution, coverage enhancement); productivity enhancement (tool adoption, test acceleration); organization diagnostics & control (UT assessment, IT assessment, assessment services); skill enhancement (COMPASS(TM), finishing school, corporate training, HBT series); system enhancement (DevQ system, QA system); test services (outsourced testing, managed QA, quality injection, JumpStart QA); validation suites (E-Learning, mobile app, ERP, Bluetooth); other offerings (requirement validation, robust test design, purposeful strategy, successful automation, architecture validation, custom tooling, functional automation, LSPS validation).
  • 63. Thank you! STEM(TM) is a trademark of STAG Software Private Limited. Copyright STAG Software Private Limited, 2010-11.