Model Based Testing

1,427 views

Published on

Model Based Testing

Published in: Technology
0 Comments
0 Likes
Statistics
Notes
  • Be the first to comment

  • Be the first to like this

No Downloads
Views
Total views
1,427
On SlideShare
0
From Embeds
0
Number of Embeds
14
Actions
Shares
0
Downloads
45
Comments
0
Likes
0
Embeds 0
No embeds

No notes for slide

Model based testing

  1. © 2013 Maveric Systems Limited – Knowledge Encapsulation in Model Based Testing
  2. Coverage ▸ Background ▸ Automation Landscape ▸ Test Design – A Perspective ▸ Model Based Testing – A Positive Influence ▸ UAT Challenges ▸ Testac – Challenges Addressed ▸ Testac-Driven Design Process
  3. Background ▸ Software testing – relatively nascent as an independent industry ▸ Key drivers over the past 10 years: » Early years – need for objectivity/independence » Mid years – domain/vertical competencies, regression & repeatability » What now?
  4. What Now? ▸ Productivity improvement across the testing lifecycle, achieved through: » Process improvement initiatives (alignment to TMM, TPI, etc.) » Automation initiatives
  5. Automation Landscape – across the lifecycle phases: Test Planning, Test Design, Test Execution, Test Closure, Defect Management, Test Automation. Test design: ▸ Takes up to 50-65% of time in testing ▸ Most significantly impacts the quality of testing ▸ Currently has very limited automation. Tools across the other phases: ▸ Scoping & estimation tools ▸ Test automation tools ▸ Environment preparation tools ▸ Test construction – simulation tools, integration stubs ▸ Test management tools ▸ Defect management tools ▸ Data generation tools ▸ SOA testing tools ▸ Configuration & version control tools ▸ Build & release management tools. A model-based design approach & tool is the focus of this presentation.
  6. Test Design – A Perspective ▸ Good design takes up to 50-65% of time in the testing lifecycle ▸ Difficult to standardize a structured approach to design ▸ No scientific method to ensure comprehensiveness – no optimization of the number of test conditions ▸ Increased dependency on the individual’s expertise and experience ▸ Process is not documented and therefore not auditable ▸ Warrants a high level of domain understanding among all team members, or a high dependence on business users
  7. Model Based Testing – In the Right Direction. Maveric has been leading extensive research over the last two years into the adoption of Model Based Testing for UAT. Key benefits: ▸ An independent basis for verification ▸ Objective coverage assessment ▸ A scientific mechanism to achieve the desired focus on specific areas ▸ Ability for the business user to relate to the process and outputs. Model-based flow: model definition → application of design techniques → design pack generation. Design techniques applied: equivalence partitioning, pair-wise, rule of ‘n’, boundary value, transition, functional, data-flow, pre/post.
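To make the flow on this slide concrete, here is a minimal sketch in Python of model based testing: a model is defined as a state-transition table, a design technique (transition coverage) is applied to it, and a design pack of test sequences is generated. The loan-account states and events are illustrative assumptions, not Testac's actual model format.

```python
from collections import deque

# Model definition: a hypothetical state-transition model of a simple loan account.
TRANSITIONS = {
    ("Applied", "approve"): "Approved",
    ("Applied", "reject"): "Rejected",
    ("Approved", "disburse"): "Active",
    ("Active", "repay_in_full"): "Closed",
    ("Active", "miss_payment"): "Delinquent",
    ("Delinquent", "repay_arrears"): "Active",
}

def design_pack(start="Applied"):
    """Apply a transition-coverage technique: breadth-first walks from the
    start state until every transition appears in at least one test path."""
    uncovered = set(TRANSITIONS)
    pack, queue = [], deque([[start]])
    while uncovered and queue:
        path = queue.popleft()
        state = path[-1]
        for (src, event), dst in TRANSITIONS.items():
            if src == state:
                new_path = path + [event, dst]
                if (src, event) in uncovered:
                    uncovered.discard((src, event))
                    pack.append(new_path)       # this path covers a new transition
                if len(new_path) < 12:          # bound the walk length on cycles
                    queue.append(new_path)
    return pack

for case in design_pack():
    print(" -> ".join(case))
```

Each printed sequence is one generated test: a path through the model that a tester (or an execution tool) can replay against the application.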
  8. UAT Challenges ▸ Ensuring comprehensive coverage ▸ Duplication avoidance at various levels ▸ Ensuring data completion ▸ Determination of expected results ▸ Execution planning/sequencing ▸ Execution prioritization. Needs a solution that uses and extends Model Based Testing.
  9. UAT Challenges – Testac as a Solution ▸ A domain-centric test design tool that automates generation of test cases ▸ Embeds a proprietary functional framework and an algorithm for test case generation ▸ Contains generic definitions pertaining to the domain, allowing users to add, modify or delete these definitions ▸ Has a repertoire of testing principles for generating quality test cases ▸ Enables effective management of testing. Components – principal: Domain; ancillary: Testing Principles (boundary value, equivalence partitioning, etc.) and Test Management (run plan, functional decomposition, control reports, etc.). Focus is on “what to test” and “how much to test”.
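As an illustration of the "generic definitions pertaining to the domain" that users can add, modify or delete, the sketch below models a simple definition repository. The class, attribute names and values are hypothetical and are not Testac's actual data model.

```python
from copy import deepcopy

# Illustrative retail-lending definitions shipped with the tool (assumed names).
GENERIC_DEFINITIONS = {
    "Loan Product": {
        "Applicant Type": ["Individual", "Joint", "Corporate"],
        "Interest Type": ["Fixed", "Floating"],
        "Tenor (months)": ["12", "36", "60"],
    }
}

class DefinitionRepository:
    """Holds generic domain definitions; users may add, modify or delete them."""
    def __init__(self, seed):
        self._defs = deepcopy(seed)

    def add(self, entity, attribute, values):
        self._defs.setdefault(entity, {})[attribute] = list(values)

    def modify(self, entity, attribute, values):
        self._defs[entity][attribute] = list(values)   # raises KeyError if unknown

    def delete(self, entity, attribute):
        self._defs[entity].pop(attribute, None)

    def definitions(self, entity):
        return deepcopy(self._defs.get(entity, {}))

repo = DefinitionRepository(GENERIC_DEFINITIONS)
repo.add("Loan Product", "Repayment Frequency", ["Monthly", "Quarterly"])
repo.delete("Loan Product", "Interest Type")
print(repo.definitions("Loan Product"))
```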
  10. UAT Challenges – Comprehensive Coverage, through consistent application of testing principles. A. Equivalence partitioning – for example, the equivalence partitioning of the Tenor in loan setup, optimised to cover partitioning across other attributes as well. B. Boundary value analysis – for example, the Amount boundary for a cash deposit in a Savings Account is Rs. 10,000-15,000. C. Graphical analysis – to check dependencies involved in transactions; for example, Loan product and Applicant type have to be captured for every scenario.
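The sketch below applies two of the testing principles named on this slide, equivalence partitioning and boundary value analysis, to the cash-deposit amount whose valid band of Rs. 10,000-15,000 is quoted above. The partition labels and the step of 1 are illustrative assumptions.

```python
def equivalence_partitions(low, high):
    """One representative value per partition: below, inside and above the valid band."""
    return {
        "below_valid_band": low - 500,
        "inside_valid_band": (low + high) // 2,
        "above_valid_band": high + 500,
    }

def boundary_values(low, high, step=1):
    """Classic boundary value analysis: values on and adjacent to each edge of the band."""
    return sorted({low - step, low, low + step, high - step, high, high + step})

LOW, HIGH = 10_000, 15_000   # Rs. band for a cash deposit in a Savings Account (from the slide)
print("Equivalence partitions:", equivalence_partitions(LOW, HIGH))
print("Boundary values:       ", boundary_values(LOW, HIGH))
```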
  11. UAT Challenges – Comprehensive Coverage, by enabling tracking at various levels. Control reports enable users to review the outputs generated at various levels and help track changes in those outputs at various stages. Definitions at the Product level and Transaction level are used to form the scenarios in the Workflow matrix (Product Matrix, Transaction Matrix, Workflow Matrix, supported by Functional Decomposition and Control Reports).
  12. UAT Challenges – Comprehensive Coverage, through pre-defined universal definitions. The presence of pre-defined, domain-specific definitions in the tool enables comprehensive coverage for the given project; for example, definitions relating to retail banking are available out of the box.
  13. UAT Challenges – Duplication Avoidance, at a product level. Duplication occurs when product-level attribute values are considered independently for testing without an attempt to combine them. The product setup definitions and the resulting product matrix, built without duplicating the definitions, illustrate how this can be restricted.
  14. UAT Challenges – Duplication Avoidance, at a transaction level. Duplication occurs when transaction-level attribute values (here for the Drawdown transaction definitions) are considered independently for testing without an attempt to combine them. The resulting transaction matrix, built without duplicating the definitions, illustrates how this can be restricted.
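One way to combine attribute values at the product or transaction level instead of testing them independently is the pair-wise technique listed earlier among the design techniques. The sketch below greedily selects full combinations until every pair of attribute values is covered; the attribute names and values are hypothetical product-setup definitions, not Testac's.

```python
from itertools import combinations, product
import math

def pairwise_suite(attributes):
    """Greedy pairwise reduction: keep picking the full combination that covers
    the most not-yet-covered value pairs until every pair appears in some test."""
    names = list(attributes)
    all_tests = [dict(zip(names, combo)) for combo in product(*attributes.values())]
    def pairs(test):
        return {((a, test[a]), (b, test[b])) for a, b in combinations(names, 2)}
    uncovered = set().union(*(pairs(t) for t in all_tests))
    suite = []
    while uncovered:
        best = max(all_tests, key=lambda t: len(pairs(t) & uncovered))
        suite.append(best)
        uncovered -= pairs(best)
    return suite

product_attributes = {                       # illustrative definitions (assumed)
    "Applicant Type": ["Individual", "Joint", "Corporate"],
    "Interest Type": ["Fixed", "Floating"],
    "Repayment Frequency": ["Monthly", "Quarterly"],
}
suite = pairwise_suite(product_attributes)
full = math.prod(len(v) for v in product_attributes.values())
print(f"Pairwise suite: {len(suite)} combined tests (full combination: {full})")
for test in suite:
    print(test)
```

Every value still appears, and every pair of values is exercised, but the combined matrix is considerably smaller than testing each attribute in isolation plus ad hoc repeats.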
  15. UAT Challenges – Duplication Avoidance, at a module level. Duplication occurs in the case of Fees and Charges when a separate set of scenarios is developed without an attempt to integrate them with the existing scenarios. In the example (fee setup definitions and fee rule), the Fees and Charges are defined for each scenario as expected results in the Workflow matrix, optimizing coverage at this level.
  16. UAT Challenges – Duplication Avoidance, across modules. This happens in the case of GL validations and batch processing validations when they are not integrated with the actual scenarios. To restrict this duplication, the definitions relating to GL validations and batch processing validations are integrated with the scenarios, and the same are then defined in the run plan across modules.
  17. UAT Challenges – Ensuring Data Completion. Controls are built into the Workflow matrix to ensure that no critical data is missed for any given scenario. The matrix distinguishes data dependencies, system-generated fields, optional fields and mandatory fields; in the example, a dependency is not filled for scenario 3, so the control reflects red.
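A minimal sketch of the completeness control described on this slide: each scenario in the workflow matrix is checked against mandatory fields and data dependencies, and any gap is flagged, mirroring the red control shown for scenario 3. The field names and the single dependency rule are assumptions for illustration.

```python
MANDATORY_FIELDS = ["Product", "Applicant Type", "Amount"]
# Dependency rule (assumed): if the left-hand field is filled, the right-hand one must be too.
DEPENDENCIES = [("Holiday Treatment", "Holiday Calendar")]

scenarios = [
    {"id": 1, "Product": "Loan", "Applicant Type": "Individual", "Amount": 12000},
    {"id": 2, "Product": "Loan", "Applicant Type": "Joint", "Amount": 15000,
     "Holiday Treatment": "Next working day", "Holiday Calendar": "IN-2013"},
    {"id": 3, "Product": "Loan", "Applicant Type": "Corporate", "Amount": 14000,
     "Holiday Treatment": "Next working day"},          # dependency not filled
]

def control_report(rows):
    """Return a RED/GREEN control status per scenario, listing any gaps found."""
    report = {}
    for row in rows:
        gaps = [f for f in MANDATORY_FIELDS if not row.get(f)]
        gaps += [f"{dep} required by {src}"
                 for src, dep in DEPENDENCIES
                 if row.get(src) and not row.get(dep)]
        report[row["id"]] = ("RED: " + "; ".join(gaps)) if gaps else "GREEN"
    return report

for scenario_id, status in control_report(scenarios).items():
    print(f"Scenario {scenario_id}: {status}")
```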
  18. UAT Challenges – Expected Results. A. At a gross level – status-related (Pass/Fail) expected results are defined at the Product and Transaction level. B. At a granular level – screen-level expected results are captured in the Workflow matrix, with reference to the screen elements where they would get populated.
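A small sketch, with hypothetical field names rather than Testac's schema, of holding expected results at both levels described here: gross Pass/Fail statuses at product and transaction level, and granular screen-element values recorded per scenario in the workflow matrix.

```python
expected_results = {
    "product_level": {"Personal Loan setup": "Pass"},
    "transaction_level": {"Drawdown": "Pass", "Backdated drawdown": "Fail"},
    "workflow_matrix": {
        "scenario_3": {                      # granular, screen-element level
            "Loan Details screen / Outstanding Principal": "14,000.00",
            "Loan Details screen / Next Repayment Date": "01-Jul-2013",
        }
    },
}

def gross_status(results):
    """Roll product- and transaction-level statuses up to one gross verdict."""
    statuses = (list(results["product_level"].values())
                + list(results["transaction_level"].values()))
    return "Fail" if "Fail" in statuses else "Pass"

print(gross_status(expected_results))
```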
  19. UAT Challenges – Execution Planning. The sequence of execution (negative conditions, plain conditions, etc.) is planned with emphasis on front-loading defects as early as possible; the run plan sequences negative scenarios first. Prioritization of test conditions: 1. Negative conditions relating to modules, holiday treatment, value dated transactions & frequencies. 2. Positive conditions relating to modules, value dated transactions, batch processing & frequencies. For example, the following definitions are used to arrive at the number of logical days for execution: Creating transactions – 3 days; Value dated – 2 days; Batch processing – 5 days; Frequencies – 4 days; Holiday – 1 day; Checking the Holiday transaction – 1 day; Total logical days – 16 days.
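The sketch below illustrates the sequencing idea above: negative conditions are scheduled ahead of positive ones to front-load defects, and the logical-day figures from the table are summed for the run plan. The individual condition names are invented for illustration; the 16-day total comes from the slide's definitions.

```python
# Logical-day definitions taken from the slide.
execution_blocks = [
    ("Creating transactions", 3),
    ("Value dated", 2),
    ("Batch processing", 5),
    ("Frequencies", 4),
    ("Holiday", 1),
    ("Checking the Holiday transaction", 1),
]

# Hypothetical test conditions tagged as negative or positive.
conditions = [
    {"name": "Drawdown beyond sanctioned limit", "type": "negative", "area": "Modules"},
    {"name": "Value dated transaction on a holiday", "type": "negative", "area": "Holiday treatment"},
    {"name": "Standard drawdown", "type": "positive", "area": "Modules"},
    {"name": "EOD batch interest accrual", "type": "positive", "area": "Batch processing"},
]

# Negative conditions first; the stable sort keeps the original order within each group.
run_plan = sorted(conditions, key=lambda c: 0 if c["type"] == "negative" else 1)

print("Total logical days:", sum(days for _, days in execution_blocks))   # 16
for step, condition in enumerate(run_plan, start=1):
    print(step, condition["type"], "-", condition["name"])
```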
  20. UAT Challenges – Execution Prioritization. Categorizing the test conditions (base functionality, interfaces – inter-module and intra-module – and module-level enhancements) provides clarity for test execution; the run plan covers all scenarios across base functionality, enhancements & interfaces. During execution, defects can also be categorized against base functionality, interfaces or enhancements, aiding prioritization during regression testing.
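As a sketch of the categorization described here, the snippet below tags defects as base functionality, interfaces or enhancements and orders regression by category. The priority order (base functionality first) and the defect records are assumptions for illustration.

```python
from collections import defaultdict

# Assumed regression priority per category.
CATEGORY_PRIORITY = {"Base functionality": 0, "Interfaces": 1, "Enhancements": 2}

defects = [
    {"id": "D-101", "category": "Enhancements", "summary": "New fee code not applied"},
    {"id": "D-102", "category": "Base functionality", "summary": "Drawdown posts to wrong GL"},
    {"id": "D-103", "category": "Interfaces", "summary": "Inter-module balance mismatch"},
]

def regression_order(items):
    """Group defects by category and return them in regression-priority order."""
    buckets = defaultdict(list)
    for defect in items:
        buckets[defect["category"]].append(defect)
    ordered = []
    for category in sorted(buckets, key=CATEGORY_PRIORITY.get):
        ordered.extend(buckets[category])
    return ordered

for defect in regression_order(defects):
    print(defect["category"], "-", defect["id"], "-", defect["summary"])
```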
  21. Model Based Testing Concepts in Tool
  22. Testac Design Process Overview. Process components: questionnaire administration, review of baseline documents, interaction with business users, product rules definition, transaction rules definition, maintenance rules definition, process flow definition, fees & charges definition, link rules definition, calendar definition, batch processing rules definition, reporting and analytics. Phases: Boundary Definition, Matrix Definition, Scenario Generation, Run Plan Generation, Analytics. Outputs: product matrix, product maintenance matrix, transaction matrix, workflow matrix, scenarios, run plan with dates & values, control reports & metrics, coverage analytics.
  23. 23. © 2013 Maveric Systems Limited Thank You
