IML v1.5 (open-source version)


Integration Maturity Levels
We propose the Integration Maturity Levels as a tool for describing test levels and complexity.
Testing teams can use this tool for:
• defining what "Done" means for a story at the end of an iteration, i.e. the level of quality a product feature is expected to have reached (meaning that tests for that level shall pass and critical/major issues shall be resolved);
• having a common definition, shared across functional teams (management, architects, developers, and testers), of the levels of integration to go through to achieve good feature quality;
• elaborating and describing the test strategy along iterations;
• giving testers a checklist for reviewing their test plan and challenging its diversity and coverage;
• letting testers report integration progress not only with figures and metrics but also with a qualitative assessment of "feature quality", the testing effort spent, and the integration effort remaining.

Published in: Technology
Slide 1: INTEGRATION MATURITY LEVEL (Levels of Black Box Testing)
Cyril Boucher, Christophe Rault
Slide 2: Introduction
Objective: define a tool (called the Integration Maturity Level) that allows:
• Having a common definition of the level of integration (i.e. depth of testing) required to achieve good feature quality.
• Elaborating and describing the test strategy along iterations, which enforces testing differently as the project progresses: testing sympathetically early on, then testing aggressively and diversely.
• Reporting the quality/maturity per feature.
Slide 3: Introduction
The Integration Maturity Level is a testing-level classification defined by reusing original ideas on test classification, depth of test, and type of test as presented in:
• Lessons Learned in Software Testing: A Context-Driven Approach, by Cem Kaner, James Bach, Bret Pettichord
• Heuristic Test Strategy Model: http://www.testingeducation.org/BBST/foundations/Bach_satisfice-tsm-4p-1.pdf
The list of test techniques per level is not exhaustive. They are given to ease understanding of the expected level of test complexity, and they can be used in a variety of combinations.
Slide 4: Integration Maturity Level Classification
Slide 5:
• Discover the product; gather information about explicit and implicit specifications.
• Analyze individual claims, and clarify vague claims.
• Expect the product to be brought into alignment with the requirements.
• Mature the feature before feature validation testing.
Slide 6:
• Test the completeness of a user story; involves a suite of tests on the completed system.
• Identify things the product can do (functions and sub-functions).
• Determine how you would know if a function was capable of working.
• Test each function/feature, one at a time.
• See that each function does what it is supposed to do, and not what it is not supposed to do.
• Convoluted scenarios, challenging data, and function interaction are avoided at this level of testing.
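The function-level testing described on this slide (one function at a time, simple data, checking both what the function should and should not do) can be sketched as a minimal Python test. The `apply_discount` function and its contract are hypothetical illustrations, not taken from the deck.

```python
# A minimal sketch of per-function testing: each function is exercised
# one at a time, with simple data, and no cross-feature scenarios.

def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent; reject out-of-range percentages."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount_nominal():
    # Does what it is supposed to do.
    assert apply_discount(100.0, 10) == 90.0

def test_apply_discount_rejects_invalid():
    # Does NOT do what it is not supposed to do.
    try:
        apply_discount(100.0, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass

test_apply_discount_nominal()
test_apply_discount_rejects_invalid()
print("function-level tests passed")
```

In practice these checks would live in a test runner such as pytest; the point here is only the scope: one function, one behavior per test.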
Slide 7:
• Begin by thinking about everything going on around the product.
• Design tests that involve meaningful and complex interactions with the product.
• Test use cases that cover several features and the interaction between features.
• Test according to various possible user profiles.
Slide 8:
• Look for any data processed by the product; look at outputs as well as inputs.
• Decide which particular data to test with. Consider boundary values, typical values, convenient values, invalid values, or best representatives.
• Consider combinations of data worth testing together. Data coverage and complex test-result evaluation methods are of interest.
• Test and challenge the flow of data through the system and sub-systems.
• Test system reliability against malicious, missing, or malformed data.
• Test against variability in features.
• Error-handling tests; state and transition testing with aggressive test attacks during state transitions.
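The boundary-value idea on this slide can be sketched in a few lines of Python. The valid range [1, 100] and the `in_range` validator are hypothetical examples, chosen only to show how boundary candidates are derived from a data domain.

```python
# A sketch of boundary-value selection for a numeric input domain:
# just outside, on, and just inside each edge, plus a typical value.

def boundary_values(lo: int, hi: int):
    """Classic boundary candidates for the inclusive range [lo, hi]."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

def in_range(value: int, lo: int = 1, hi: int = 100) -> bool:
    """Hypothetical validator under test."""
    return lo <= value <= hi

candidates = boundary_values(1, 100)
results = {v: in_range(v) for v in candidates}
print(results)  # 0 and 101 rejected; the others accepted
```

The same pattern extends to invalid values and combinations of data, as the slide suggests.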
Slide 9:
• Look for sub-systems and functions that are vulnerable to being overloaded or "broken" in the presence of challenging data or constrained resources.
• Identify data and resources related to those sub-systems and functions.
• Select or generate challenging data, or resource-constraint conditions to test with: e.g., large or complex data structures, high loads, long test runs, many test cases, low-memory conditions.
• Test resource usage.
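One way to exercise the "challenging data" idea on this slide is to generate a deliberately pathological structure and check that the function under test survives it. The `flatten` function, and the nesting depth chosen, are hypothetical illustrations.

```python
# A sketch of stress testing with challenging data: a deeply nested
# list that would overflow a naive recursive implementation.

def flatten(nested):
    """Iteratively flatten a nested list (recursion would overflow here)."""
    out, stack = [], [nested]
    while stack:
        item = stack.pop()
        if isinstance(item, list):
            stack.extend(reversed(item))
        else:
            out.append(item)
    return out

# Build a pathologically deep structure: [1, [2, [3, ... [50000, []] ...]]]
deep = []
for i in range(50_000, 0, -1):
    deep = [i, deep]

result = flatten(deep)
assert result == list(range(1, 50_001)), "flatten broke under deep nesting"
print("survived deep nesting:", len(result), "items")
```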
Slide 10:
• Compare two systems to find which performs better.
• Measure which parts of the system or workload cause the system to perform badly.
• Demonstrate that the system meets performance criteria.
• Test concurrency between features.
• Test product responsiveness.
• Test system scalability.
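The measurement idea on this slide (find which part of the workload is slow, then check a performance criterion) can be sketched as follows. The two workload steps and the time budget are hypothetical stand-ins.

```python
# A sketch of per-step performance measurement: time each part of a
# workload to find which one causes the system to perform badly.
import time

def load_data():
    time.sleep(0.01)  # simulated I/O-bound step

def transform_data():
    sum(i * i for i in range(100_000))  # simulated CPU-bound step

def timed(step):
    start = time.perf_counter()
    step()
    return time.perf_counter() - start

timings = {step.__name__: timed(step) for step in (load_data, transform_data)}
slowest = max(timings, key=timings.get)
print("slowest step:", slowest)

# Demonstrate that the workload meets a (hypothetical) criterion.
assert sum(timings.values()) < 1.0, "workload exceeded its time budget"
```

Real performance testing would repeat runs and report distributions, but the structure (instrument, rank, compare against a criterion) is the same.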
Slide 11:
• Find out how the software actually works, and ask questions about how it will handle difficult cases.
• Expectations are open: some results may be predicted and expected, others may not.
• Imagine how the system could fail, and design tests to demonstrate whether it actually fails that way.
• Look at how the system recovers from failures or issues.
• Test against various configurations (explicit or implicit).
• Test interoperability.
• Test compatibility (forward or backward) and migration/upgrade.
• Test security.
• Test against time: "fast" or "slow" input; fastest and slowest; combinations of fast and slow. Changing rates: speeding up and slowing down (spikes, bursts, hangs, bottlenecks, interruptions). What happens with unexpected events?
• Test the non-functional quality criteria that matter for the product and its customers, and that become testable only once the product is mature enough.
Slide 12: How can the IML tool be used?
Slide 13: Test Analysis and Design
Use as a checklist for:
• Increasing test plan coverage: going beyond functional testing.
• Serving as a heuristic for generating new test ideas.
• Diversifying test techniques: different defects require different kinds of test techniques.
• Assessing product risk or value along targeted quality characteristics or defect classes: apply the test ideas and test bases of the various IMLs against the tested domain area, product, or feature.
Slide 14: Test Analysis and Design
For test plan coverage:
[Bar chart: number of test cases in the test plan per IML. IML1: 30, IML2: 20, IML3: 17, IML4: 15, IML5: 9, IML6: 5, IML7: 3]
Do we have good test plan coverage? => Count the test cases in the test plan per their IML.
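The coverage check on this slide boils down to tagging each planned test case with its IML and counting cases per level. The tiny test plan below is illustrative, not the one charted on the slide.

```python
# A sketch of per-IML coverage counting: tag planned test cases with a
# level, count them, and flag levels with no planned tests at all.
from collections import Counter

test_plan = [
    ("verify claim in datasheet",       "IML1"),
    ("login succeeds with valid user",  "IML2"),
    ("login then checkout scenario",    "IML3"),
    ("reject malformed date input",     "IML4"),
    ("checkout under low memory",       "IML5"),
]

coverage = Counter(level for _, level in test_plan)
missing = [f"IML{i}" for i in range(1, 8) if coverage[f"IML{i}"] == 0]
print(dict(coverage))
print("levels with no planned tests:", missing)
```

A skewed distribution (everything at IML1–2, nothing above) is exactly the anti-pattern the checklist is meant to surface.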
Slide 16: Test Review
For test idea and test strategy review:
• Reviewer checklist.
• Review test coverage.
• Review test techniques against product risks and values.
Slide 17: Testing Root Cause Analysis
Use as a classification scheme for identifying anti-patterns in the test strategy and test plan:
• For defects leaked to the customer, or internal downstream showstoppers.
• If a test idea was missing from our test plan: what level of testing, and which techniques, would have let us find the issue?
Slide 18: Test Reporting
• Coverage along functional requirements, but also along non-functional tests and test techniques:
  - A common definition to explain test coverage.
  - An understandable description of test purposes.
  - Useful during testing-session debriefing.
• Qualitative reporting along three directions:
  - Test effort.
  - Testing coverage.
  - Testing result / quality.
