
Matthias Ratert - Incremental Scenario Testing

EuroSTAR Software Testing Conference 2009 presentation on Incremental Scenario Testing by Mattias Ratert. See more at conferences.eurostarsoftwaretesting.com/past-presentations/



  1. Incremental Scenario Testing: Beyond Exploratory Testing
     www.teleca.com
     Matthias Ratert
  2. Background
     • Teleca is a world-leading supplier of software services and solutions to the mobile industry
       – Global capability to deliver packaged software solutions, systems design, integration and testing
     • Teleca was developing a complex internet service for a top 5 mobile OEM
       – 500 kLOC, large number of handsets, operators and target markets
       – Complex multi-backend with e-commerce and operator integration
     • High functional compliance was reached
       – ... but due to the complexity, the quality was poor
  3. Software complexity
     (Diagram: dimensions of the SW functionality to be tested — SW function range, cross-functionality, HW/SW configuration, memory situation, data input & output, environment variations, network, user interaction, HW variations, SW customization, SW re-usage, distributed development, distributed systems.)
  4. Regression tests need complementation
     (Diagram: regression test cases cover only part of the SW function range; error fixes are verified, but existing errors outside that range remain to be exposed.)
  5. Exploratory Testing
     • Be part of the overall test strategy
     • Find as many errors as possible
     • Generate test cases within the test session
     • Challenge the tester
       – Work without detailed test case descriptions
       – Allow creativity, force own ideas and motivate pro-active behavior
       – Own judgment of the test result (no expected behavior)
     • Utilize the feedback from previous test sessions
     Exploratory Testing was not successful for us:
     • Difficult to come up with new and creative test ideas
     • Too many areas remained untested
  6. Incremental Scenario Testing (IST)
     • IST controls the complexity
     • IST guides & inspires the tester with Scenarios
       – Encourages the tester to explore new functionality
     • IST automates the test planning
       – Optimizes the utilization of the available testers and test time
       – Identifies SW areas never tested before
     • IST focuses the testing on new, modified & weak SW areas
       – Provides an interface for collaboration between the test and SW teams
     • IST increases the visibility of all activities
     • IST provides a web-based tool that offers the features above
       – The Incremental Scenario Testing Tool (ISTT)
  7. Test Scenario
     • High-level test case description built out of three test items:
       1. Precondition
       2. State
       3. Event
     • How to reach each test item is up to the tester
       – This assures additional variation in usage, input data, timing etc.
     • To be understood as a guideline
       – The tester is free to experiment and to do additional things
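The three-part Scenario structure above can be sketched as a small data model. This is a hypothetical Python sketch, not the actual ISTT schema (which the talk does not show); the example instance is taken from the voice-call Scenario slide later in the deck.

```python
# Hypothetical sketch of a Test Scenario: up to five preconditions,
# one state, one event. Names are illustrative, not the ISTT data model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Scenario:
    state: str
    event: str
    preconditions: List[str] = field(default_factory=list)  # 0 to 5 entries

    def __post_init__(self):
        if not 0 <= len(self.preconditions) <= 5:
            raise ValueError("a Scenario carries 0 to 5 preconditions")

# Example drawn from the voice-call Scenario examples later in the deck:
ringing = Scenario(state="MT call ringing",
                   event="Lose GSM network",
                   preconditions=["Headset connected"])
```

How the tester reaches each item is deliberately left out of the structure: the Scenario only names the precondition, state and event, which is what keeps the variation in usage, data and timing with the tester.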
  8. Test items forming a Scenario
     • Precondition: everything influencing the SW to be tested
       – Dependent on or independent of the functionality to be tested
       – The number of preconditions is flexible, from 0 to 5
     • State: all states and functionalities to be tested
       – May be linked to requirements or specifications
     • Event: an action with internal or external influence
       – Dependent on or independent of the functionality to be tested
     • Test items are grouped by test categories
     NOTE: Testing can only be as complete as its test data
  9. Test Scope
     • Test environment and setup for one tester
       – Hardware, operating system, configuration, language, ...
     • The Test Scope won't be changed within the test session
     • All testers might get the same Test Scope, or each tester might get a different one
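Since every tester keeps one fixed Test Scope for the whole session, scope assignment reduces to a simple mapping. A minimal sketch under assumed names (the function and the scope keys are illustrative, not from the tool); the two scopes mirror the examples on the voice-call Scenario slide:

```python
# Assign a fixed Test Scope to each tester for the whole session.
# If there are fewer scopes than testers, scopes are reused round-robin;
# a single scope therefore gives every tester the same setup.
from itertools import cycle

def assign_scopes(testers, scopes):
    return dict(zip(testers, cycle(scopes)))

n96 = {"device": "Nokia N96", "os": "OS 3.1",
       "language": "English", "operator": "T-Mobile"}
iphone = {"device": "iPhone", "os": "OS 3.0",
          "language": "German", "operator": "Vodafone"}

same = assign_scopes(["Anna", "Ben", "Chris"], [n96])           # all identical
mixed = assign_scopes(["Anna", "Ben", "Chris"], [n96, iphone])  # alternating
```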
  10. SW complexity example: voice calls
      • States: outgoing calls (MO), incoming calls (MT)
      • Scopes, preconditions & events: user interaction, new MO/MT calls, OS / phone / language, battery / headset, call settings / contacts / call lists, RAM / C: status, network coverage, operator, 2G / 3G
  11. Voice call complexity in numbers
      • Preconditions: >50
      • States: >40
      • Events: >50
      • Scopes cannot be counted easily, e.g. the number of devices supported
      How to ensure that the important Scenarios are selected?
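Even using only the lower bounds above, the combinatorial size is easy to check. This is illustrative arithmetic with those lower bounds; the case-study slide later in the deck reports larger possible-scenario counts because the real project's item counts exceeded these minima.

```python
# Lower-bound scenario counts for the voice-call example.
preconditions, states, events = 50, 40, 50

# One precondition per scenario:
one_pre = preconditions * states * events            # 50 * 40 * 50

# Two distinct preconditions per scenario (order ignored):
two_pre = (preconditions * (preconditions - 1) // 2) * states * events

print(one_pre)   # 100000
print(two_pre)   # 2450000
```

At this scale exhaustive execution is clearly out of reach, which is why the selection question at the end of the slide matters.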
  12. Test Scenario generation
      • Scenario calculation is based on the Test Session Configuration
      • Test item history: Failed Showstopper, Failed Critical, Failed Major, Failed Minor, Not tested, (Passed)
      • Test item weight: Often, Regularly, Sporadic, Rarely, (Unusual)
      • Selection preferences:
        – Prefer the new or changed functionality
        – Prefer the common functionality (items)
        – Prefer the risky and not-tested functionality (items)
        – Prefer the risky and not-tested Scenarios
      • Scenario history and test item focus determine whether an item is selected to be tested
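The selection preferences above can be pictured as a weighted draw over test items, where a riskier history and more common usage raise an item's chance of being picked. The weight values below are invented for illustration; the slide names the history and usage categories but does not publish the actual ISTT weighting.

```python
import random

# Invented example weights: riskier history and more common usage
# make a test item more likely to be selected.
HISTORY_WEIGHT = {"failed_showstopper": 16, "failed_critical": 8,
                  "failed_major": 4, "failed_minor": 2,
                  "not_tested": 4, "passed": 1}
USAGE_WEIGHT = {"often": 8, "regularly": 4, "sporadic": 2, "rarely": 1}

def item_weight(usage, history):
    return USAGE_WEIGHT[usage] * HISTORY_WEIGHT[history]

def pick_item(items, rng=random):
    """items: list of (name, usage, history) tuples; weighted random draw."""
    weights = [item_weight(u, h) for _, u, h in items]
    return rng.choices([name for name, _, _ in items], weights=weights)[0]
```

Because the draw stays random rather than deterministic, common items still dominate, but rarely used or never-tested items keep a non-zero chance of appearing, which matches the "identify SW areas never tested before" goal from slide 6.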
  13. Voice call Scenario examples
      Scope: Nokia N96, OS 3.1, English, T-Mobile
      Scope: iPhone, OS 3.0, German, Vodafone
      • P1: Headset connected; P2: Screensaver = ON; P3: Profile = Silent; S: MT call ringing; E: Lose GSM network
      • P: Battery almost empty; S: MO call via address book; E: Plug in charger
      • P1: RAM almost full; P2: UMTS active; S: MO call via speed dial; E: MT call ringing
      • P1: Sending of caller ID = ON; P2: Summary after call = ON; S: MO call via SMS is ringing; E: Calendar alarm expires
  14. Test sessions
      • Test sessions are individual
        – Every tester can start at any time, independently of the others
        – Every tester can pause or stop at any time
        – Everybody is able to see the progress and results at any time
      • Test sessions are flexible
        – Test as much as possible within a given time box
        – Test as long as testers are available (without a deadline)
      • Test sessions can be distributed globally
      • Test sessions are stopped by the Test Manager
        – Updating the SW will stop the test session and start a new one
        – The testers are notified after finishing the current Scenario
  15. Further add-ons
      • Ranking system (Junior Tester, Tester, Senior Tester)
        – The rank depends on the experience within the IST project
        – Shall motivate the tester and give the Test Manager confidence when analyzing the test results
      • Re-testing: momentous test results, especially "Failed" & "Impossible", need evidence in the same session
        – Preferably done by a tester with a higher rank
        – Supports the Test Manager in the test result analysis
      • Developer role: everyone can influence the testing
        – Adding of new test items
        – Marking test items as "To be tested"
  16. Test Scenario spectrum: test session kick-off
      (Diagram: a spectrum of possible scenarios, here 40.)
      • At the test session kick-off, the first scenarios to be tested are selected by test focus ("To be tested") and weights
  17. Test Scenario results
      (Diagram: the spectrum of 40 scenarios during a test session.)
      • Each executed scenario gets a verdict: Pass, Failed, Impossible, Skipped or Not clear
      • The spectrum distinguishes possible scenarios, scenarios (to be) tested, scenarios (to be) re-tested and impossible scenarios
  18. Test session evolution
      (Diagram: the spectrum grows from 40 to 50 scenarios while test sessions 1-6, i.e. increments 1-6, work through it.)
      • Each executed scenario keeps its verdict: Pass, Failed, Impossible, Skipped or Not clear
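The session-by-session progression can be sketched as a draw-down of the spectrum: scenarios never tested before come first, and failed or unclear ones return in a later increment. Function name and verdict strings below are assumptions for illustration, not the tool's API.

```python
def next_increment(spectrum, results, batch_size):
    """spectrum: all scenario ids; results: {id: verdict} from past sessions.
    Prefers scenarios never tested before, then re-queues those that failed
    or were not clear; passed and impossible ones are not re-selected."""
    untested = [s for s in spectrum if s not in results]
    retest = [s for s, v in results.items() if v in ("failed", "not clear")]
    return (untested + retest)[:batch_size]

# Session 2 after a first increment over a 10-scenario spectrum:
history = {1: "pass", 2: "failed", 3: "impossible", 4: "not clear"}
batch = next_increment(range(1, 11), history, batch_size=8)
```

Here the untested scenarios 5-10 are scheduled first, followed by the failed scenario 2 and the unclear scenario 4; the passed scenario 1 and the impossible scenario 3 are left out, matching the spectrum diagrams above.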
  19. Case study
      Project period: Mar/06 - Mar/09
      IST usage period: Nov/07 - Mar/09 (17 months)
      Performed test sessions: 27
      Duration of test sessions in days: 1 - 38 (average: 4.8 days)
      Possible Test Scenarios (1 precondition): 1,274,400
      Possible Test Scenarios (2 preconditions): 50,976,000
      Executed Test Scenarios: 10,219
      New severe error reports: 476
  20. Our main achievement
      (Diagram: cumulative errors found over time. Without IST, a large share of the errors surfaced as customer reports after the release to the customer; with IST, more errors were found internally before the release.)
  21. The football trainer: IST is suitable for any complex system
      • Scopes: team, league / tournament, rules
      • Preconditions: position/rank, team condition, opponent, venue, weather
      • States: score, time left
      • Events: goal, injury, yellow card, red card
  22. Thank you for your attention!
      Matthias Ratert
      Teleca Germany, Rensingstr. 15, 44807 Bochum, Germany
      Matthias.Ratert@teleca.com
      Please visit us at our stand no. 52 for a demo session & further discussions
