Risk-based Testing



  1. Risk-based Testing – The Devil is in the Details (2013-09-27, PA1 Confidential)
  2. Introduction
     • Johan Hoberg
     • Sony Mobile
       ▫ ~10 years
       ▫ Tester, Test Team Leader, Test Architect
     • This is a presentation of my experiences with risk-based testing
  3. Definition
     • Risk-based testing (RBT) is a type of software testing that prioritizes the tests of features and functions based on the risk of their failure – a function of their importance and likelihood or impact of failure. (http://en.wikipedia.org/wiki/Risk-based_testing)
  4. The Devil is in the Details
     • Refers to a catch or mysterious element hidden in the details
     • Derives from “God is in the Details”
       ▫ Generally attributed to Gustave Flaubert (1821–1880)
     • It is easy to ask for efficient, risk-based testing
     • It is much more difficult to actually do it well
  5. System Under Test (configuration matrix)
     • Products: Product X, Product Y
     • Configurations per product:
       ▫ Android Gingerbread – Generic, Customer A, Customer B
       ▫ Android Jellybean – Generic, Customer A, Customer B
  6. System Under Test (diagram)
  7. Changes in the System
     • There is a change introduced somewhere in the system
     • It could be a bug fix, a change request, a new feature, refactoring, etc.
  8. Scope Selection
     • Do you run every available test on every possible configuration of the system and verify all interoperability?
     • A smart scope is necessary to keep the effort containable
     • We must be able to reuse test results between products, configurations, and branches
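One way to make results reusable across products and configurations, sketched below under assumptions not in the slides: key each verdict on the test plus the versions of only those components the test actually depends on. The test names, components, and versions are hypothetical.

```python
# Sketch: reusing test results across configurations. A verdict is valid
# wherever the component versions the test depends on are identical,
# regardless of the configuration name. All names here are invented.

def result_key(test, config, dependencies):
    """Build a reuse key from the test and its relevant component versions."""
    relevant = tuple(sorted((c, config[c]) for c in dependencies[test]))
    return (test, relevant)

results = {}  # result_key -> "pass" / "fail"

dependencies = {"camera_smoke": ["camera_hal", "display_driver"]}

config_a = {"camera_hal": "1.4", "display_driver": "2.0", "ui_skin": "customer_a"}
config_b = {"camera_hal": "1.4", "display_driver": "2.0", "ui_skin": "customer_b"}

results[result_key("camera_smoke", config_a, dependencies)] = "pass"

# Configuration B differs only in a component the test does not touch,
# so the cached verdict can be reused instead of re-running the test.
reused = results.get(result_key("camera_smoke", config_b, dependencies))
print(reused)  # -> pass
```

The design choice is that reuse is decided per test, not per configuration: two configurations that differ only in components a test never touches are equivalent for that test.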
  9. Scope Selection based on Risk (diagram): Change Introduced → System Impact → Risk-based Scope
  10. Complex vs. Complicated
     • Complicated
       ▫ Opposite of simple
       ▫ Containing intricately combined or involved parts
     • Complex
       ▫ Opposite of predictable?
       ▫ Having parts so interconnected as to make the whole perplexing
  11. Complex System
     • A complex system is difficult to predict
     • It is therefore difficult to plan which actions to take to mitigate different outcomes
     • It is difficult to plan which tests to run to cover all potentially relevant defects
  12. Reduce Complexity (diagram): Change → Complex System → Scope Selection; reducing the system’s complexity is very difficult!
  13. Example
     • A change is introduced into the Display Driver component – what do I need to test?
       ▫ Camera?
       ▫ Interoperability with TV and similar devices?
       ▫ Multimedia – view pictures, video, etc.?
       ▫ All UI testing?
       ▫ Other parts of the system?
  14. Reduce Complexity
     • Risk Analysis
     • Automated Test Framework
     • Early Exploratory Testing
  15. Risk Analysis – Input Data
     • Feature & Hardware Delta
     • Historical Data
     • Code Changes
     • System Dependencies & Architecture
     • Slide annotations on the inputs:
       ▫ Easy to obtain
       ▫ Might give us an indication of where to start looking, but can often give false confidence in scope selection, especially when it comes to E2E system test
       ▫ Hard to obtain
  16. Risk Analysis – Risk Model (diagram)
     • Risk Identification → Probability & Consequence → Mitigation Plan
     • Severity (S), Occurrence (O), Detection (D)
     • Risk Priority Number (RPN) = S * O * D
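The RPN formula on the slide can be sketched as a small prioritization step. The risks and their scores below are invented for illustration; 1–10 scales are a common FMEA convention, not something the slides prescribe.

```python
# Sketch of the S*O*D risk model: compute an RPN per identified risk and
# order the test scope by it. Risks and scale values are hypothetical.

def rpn(severity, occurrence, detection):
    """Risk Priority Number: higher means address (and test) first."""
    return severity * occurrence * detection

risks = [
    # (risk, S, O, D) -- S: how bad a failure is, O: how likely it is,
    # D: how unlikely current tests are to detect it before release
    ("display corruption after driver change", 8, 6, 7),
    ("camera preview latency regression",      5, 4, 3),
    ("settings menu typo",                     2, 5, 2),
]

prioritized = sorted(risks, key=lambda r: rpn(*r[1:]), reverse=True)
for name, s, o, d in prioritized:
    print(f"RPN={rpn(s, o, d):3d}  {name}")
# The driver-related risk (8*6*7 = 336) heads the list.
```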
  17. Risk Analysis – Complex vs. Complicated
     • Risk analysis of the whole system is complex
     • Risk analysis of a sufficiently small part is more complicated than complex
  18. Risk Analysis Database
     • If each function performs risk analysis on its components and enters the risks in a database, this information could be used at system level to better understand the impact of specific changes
     • Requires a uniform way of handling risk information
  19. Problems?
     • We can get a lot of data that will help us select a scope – but ultimately we will introduce a lot of risk if we do not know how the system is designed, and how different components depend on each other
     • Getting a complete dependency map of the whole system requires not only a good risk-handling infrastructure, but a common effort from a lot of people
  20. Dependency Map Example (diagram): several Applications sit on an Application Framework, which sits on a Library, which sits on a Driver
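A dependency map like the one on the slide can drive scope selection mechanically: invert the "depends on" edges and walk upward from the changed component. This is a minimal sketch; the component names are invented to mirror the apps → framework → library → driver layering.

```python
# Sketch: computing the impact set of a change from a dependency map.
# Component names are illustrative, not from the slides.
from collections import deque

# "X depends on Y" edges, stored as depends_on[X] = [Y, ...]
depends_on = {
    "gallery_app":      ["app_framework"],
    "camera_app":       ["app_framework"],
    "tv_out_app":       ["app_framework"],
    "app_framework":    ["graphics_library"],
    "graphics_library": ["display_driver"],
}

# Invert the edges: who directly depends on each component?
dependents = {}
for comp, deps in depends_on.items():
    for dep in deps:
        dependents.setdefault(dep, []).append(comp)

def impacted_by(changed):
    """All components that transitively depend on `changed` (BFS upward)."""
    seen, queue = set(), deque([changed])
    while queue:
        for upstream in dependents.get(queue.popleft(), []):
            if upstream not in seen:
                seen.add(upstream)
                queue.append(upstream)
    return seen

print(sorted(impacted_by("display_driver")))
# Every layer above the driver is a candidate for the test scope --
# which matches the Display Driver example on slide 13.
```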
  21. Automated Test Framework (flow): Change + Basic Risk Analysis → Automated Test Execution → Test Results Analysis → Scope Setting → Test Execution
  22. Automated Test Framework (flow): Change → Automated Tests Executed → Failed Tests Investigated → Change in component mapped towards failed tests in other components → Better understanding of System Dependencies
  23. Self-learning System
     • Over time you will have more and more information about how different changes impact the system
     • Map code changes to failed tests
     • This impact map can further help you set an efficient scope
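The change-to-failed-tests mapping described above can be sketched as a simple accumulating counter: each cycle records which component changed and which tests failed, and the history then suggests a scope for the next change to that component. The components, tests, and cycles below are hypothetical.

```python
# Sketch: a minimal self-learning impact map. Each integration cycle we
# record (changed component -> failed tests); the accumulated history
# later suggests a test scope. All names are invented.
from collections import Counter, defaultdict

impact = defaultdict(Counter)  # changed component -> failed test -> count

def record_cycle(changed_component, failed_tests):
    for test in failed_tests:
        impact[changed_component][test] += 1

def suggested_scope(changed_component, top_n=3):
    """Tests that historically failed most often after this component changed."""
    return [t for t, _ in impact[changed_component].most_common(top_n)]

# Three isolated integration cycles touching the display driver:
record_cycle("display_driver", ["ui_render_test", "camera_preview_test"])
record_cycle("display_driver", ["ui_render_test", "tv_out_test"])
record_cycle("display_driver", ["ui_render_test"])

print(suggested_scope("display_driver"))
# ui_render_test failed in all three cycles, so it heads the suggested scope
```

This also illustrates the continuous-integration point on the next slide: the smaller and more isolated each recorded change is, the cleaner the attribution from change to failure becomes.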
  24. Continuous Integration
     • The more code changes we have, and the more isolated these code changes are, the easier it is to create the impact map
     • If you only have one integration per month, with thousands of lines of code changed in different parts of the system, it will be difficult to draw conclusions
  25. Problems?
     • The information value you receive is dependent on how good the tests are
     • The dependency graph grows over time – in the beginning there is very little data
  26. Early Exploratory Testing (flow): Change + Basic Risk Analysis → Exploratory Testing → Test Results Analysis → Scope Setting → Test Execution
  27. Codifying Experience
     • How do you transfer the knowledge gained from one tester to another?
     • The understanding of impact and dependencies we gain from performing exploratory testing must be stored in a way that makes it accessible to other testers for their risk-based scope selections
  28. Manual Dependency Map
     • After each exploratory session it is possible to add information to a dependency map, indicating which changes were made and what impact those changes had
     • It needs to be quick and easy to do, otherwise it will not be done at all
  29. Problems?
     • Additional time might be needed before testers gain experience with the new process
     • It will be more difficult to time-plan the activity before early exploratory testing is done – this can cause problems with test/project leaders and managers
     • It requires skilled testers
  30. Final Thoughts
     • Every risk-based scope introduces risk – our goal is to eliminate any unnecessary risk
     • Without skilled testers to write good automated tests or perform exploratory testing, risk-based testing becomes very difficult without a very intricate risk/impact framework
  31. Conclusion
     • It is very difficult to perform risk-based testing for a large, complex system
     • It is very easy to ask someone else to reduce their test scope by introducing risk-based testing
  32. Contact
     • Email: johan.hoberg@outlook.com
     • Slideshare: www.slideshare.net/JohanHoberg
