Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Paper may be downloaded at https://pure.fundp.ac.be/ws/files/7911785/VAMOS2014_FTS_statistical_prioritization.pdf

1. Towards Statistical Prioritization for Software Product Lines Testing
   Xavier Devroey*; Gilles Perrouin; Maxime Cordy; Pierre-Yves Schobbens; Axel Legay; Patrick Heymans
   8th International Workshop on Variability Modelling of Software-intensive Systems, VaMoS '14, Nice, France
   www.unamur.be
2. Plan
   • Introduction
   • Background
     – Featured Transition Systems
     – Product-Based Test Derivation
   • Family-Based Test Prioritization
   • Feasibility Assessment (Claroline case study)
   • Conclusion and Future Work
3. TESTING … in a Product Line Context
4. Testing Process
   Specification
5. Testing Process
   Specification
   1. Implemented: SUT
6. Testing Process
   Specification
   1. Implemented: SUT
   2. Derived: test cases, via operationalization
      {(pay, change, soda, serveSoda, open, take, close),
       (pay, change, tea, serveTea, open, take, close)}
   3. Executed
   4. Pass / Fail
7. Testing a Product Line
   Each product has its own test suite with Pass / Fail verdicts:
   {(pay, change, soda, serveSoda, open, take, close), (pay, change, tea, serveTea, open, take, close)}
   {(free, tea, serveTea, take), (free, soda, serveSoda, take), (free, cancel, return)}
   {(pay, change, tea, serveTea, open, take, close), (pay, change, cancel, return)}
   {(free, soda, serveSoda, take)}
8. Testing a Product Line
   Which is first?
   (the same four test suites as above)
9. SPECIFYING A PRODUCT LINE
   Featured Transition Systems (FTSs) [Classen et al. 2011]
10. Specifying a Product Line
11. Specifying a Product Line
12. Specifying a Product Line
13. Featured Transition System
    [Classen et al. 2011]
    Feature Diagram
14. TESTING LOTS OF SYSTEMS
    Which product first?
15. Which product first?
    • Particularly useful during regression testing
    • Using weights on features [Henard et al. 2013, Johansen et al. 2012]
      – Does not consider behaviour
    • Using weights (i.e., probabilities) on transitions
16. Usage Model
    • Statistical testing [Whittaker 1994]
    • Discrete-Time Markov Chain (DTMC)
    • Independent from the FTS
      → Allows usage of existing tools
      → Extraction method is agnostic of features
      → DTMC may be incomplete
        • Allows invalid paths ⇒ DTMC + FTS detects inconsistencies
17. Product-Based Test Derivation [Samih and Baudry 2012]
    Family-Based Test Prioritization
18. Product-Based Test Derivation [Samih and Baudry 2012]
    1. Product Selection
    Family-Based Test Prioritization
19. Product-Based Test Derivation [Samih and Baudry 2012]
    1. Product Selection
    2. FTS Projection
    Family-Based Test Prioritization
20. Product-Based Test Derivation [Samih and Baudry 2012]
    1. Product Selection
    2. FTS Projection
    3. DTMC Pruning
    Family-Based Test Prioritization
21. Product-Based Test Derivation [Samih and Baudry 2012]
    1. Product Selection
    2. FTS Projection
    3. DTMC Pruning
    Family-Based Test Prioritization
22. Product-Based Test Derivation [Samih and Baudry 2012]:
    1. Product Selection; 2. FTS Projection; 3. DTMC Pruning
    Family-Based Test Prioritization:
    1. Trace Selection:
       DFS(lmax = 7; Prmin = 0; Prmax = 0.1; DTMC) =
       {(pay, change, cancel, return; Pr = 0.01);
        (free, cancel, return; Pr = 0.09);
        (pay, change, tea, serveTea, open, take, close; Pr = 0.009);
        (pay, change, tea, serveTea, take; Pr = 0.081);
        (free, tea, serveTea, open, take, close; Pr = 0.081)}
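The DFS trace selection on slide 22 can be sketched as a bounded depth-first enumeration: paths are extended only while their cumulative probability stays at or above Prmin, and a trace is kept when it reaches an accepting state with a probability in [Prmin, Prmax] within lmax steps. This is a minimal sketch under assumed representations (the DTMC as an adjacency map, a given set of accepting states); the function name and signature are illustrative, not the authors' implementation.

```python
def dfs_traces(dtmc, start, finals, l_max, pr_min, pr_max):
    """Enumerate accepting traces of the usage model (DTMC) whose
    probability lies in [pr_min, pr_max], up to length l_max.
    `dtmc` maps a state to a list of (action, next_state, probability)."""
    traces = []

    def visit(state, trace, prob):
        # Keep the trace if it ends in an accepting state within bounds.
        if state in finals and pr_min <= prob <= pr_max:
            traces.append((tuple(trace), prob))
        if len(trace) >= l_max:
            return
        for action, nxt, p in dtmc.get(state, []):
            # Prune paths whose cumulative probability fell below pr_min.
            if prob * p >= pr_min:
                visit(nxt, trace + [action], prob * p)

    visit(start, [], 1.0)
    return traces
```

For example, on a two-branch toy DTMC (pay vs. free), the call returns both action sequences with their path probabilities; tightening pr_max filters out the more probable behaviours, which is how the slide's selection keeps only traces with Pr ≤ 0.1.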
23. Product-Based Test Derivation [Samih and Baudry 2012]:
    1. Product Selection; 2. FTS Projection; 3. DTMC Pruning
    Family-Based Test Prioritization:
    1. Trace Selection: DFS(lmax = 7; Prmin = 0; Prmax = 0.1; DTMC) as on slide 22
    2. Trace Filtering and FTS Pruning
24. Product-Based Test Derivation [Samih and Baudry 2012]:
    1. Product Selection; 2. FTS Projection; 3. DTMC Pruning
    Family-Based Test Prioritization:
    1. Trace Selection: DFS(lmax = 7; Prmin = 0; Prmax = 0.1; DTMC) as on slide 22
    2. Trace Filtering and FTS Pruning, using the feature expression (¬f∧t)∧ …
    3. Product Prioritization
25. Product-Based Test Derivation [Samih and Baudry 2012]:
    1. Product Selection; 2. FTS Projection; 3. DTMC Pruning
    Family-Based Test Prioritization:
    1. Trace Selection: DFS(lmax = 7; Prmin = 0; Prmax = 0.1; DTMC) as on slide 22
    2. Trace Filtering and FTS Pruning
    3. Product Prioritization
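The last family-based step (product prioritization, slides 24–25) can be sketched as ranking products by the total usage probability of the selected traces they can execute. Here `can_execute` is a hypothetical helper standing in for the FTS check (does the product's configuration satisfy every feature expression along the trace?); this is a sketch under that assumption, not the paper's implementation.

```python
def prioritize_products(products, traces, can_execute):
    """Rank products by the summed usage probability of the selected
    traces they can execute (highest first).
    `products`: iterable of product configurations;
    `traces`: list of (trace, probability) pairs from the DTMC selection;
    `can_execute(product, trace)`: True iff the product satisfies every
    feature expression labelling the trace's transitions in the FTS."""
    scores = []
    for product in products:
        score = sum(pr for trace, pr in traces if can_execute(product, trace))
        scores.append((score, product))
    # Stable sort on the score only (products need not be comparable).
    scores.sort(key=lambda sp: sp[0], reverse=True)
    return [(product, score) for score, product in scores]
```

A product covering more (or more probable) selected behaviours is then tested first, which matches the statistical-prioritization goal of exercising the most likely usages early.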
26. FAMILY-BASED PRODUCT PRIORITIZATION
    Feasibility assessment
27. www.unamur.be
28. Claroline: DTMC
    • Derived from an anonymized Apache access log (5.26 GB)
    • From January 1st to October 1st, 2013
    • 12,689,033 HTTP requests for PHP pages
    • (1 PHP page → 1 state) + initial state
    • 1 request → 1 transition
    • User session = sequence of requests (timeout = 45 min)
    • 2-gram without smoothing [Verwer et al. 2013]:
      Pr(si, sj) = #occurrence(si, sj) / Σ_{s ∈ S} #occurrence(si, s)
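The 2-gram estimation on slide 28 can be sketched as follows, assuming the sessions have already been cut from the log at the 45-minute timeout; the function name and the artificial initial-state marker are illustrative.

```python
from collections import Counter, defaultdict

def estimate_dtmc(sessions, initial="<init>"):
    """Estimate a DTMC from user sessions by 2-gram counting without
    smoothing: Pr(si, sj) = #occurrence(si, sj) / sum_s #occurrence(si, s).
    Each session is a sequence of page identifiers; an artificial
    initial state precedes every session."""
    counts = defaultdict(Counter)
    for session in sessions:
        prev = initial
        for page in session:
            counts[prev][page] += 1  # count the (prev, page) bigram
            prev = page
    # Normalize each state's outgoing counts into probabilities.
    dtmc = {}
    for src, succ in counts.items():
        total = sum(succ.values())
        dtmc[src] = {dst: n / total for dst, n in succ.items()}
    return dtmc
```

Because there is no smoothing, page transitions never observed in the log get probability zero, which is why the resulting usage model may be incomplete with respect to the FTS (as noted on slide 16).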
29. Claroline: models (http://info.fundp.ac.be/~xde/fts-testing/)
    • Usage Model (DTMC)
      – 96 states and 2149 transitions
      – 2 hours computation on Ubuntu Linux (Intel Core i3, 3.10 GHz, 4 GB mem.)
    • Feature Diagram (FD)
      – Built manually by inspecting a local Claroline instance
      – 44 features
      – Lots of optional features
    • Featured Transition System (FTS)
      – Web crawler on the local instance to get the pages
      – (1 page → 1 state) + initial state
      – Every state accessible from anywhere
      – Transitions tagged with feature expressions based on knowledge of the system
      – 107 states and 11236 transitions
30. Claroline: Setup and Results

    |                         | Run 1   | Run 2   | Run 3   | Run 4   |
    |-------------------------|---------|---------|---------|---------|
    | Lmax                    | 98      | 98      | 98      | 98      |
    | Pr min                  | 1E-4    | 1E-5    | 1E-6    | 1E-7    |
    | Pr max                  | 1       | 1       | 1       | 1       |
    | #DTMC traces            | 211     | 1389    | 9287    | 62112   |
    | #Valid traces           | 211     | 1389    | 9287    | 62112   |
    | Traces avg. size        | 4.82    | 5.51    | 6.35    | 7.17    |
    | Traces avg. proba       | 2.06E-3 | 3.36E-4 | 5.26E-5 | 8.10E-6 |
    | #Pruned FTS states      | 16      | 36      | 50      | 69      |
    | #Pruned FTS transitions | 66      | 224     | 442     | 844     |
31. Claroline: Discussion
    • Observations:
      – Even with a "simple" algorithm, computation time is reasonable
      – Independence of the features and small size of valid traces:
        the number of products associated with each trace is too large
    • Generate longer traces by coupling the probabilistic approach with state/transition coverage criteria
    • Select the minimal feature set needed to execute a trace
      – Use knowledge of the application domain
      – Select features according to their frequency in the feature expressions of valid traces
32. Claroline: Discussion
    • Multiple usage models: one per role (i.e., student, teacher, admin, visitor)
    • Use other selection criteria on the usage model
      – Least/most probable traces
    • Main threat: Web nature of the considered application
33. CONCLUSION … and Future Work
34. Conclusion
    • Contribution:
      – A first approach prioritizing behaviours statistically for testing SPLs in a family-based manner
    • Future work:
      – Improve the exploration algorithm to support other "statistical selection" criteria on the usage model
        • Least/most probable behaviours
      – Combine structural selection criteria with statistical testing in an SPL context
        • State coverage, transition coverage, transition-pair coverage, path coverage, etc.
35. THANK YOU FOR YOUR ATTENTION!
    Models and tools available at http://info.fundp.ac.be/~xde/fts-testing/
    E-mail: xavier.devroey@unamur.be