Model Based Test Validation and Oracles for Data Acquisition Systems

ASE 2013 presentation on model-based testing of satellite data acquisition systems, research project with SES



  1. Model Based Test Validation and Oracles for Data Acquisition Systems. University of Luxembourg, Interdisciplinary Centre for Security, Reliability and Trust, Software Verification and Validation Lab. November 14, 2013. Daniel Di Nardo, Nadia Alshahwan, Lionel Briand, Elizabeta Fourneret (University of Luxembourg); Tomislav Nakic, Vincent Masquelier (SES S.A., Luxembourg).
  2. Context: Data Acquisition Systems. [Diagram: a DAQ System with three elements: Data (structured/complex input transmissions), Configurations (define how input transmissions are processed), and Log Files (capture what happened while processing).]
  3. Context and Problem Definition
     • The complexity of DAQ systems lies in the structure of the input transmission data and output log files, and in the mappings between these two elements.
     • Problem:
       – Testing of such systems is difficult.
       – Test cases can be composed of large and highly complex input and output files.
  4. Why is DAQ Testing so Hard?
     • Manually constructing test inputs is too much work.
     • Input data has a complex, specific structure:
       – dependencies between different fields
       – dependencies between input fields and configuration fields
     • System output logs are also complex and their size can grow very large:
       – manual inspection is time-consuming
     • Test oracle problem:
       – manual validation of test outputs is challenging and error-prone
     • Automation specifically designed for these systems is required.
  5. State of the Art
     • No directly related work.
     • Model-based testing (MBT):
       – large body of work
       – focused on behavioural models
       – not appropriate for modelling complex input/output file mappings
     • Modelling DAQ systems:
       – focused on simulation and design
       – test oracle automation is not addressed by current approaches
       – test case validation has not yet been addressed
  6. How to Represent the System? [Diagram: the DAQ System with its Data, Configurations, and Log Files.]
  7. A Model Based Approach. [Diagram: the DAQ System with its Input Transmission File, Config Files, and Output Log Files; OCL constraints relate them: Input-Input, Input-Config, and Input-Output constraints.]
  8. Modelling Methodology
     • Model system input and output data to automate test validation and oracle checking.
     • Requirements and domain knowledge are used without source code analysis → the approach is black box.
     • A modelling notation alone is not sufficient: a precise methodology to support the modelling objectives is necessary.
  9. Transmission File Structure
     • A satellite transmission consists of multiple Channel Data Units.
     [Diagram: a Channel Data Unit contains a Sync Marker, Frame Data, and Reed Solomon check symbols. The Frame Data contains a Header and a Packet Zone; the Header holds the Spacecraft ID, Virtual Channel ID, Frame Count, and Header Error Control.]
  10. Modelling the File Structure. [Class diagram of the input file structure model.]
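The Channel Data Unit layout above can be sketched as a small parser. This is only an illustration, assuming CCSDS AOS-style field widths (a 4-byte sync marker, then a 2-bit version, 8-bit spacecraft ID, 6-bit virtual channel ID, and 24-bit frame count); the real system's binary layout may differ.

```python
# Sketch of parsing one Channel Data Unit header; field widths are an
# assumption (CCSDS AOS-style), not taken from the presented system.
SYNC_MARKER = bytes.fromhex("1ACFFC1D")  # standard CCSDS attached sync marker

def parse_cadu_header(cadu: bytes) -> dict:
    """Extract header fields from a Channel Data Unit (sync marker + frame data)."""
    if cadu[:4] != SYNC_MARKER:
        raise ValueError("missing sync marker")
    header = cadu[4:10]                       # assumed 6-byte frame data header
    first16 = (header[0] << 8) | header[1]
    return {
        "version": first16 >> 14,             # top 2 bits
        "spacecraftId": (first16 >> 6) & 0xFF,
        "virtualChannelId": first16 & 0x3F,
        "frameCount": (header[2] << 16) | (header[3] << 8) | header[4],
    }
```

Such a parser would be the natural front end for instantiating the input model from a raw transmission file.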
  11. Example Constraint on Input and Configuration
     • Constraint to ensure that the virtual channelId is valid (it can be one of many active channels, or the idle channel):

       context Header inv:
         let config : configurationData =
           self.frameData.channelData.transmissionData.configuration
         in
           config.cId->exists(x | x = self.channelId)
           or self.channelId = config.idleCId
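A rough Python rendering of this input/configuration constraint, assuming the configuration exposes a list of active channel ids (cId) and an idle channel id (idleCId); the names mirror the model, not a real API:

```python
# Hypothetical translation of the OCL constraint above: the header's
# channelId must be an active virtual channel or the idle channel.
def channel_id_is_valid(channel_id, active_ids, idle_id):
    """Return True when the channelId satisfies the input/config constraint."""
    return channel_id in active_ids or channel_id == idle_id
```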
  12. Example Oracle Constraint
     • Constraint to ensure that the COUNTER_JUMP event is captured in the output log:

       context ChannelData inv:
         let
           frameCount : Integer = self.frameData.header.frameCount,
           prevFrameCount : Integer = self.prevOnChannel.frameData.header.frameCount
         in
           not self.prevOnChannel->isEmpty() and
           if prevFrameCount < 16777215
           then frameCount <> prevFrameCount + 1
           else prevFrameCount = 16777215 and frameCount <> 0
           endif
           implies
             self.transmissionData.outputData.frameDataReport.event
               .eventType->exists(i | i = Events::COUNTER_JUMP)
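The condition in this constraint can be mirrored in Python. This is a sketch of the oracle's trigger only: it returns True when the frame count did not advance by exactly one (taking the wrap from 16777215, i.e. 2^24 - 1, back to 0 into account), which is when the log must contain a COUNTER_JUMP event. Names are illustrative.

```python
from typing import Optional

MAX_FRAME_COUNT = 16777215  # 2**24 - 1: the frame counter wraps after this value

def expect_counter_jump(prev_count: Optional[int], count: int) -> bool:
    """True when a COUNTER_JUMP event is required in the output log."""
    if prev_count is None:              # no previous frame on this channel
        return False
    if prev_count < MAX_FRAME_COUNT:
        return count != prev_count + 1  # counter must advance by exactly one
    return count != 0                   # after the maximum, it must wrap to 0
```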
  13. Modelling Methodology – Applications
     Four applications of the model in practice:
     1. Test Design and Generation
     2. Test Oracle
     3. Specifications Refinement
     4. Run-Time Verification
  14. Modelling Methodology – Applications
     Four applications of the model in practice:
     1. Test Design and Generation
     2. Test Oracle
     3. Specifications Refinement
     4. Run-Time Verification
  15. Automation. Tool architecture for the automation of test validation and oracles. [Figure: tool architecture.]
  16. Case Study – Subject Selection
     • Selected a representative, complex DAQ system.
     • The system accepts a binary satellite transmission data file plus configuration files; log files report on the results of processing the input file.
     • 32 manually written test cases are approved for system validation.
     • We also use real transmission files (2 GB) to validate our approach and assess its scalability.
  17. Case Study – Execution and Data Collection. Process: Study DAQ System -> Create the Model -> Validate the Test Inputs -> Check Oracle on Test Inputs/Outputs -> Record Execution Times.
  18. Case Study – RQ1. RQ1: How much effort is needed to produce the model for a real system?
  19. Case Study Results – Modelling Effort
     • Modelling effort: one man-month. The time needed is largely dependent on the person's domain knowledge and expertise in modelling and in OCL.
     • Size of the model: a surrogate measure to estimate the effort needed to follow our modelling methodology in a specific context.
  20. Case Study Results – Modelling Effort
     • Size of the Input, Configuration, and Output models created for the case study system:

       File          | Classes | Attributes | Associations | Generalisations
       Input         |      36 |        156 |           17 |               4
       Configuration |       9 |         30 |            6 |               1
       Output        |      23 |        132 |           15 |               0
       Total         |      68 |        318 |           38 |               5
  21. Case Study Results – Modelling Effort
     • Constraints for the case study system, classified by the files to which they apply:

       File                       | # of Constraints | # of Clauses | # of Opers. on Collections | # of Iterative Opers.
       Input                      |               15 |           30 |                          6 |                     1
       Input/Configuration        |               12 |           54 |                         14 |                     6
       Input/Output               |               10 |           38 |                          2 |                    10
       Input/Configuration/Output |               12 |           87 |                         15 |                    19
       Total                      |               49 |          209 |                         37 |                    36
  22. Case Study Results – Modelling Effort
     RQ1: How much effort is needed to produce the model for a real system?
     Answer:
     • The size of the model is much smaller than what is typically observed when modelling system designs.
     • The cost of modelling was considered acceptable by the system's engineers.
  23. Case Study – RQ2. RQ2: How long does it take to validate test cases and check the oracle?
  24. Acceptance Test Cases – Execution Time
     • Minimum, maximum, and average execution times over all acceptance test cases:

       Operation             | Min (ms) | Max (ms) | Avg. (ms)
       Model Instantiation   |      684 |      845 |       762
       Test Input Validation |        1 |       56 |        41
       Oracle Checking       |      < 1 |       39 |        31
       Total                 |      685 |      940 |       834
  25. Large Transmissions – Model Instantiation Time. [Plot: model instantiation time by input file size.]
  26. Large Transmissions – Input Validation Time. [Plot: input validation time by input file size.]
  27. Large Transmissions – Oracle Validation Time. [Plot: oracle checking time by input file size.]
  28. Case Study Results – RQ2
     RQ2: How long does it take to validate test cases and check the oracle?
     Answer:
     • Our approach is scalable in terms of execution time.
     • Test validation and oracle checking on real transmission files is manageable in practice: less than 3 minutes for input constraints and less than 50 minutes for oracle constraints.
     • The linear relationship between input file size and execution time makes it possible to process much larger files.
  29. Case Study – RQ3. RQ3: Is the model effective in practice in validating test cases and checking the oracle? Is it effective in uncovering issues, if any, in the input files, the DAQ system, or the specifications of the system?
  30. Case Study Results – RQ3
     • We validated the 32 actual test cases using our tool.
     • Our test validation approach could help identify specification changes without the need to execute the transmission file on the system.
     • No violations of the oracle constraints were reported.
     • When validating the real transmission files, we found that in some files many input constraints were violated.
  31. Case Study Results – RQ3
     RQ3: Is the model effective in practice in validating test cases and checking the oracle? Is it effective in uncovering issues, if any, in the input files, the DAQ system, or the specifications of the system?
     Answer:
     • Our approach is effective in validating test cases and checking the oracle.
     • It is also able to identify implicit changes in the specifications of the input file and the DAQ system.
  32. Success Story
     • SES system integration: deploy the oracle checker into the SES build process for the current system under evaluation.
     • Training / knowledge transfer to the industry partner.
     • Apply our methodology to other DAQ systems.
  33. Overall Approach. [Figure: overview of the overall approach.]
  34. Current Progress on Approach. The oracle and input checker is fully working.
  35. Next Steps: Automated Input Generation
     • Currently defining a method for automatically generating erroneous inputs based on realistic faults.
     • Example faults: flipped bits, missing packets.
  36. Conclusion
     • Created an automated test validation and oracle checking approach for systems with complex inputs/outputs and mappings between inputs and outputs, e.g., Data Acquisition Systems.
     • The approach is driven by models of the input/output structure and content.
     • We defined a specific modelling methodology using UML class diagrams and OCL constraints.
     • The case study shows that the modelling approach is scalable: input and oracle validation executed within reasonable times.
     • And people are using it!
     Supported by the Fonds National de la Recherche, Luxembourg (FNR/P10/03 and FNR 4082113).
  37. [Blank slide.]
  38. Next Steps
     • Automated test case generation. A simple approach:
       – start with a valid input file
       – mutate values corresponding to model leaf nodes (e.g., flip bits)
     • Possible selection criteria:
       – break all constraints
       – break combinations of constraints
     • Mutate thousands of times -> run through the input validator -> select the most diverse test cases for the test suite.
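The bit-flip mutation step described above can be sketched in a few lines. This is a minimal illustration with hypothetical helper names; selecting the most diverse mutants against the model's constraints is assumed to happen afterwards.

```python
import random

def flip_bit(data: bytes, rng: random.Random) -> bytes:
    """Return a copy of `data` with one randomly chosen bit flipped."""
    mutated = bytearray(data)
    pos = rng.randrange(len(mutated))  # pick a byte
    bit = rng.randrange(8)             # pick a bit within it
    mutated[pos] ^= 1 << bit
    return bytes(mutated)

def mutate_many(data: bytes, n: int, seed: int = 0) -> list:
    """Generate n single-bit-flip mutants of a valid input file."""
    rng = random.Random(seed)
    return [flip_bit(data, rng) for _ in range(n)]
```

Each mutant differs from the original in exactly one bit, so every mutant exercises one localized, realistic fault.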
  39. Progress: Tool Implementation
     • Implemented a testing framework that:
       – loads an existing input file into the model
       – checks constraints on input and configuration
       – checks constraints on output
     • Used the tool to validate the model against existing SES test cases:
       – helps in reviewing the model and constraints
       – an initial test of the approach where we know the expected result
  40. Modelling Methodology – File Structure
     • Mapping from file items to model elements:

       File Item                     | Model Element      | Example
       Field                         | Class              | ChannelData
       Leaf Field                    | Class or Attribute | Sync, spacecraftId
       Property                      | Attribute          | dataLength in the Packet class
       Containment                   | Composition        | ChannelData is composed of Sync and FrameData
       Alternative Sub-components    | Generalisation     | PacketZone can either be an IdlePacketZone or a PacketSet
       Optional/Multi Sub-components | Multiplicity       | One PacketSet can have one or many instances of Packet
       Dependency                    | Association        | The association between TransmissionData and Configuration
       Computation                   | Operation          | calculateRsCrc
  41. Case Study Results – Threats to Validity
     • Internal threats: we used all the test cases and transmission files provided by the system testers to avoid experimenter bias.
     • External threats: the results might only be relevant in the DAQ application domain; nevertheless, this domain is important and widely used.
     • Construct threats: to study scalability, we used the size of the model and constraints, and the execution time of the validation and oracle checking processes. Execution time might depend on the content of the files used; however, the transmission files are not only real but also representative.
