
Exploratory Analysis of the Performance of a Configurable CEGAR Framework

Presentation of our paper at the 24th Minisymposium of the Department of Measurement and Information Systems, Budapest University of Technology and Economics (Minisy@DMIS 2017), Budapest, Hungary.


  1. Title slide: Exploratory Analysis of the Performance of a Configurable CEGAR Framework. Ákos Hajdu (1,2), Zoltán Micskei (1). (1) Budapest University of Technology and Economics, Department of Measurement and Information Systems; (2) MTA-BME Lendület Cyber-Physical Systems Research Group. 24th Minisymposium of DMIS, 31.01.2017.
  2. Background – Formal verification. A real-life system is captured as a formal model together with a formal requirement (e.g. ¬(Red ∧ Green)); verification explores the state space. CEGAR alternates abstraction and refinement until it either proves the model safe or finds a counterexample (a minimal sketch of this loop follows the slide list).
  3. Motivation: a configurable CEGAR framework supports different algorithm configurations and different kinds of models. Which is the "best" configuration? This work is a preliminary experiment and evaluation. Reference: Á. Hajdu, T. Tóth, A. Vörös, and I. Majzik, "A configurable CEGAR framework with interpolation-based refinements," in Formal Techniques for Distributed Objects, Components and Systems, ser. LNCS, vol. 9688. Springer, 2016, pp. 158–174.
  4. Variables of the problem. Input variables describing the model: system type (hardware/PLC), name, number of variables, size. Input variables describing the configuration: domain of abstraction (Pred./Expl.), refinement strategy (Craig itp./Seq. itp./Unsat core), initial precision (Empty/Prop.), search strategy (BFS/DFS).
  5. Variables of the problem. Output variables: whether the model is safe, execution time, number of refinement iterations, size of the ARG (Abstract Reachability Graph), depth of the ARG, and length of the counterexample (cex). (A record type listing these variables is sketched after the slide list.)
  6. Measurement procedure: 18 input models (12 hardware benchmarks from HWMCC and 6 PLC programs from a particle accelerator), 20 algorithm configurations, each combination repeated 5 times with a 480 s timeout. This yields 1800 measurement points, of which 1120 were successful. (A sketch of such a measurement loop follows the slide list.)
  7. Research questions. RQ1: overall, high-level properties; RQ2: effect of individual input parameters; RQ3: influence of input parameters on the outputs. Threats to validity: external (representative input models), internal (repetitions, dedicated machine).
  8. RQ1: Overall, high-level properties. [Plot of an output variable per configuration; annotations: many outliers, small IQR.]
  9. RQ1: Overall, high-level properties. [Plot of average execution time (ms, log scale); annotations: easy problems; varying difficulty; high success rate; single configuration, but short time; highlighted configuration: Pred, Seq. Itp., Prop., DFS.] (A summary-statistics sketch follows the slide list.)
  10. RQ2: Effect of individual input parameters. [Plot of execution time (ms); annotation: explicit-value abstraction is more efficient for PLCs.]
  11. RQ2: Effect of individual input parameters. [Plot of the number of iterations; annotations: fewer iterations with sequence interpolation, large difference for some PLCs.]
  12. RQ3: Influence of input parameters on the outputs. [Annotations: the predicate domain performs poorly for PLCs but well for hardware; the explicit domain with Craig interpolation is good in general.] (A grouped-comparison sketch follows the slide list.)
  13. Conclusions. The CEGAR framework supports different configurations and different systems. Preliminary results: different configurations are more suitable for different tasks, and there are connections between the input and output variables. Future work: improving the framework, further analysis, heuristics. Contact: inf.mit.bme.hu/en/members/hajdua
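
The CEGAR loop summarized on slide 2 can be sketched in a few lines. This is a minimal, generic sketch, not the authors' framework: the abstraction, model-checking, spuriousness-check and refinement steps are passed in as callables, since their concrete implementations (predicate/explicit domains, interpolation-based refiners) are exactly what the paper's configurations vary.

```python
def cegar(build_abstraction, model_check, is_spurious, refine, model, requirement, precision):
    """Generic CEGAR loop sketch; the four callables are placeholders, not the authors' API.

    Returns a (verdict, iterations) pair, where verdict is "safe" or "unsafe".
    """
    iterations = 0
    while True:
        iterations += 1
        abstraction = build_abstraction(model, precision)      # over-approximate the model
        abstract_cex = model_check(abstraction, requirement)   # explore the abstract state space
        if abstract_cex is None:
            return "safe", iterations                          # no abstract counterexample exists
        if not is_spurious(abstract_cex, model):
            return "unsafe", iterations                        # counterexample is concretizable
        precision = refine(precision, abstract_cex)            # e.g. interpolation-based refinement
```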
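Slides 4 and 5 enumerate the input and output variables of one measurement point. A record type capturing them could look like the following; the field names and string encodings are our own illustration, not taken from the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MeasurementPoint:
    """One run of one configuration on one model (field names are illustrative)."""
    # Input variables: model (slide 4)
    system_type: str            # "hardware" or "PLC"
    model_name: str
    variable_count: int
    model_size: int
    # Input variables: configuration (slide 4)
    domain: str                 # "Pred" or "Expl"
    refinement: str             # "CraigItp", "SeqItp" or "UnsatCore"
    initial_precision: str      # "Empty" or "Prop"
    search: str                 # "BFS" or "DFS"
    # Output variables (slide 5); None if the run timed out
    safe: Optional[bool] = None
    time_ms: Optional[float] = None
    iterations: Optional[int] = None
    arg_size: Optional[int] = None
    arg_depth: Optional[int] = None
    cex_length: Optional[int] = None
```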
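Slide 6 describes the measurement procedure: 18 models, 20 configurations, 5 repetitions, 480 s timeout. A sketch of such a campaign is below. `run_verifier` is a hypothetical callable standing in for the actual tool invocation; note that the full product of the slide-4 options has 24 combinations, while the study used 20, so the exact configuration set here is only an assumption for illustration.

```python
import itertools

# Configuration options from slide 4; their full product has 2 * 3 * 2 * 2 = 24 combinations,
# whereas the study used 20, so this enumeration is illustrative, not the paper's exact set.
DOMAINS = ["Pred", "Expl"]
REFINEMENTS = ["CraigItp", "SeqItp", "UnsatCore"]
INIT_PRECISIONS = ["Empty", "Prop"]
SEARCHES = ["BFS", "DFS"]

def run_campaign(models, run_verifier, repetitions=5, timeout_s=480):
    """Run every model with every configuration, repeated; collect one record per run.

    `run_verifier(model, config, timeout_s)` is a hypothetical callable returning a dict
    of the slide-5 output variables, or None when the run timed out.
    """
    configs = list(itertools.product(DOMAINS, REFINEMENTS, INIT_PRECISIONS, SEARCHES))
    results = []
    for model, config, rep in itertools.product(models, configs, range(repetitions)):
        outcome = run_verifier(model, config, timeout_s)
        record = {"model": model, "config": config, "repetition": rep,
                  "success": outcome is not None}
        if outcome is not None:
            record.update(outcome)
        results.append(record)
    return results
```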
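For the RQ1-style overview on slides 8 and 9 (outliers, IQR, execution time on a log scale), a summary over the collected runs could be computed roughly as follows with pandas; the column names ('config', 'time_ms') are assumptions about how the data is tabulated, not the paper's schema.

```python
import numpy as np
import pandas as pd

def rq1_overview(runs: pd.DataFrame) -> pd.DataFrame:
    """Per-configuration execution-time overview: median, IQR, outlier count, mean log10 time.

    `runs` is assumed to hold one row per successful run with columns 'config' and 'time_ms'.
    """
    grouped = runs.groupby("config")["time_ms"]
    summary = grouped.agg(median_ms="median",
                          q1_ms=lambda s: s.quantile(0.25),
                          q3_ms=lambda s: s.quantile(0.75))
    summary["iqr_ms"] = summary["q3_ms"] - summary["q1_ms"]

    # Box-plot convention: points outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] count as outliers.
    def count_outliers(s: pd.Series) -> int:
        q1, q3 = s.quantile(0.25), s.quantile(0.75)
        iqr = q3 - q1
        return int(((s < q1 - 1.5 * iqr) | (s > q3 + 1.5 * iqr)).sum())

    summary["outliers"] = grouped.apply(count_outliers)
    summary["mean_log10_ms"] = grouped.apply(lambda s: float(np.log10(s).mean()))
    return summary
```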
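Slides 10-12 compare configuration options across system types (e.g. explicit-value abstraction being more efficient on PLC models, the predicate domain working well on hardware). A grouped comparison of that kind could be sketched as below; again, the column names ('system_type', 'domain', 'time_ms', 'iterations') are illustrative assumptions.

```python
import pandas as pd

def compare_by_parameter(runs: pd.DataFrame, parameter: str) -> pd.DataFrame:
    """Median execution time and iteration count, split by system type and one parameter.

    `runs` is assumed to contain columns 'system_type' (e.g. "hardware"/"PLC"), the chosen
    parameter column (e.g. 'domain' or 'refinement'), 'time_ms' and 'iterations'.
    """
    return (runs
            .groupby(["system_type", parameter])[["time_ms", "iterations"]]
            .median()
            .unstack(parameter))

# Example use (hypothetical data frame `runs`):
#   compare_by_parameter(runs, "domain")       # does Pred vs. Expl differ for hardware vs. PLC?
#   compare_by_parameter(runs, "refinement")   # fewer iterations with sequence interpolation?
```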
