Model-driven performance prediction of distributed real-time embedded defence systems

Presentation at the 18th International Conference on Engineering of Complex Computer Systems (ICECCS), July 2013, Singapore. More details about the paper at https://sites.google.com/site/vaneachiprianov/papers .

  1. Model-driven performance prediction of distributed real-time embedded defence systems
     Katrina Falkner, Nickolas Falkner, James Hill, Dan Fraser, Marianne Rieckmann, Vanea Chiprianov, Claudia Szabo, Gavin Puddy, Adrian Johnston, Andrew Wallis
  2. Agenda
     • Model-driven engineering and system execution modelling for defence systems
     • The architecture of the performance prediction system
     • Early validation on an Unmanned Air Vehicle (UAV)
     • Conclusion and perspectives
  3. Model-driven engineering and system execution modelling for defence systems
     • Requirements of DRE defence systems:
       – Long life-cycles
       – Change in development philosophies
       – Modular design
       – Reuse
       – Greater concern for non-functional properties: space, weight, power
  4. Model-driven engineering and system execution modelling for defence systems
     • Performance prediction, as an iterative loop (see the sketch below):
         while (!perfModel.satisfies(userPerfGoal)) {
             perfModel = improvedPerfModel;
         }
     • Model-driven engineering
       – Model
       – Execute
     • System execution modelling (SEM)
       – Performance specificity
       – Hardware testbeds
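In code, the slide's prediction loop might look like the sketch below. All type and method names (PerformanceModel, PerformanceGoal, refine) are illustrative assumptions; the slides give only the pseudocode above.

    // Hypothetical types illustrating the slide's refinement loop; the names
    // are assumptions, not an API defined in the paper.
    interface PerformanceGoal {
        boolean isSatisfiedBy(PerformanceModel model);
    }

    interface PerformanceModel {
        PerformanceModel refine(); // produce an improved model, e.g. after re-running the SEM
    }

    final class PredictionLoop {
        static PerformanceModel predict(PerformanceModel initial, PerformanceGoal goal) {
            PerformanceModel model = initial;
            // Iterate until the model meets the user's performance goal.
            // A real implementation would bound the number of iterations.
            while (!goal.isSatisfiedBy(model)) {
                model = model.refine();
            }
            return model;
        }
    }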
  5. The architecture of the performance prediction system
  6. Modelling
     • Modelling the system under study (SUS) as the SEM (see the sketch below):
       – Systemic structure
       – Functional behaviour
       – Workload
       – Deployment
     • Modelling scenarios:
       – Simulate realistic interactions
       – Analyse performance of the SUS
       – Scenario Domain-Specific Language (DSL)
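As a rough illustration of the four modelling facets on this slide, the hypothetical Java records below name one data holder per facet. The field names are assumptions; the slides do not show the actual metamodel.

    import java.util.List;
    import java.util.Map;

    // One hypothetical data holder per facet of the system under study.
    record Component(String name) {}                                   // systemic structure
    record Behaviour(String component, List<String> actionSequence) {} // functional behaviour
    record Workload(String component, int cpuMillisPerEvent) {}        // workload
    record Deployment(Map<String, String> componentToNode) {}          // deployment

    record SystemExecutionModel(List<Component> structure,
                                List<Behaviour> behaviours,
                                List<Workload> workloads,
                                Deployment deployment) {}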
  7. Executing
     • Executing the system execution model (SEM), layer by layer (see the workload sketch below):
       – Application: SEM + scenarios
       – Middleware: Data Distribution Service (DDS)
       – Operating system
       – Hardware
     • Executing scenarios:
       – Platform-specific information
       – Code generation of distributed units
       – Deployment
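System execution modelling tools commonly emulate a component's CPU demand by busy-spinning for the modelled service time on the target testbed. A minimal sketch, assuming a busy-wait emulation strategy (the slides do not specify one):

    final class WorkloadEmulator {
        // Busy-spin for roughly the given number of milliseconds to emulate
        // the CPU demand a component would place on the testbed hardware.
        static void emulateCpuWorkload(long millis) {
            long deadline = System.nanoTime() + millis * 1_000_000L;
            long sink = 0;
            while (System.nanoTime() < deadline) {
                sink += 1; // keep the CPU busy
            }
            if (sink == -1) System.out.println(sink); // never true; defeats dead-code elimination
        }
    }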
  8. Evaluating and predicting
     • Collect execution traces
     • Aggregate metrics
     • Evaluate (see the sketch below):
         if (perfModel.meets(perfConstraints)) { ... }
     • Visualize
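A minimal sketch of the collect/aggregate/evaluate steps, assuming a hypothetical trace-event format (component name plus start and end timestamps); the paper's actual trace schema is not shown here.

    import java.util.List;

    // Hypothetical trace record; the fields are assumptions.
    record TraceEvent(String component, long startNanos, long endNanos) {}

    final class Evaluator {
        // Aggregate per-component service time from raw execution traces.
        static double totalServiceMillis(List<TraceEvent> trace, String component) {
            return trace.stream()
                        .filter(e -> e.component().equals(component))
                        .mapToLong(e -> e.endNanos() - e.startNanos())
                        .sum() / 1_000_000.0;
        }

        // Check an aggregated metric against a performance constraint,
        // mirroring the slide's if (perfModel.meets(perfConstraints)) step.
        static boolean meetsConstraint(double observedMillis, double budgetMillis) {
            return observedMillis <= budgetMillis;
        }
    }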
  9. Early validation on an Unmanned Air Vehicle
     • Scenario: the UAV goes underwater => change in bandwidth => change in CPU workload
     [Figures: UAV in the air; UAV going underwater]
  10. Early validation on an Unmanned Air Vehicle
      [Figures: systemic structural model of the SUS; behavioural and workload models of the SUS]
  11. Early validation on an Unmanned Air Vehicle
      • Evaluating utilization: u = service time / runtime (see the sketch below)
        u_AIR = 4.15%, u_SUB = 59.6% for workload = 150 msec
      [Figure: execution traces of the SEM]
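The utilization metric is a single ratio; the sketch below computes it as a percentage. The reported values (u_AIR = 4.15%, u_SUB = 59.6% at a 150 msec workload) come from the slide; the function inputs are whatever service time and runtime the execution traces yield.

    final class Utilization {
        // u = service time / runtime, expressed as a percentage.
        static double utilizationPercent(double serviceTimeMillis, double runtimeMillis) {
            return 100.0 * serviceTimeMillis / runtimeMillis;
        }
        // Example: utilizationPercent(59.6, 100.0) == 59.6. The absolute times
        // are illustrative; only the ratios are reported on the slide.
    }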
  12. Conclusion and perspectives
      • Model-driven performance prediction system:
        – Integration of realistic data sources
        – Visualization of the causes of performance issues
        – Understanding of models and relationships
      • Perspectives:
        – Graphical Scenario DSL
        – Performance DSL
        – Multi-modelling DSL
