SICOMORO-CM: DEVELOPMENT
OF TRUSTWORTHY
SYSTEMS VIA MODELS AND
ADVANCED TOOLS
Projects Showcase@STAF Marburg, 2017
E. Albert1, P. C. Cañizares1, E. Guerra2, J. de Lara2,
E. Marcos4, M. Núñez1, G. Román-Díez3,
J. M. Vara4, D. Zanardini3
1Universidad Complutense de Madrid
2Universidad Autónoma de Madrid
3Universidad Politécnica de Madrid
4Universidad Rey Juan Carlos
PROJECT
Funded by the Madrid region government
• European Social Fund
R+D programme to foster collaboration among the universities in Madrid
• Teams from at least 3 different universities
• 4 years, with a review in year 2
Funded with €635,088.65
Started in October 2014
CONSORTIUM
UCM-TER: Testing and Performance Evaluation research group
• Formal methods in testing, analysis and performance evaluation of systems
• http://antares.sip.ucm.es/testing/
• Project coordinator
UAM-miso: Modelling and Software Engineering research group
• Model driven engineering
• http://miso.es
COSTA research group
• Formal techniques to optimize, verify and understand programs
• UCM and UPM
• http://costa.ls.fi.upm.es
URJC-KYBELE research group
• Software engineering, Model-driven engineering, Service oriented engineering
• http://www.kybele.es
CONSORTIUM
ASSOCIATED PARTNERS
Companies:
• Brain-tec
• CAF Signalling
• IBM
• IK4-IKERLAN
• Itegrasys
• Kybele Consulting
• OpenCanarias
• SINTEF
• Thales
Research groups:
• Ansymo (Vangheluwe)
• ATeSS (Hierons)
• AtlanMod (Cabot)
• ES (Paige)
• FM-CWI (de Boer)
• InfoLab (Papazoglou)
• MDSDev (Varró)
• Microsoft Research Cambridge (Rybalchenko)
• NetSoft (Cavalli)
• Premodela (Broch Johnsen)
• SE (Hähnle)
GOALS
Introduce methodologies, supported by tools, to develop trustworthy
and high-quality software
• using a rigorous process
• covering all development phases
Application to particular domains
• services
• cloud systems
Technically
• Combine MDE, formal testing methods, and program analysis
• Tool integration: in Eclipse and on the web (collaborative environment)
WORK PACKAGES
9 work packages
1. Executable and trustworthy models
2. Verification of models and transformations
3. Transformations as a service
4. Verification and validation of systems
5. Model-based systems validation
6. Virtual collaborative environment
7. Modelling service operations
8. Cloud systems: model, verification and validation
9. Dissemination
OB1: EXECUTABLE AND
TRUSTWORTHY MODELS
1. Specify DSLs in a cost-effective way
2. Analyse DSL semantics
• Operational
• Denotational
Results:
• Transformation reuse (in-place, out-place) [dLG17]
• Develop DSLs through examples [LGG17]
• Evaluate effectiveness of concrete syntax [GVB17]
[dLG17] de Lara, Guerra: A Posteriori Typing for Model-Driven Engineering: Concepts, Analysis,
and Applications. ACM TOSEM 25(4): 31:1-31:60 (2017)
[LGG17] López-Fernández, Garmendia, Guerra, de Lara: Example-Based Generation of
Graphical Modelling Environments. ECMFA 2016: 101-117
[GVB17] Granada, Vara, Brambilla, Bollati, Marcos: Analysing the Cognitive Effectiveness of the
WebML Visual Notation. Software and Systems Modeling 16(1): 195-227 (2017)
[Figure: three strategies for transformation reuse — (a) transformational approach, (b) operation
adaptation, (c) retyping — relating metamodels MMA/MMB, models MA/MB, and operations
operA/operB through «instance of», «typed on», and «applicable to» relationships.]
[Figure: the metaBUP example-driven process — domain experts sketch examples in informal
drawing tools (yEd, PowerPoint); metaBUP automatically induces the abstract syntax
(meta-model) and the concrete syntax (including spatial relations), producing a modelling tool
based on Eclipse, EMF and Sirius.]
[Figure: CEViNEdit — decisions about the concrete syntax are collected in a high-level model
over an Ecore metamodel; domain experts receive contextual help based on Moody's
principles, and the output is a modelling tool plus metrics reports.]
[Figure: CEViNEdit architecture — a Language Analyzer and a Moody's Metrics Analyzer
process the .ecore metamodel and generate a report plus the GMF artefacts (.gmfgraph,
.gmftool, .gmfmap) of the modelling tool.]
OB2: VERIFICATION OF MODELS
AND TRANSFORMATIONS
1. Analyse properties of (meta-)models
2. Analyse properties of transformations
• Model-to-model, Model-to-text, In-place
Results:
• Static analysis of ATL [SGdL17]
• Traceability analysis [JVBM15]
• DSLs for meta-model validation & verification [LGdL17]
[LGdL17] López-Fernández, Guerra, de Lara: Combining Unit and Specification-Based Testing for
Meta-Model Validation and Verification. Inf. Syst. 62: 104-135 (2016)
[SGdL17] Sánchez-Cuadrado, Guerra, de Lara: Static Analysis of Model Transformations. IEEE
TSE (in press, 2017)
[JVBM15] Jiménez, Vara, Bollati, Marcos: MeTAGeM-Trace: Improving Trace Generation in Model
Transformations by Leveraging the Role of Transformation Models. SCICO 98: 3-27 (2015)
anATLyzer: http://sanchezcuadrado.es/projects/anatlyzer/
Analysis of the ATL Zoo
• All transformations in the Zoo have some problem
[Screenshots: the anATLyzer tool and its analysis of the ATL Zoo.]
OB3: TRANSFORMATIONS AS
A SERVICE
1. Optimize and execute transformations (aaS) in the cloud
2. Distributed, streaming transformations
Results:
• Distil [SGdL15]
• Collaborations with other groups:
• ATLANMOD [BTS16]
• L’Aquila – MDEForge [RRP16]
• Datalyzer (ongoing)
[SGdL15] Carrascal, Sánchez-Cuadrado, de Lara: Building MDE Cloud Services with Distil.
CloudMDE@MODELS, CEUR Workshop Proceedings 1563: 19-24 (2015)
[BTS16] Benelallam, Tisi, Sánchez-Cuadrado, de Lara, Cabot: Efficient Model Partitioning for
Distributed Model Transformations. SLE 2016: 226-238
[RRP16] Di Rocco, Di Ruscio, Pierantonio, Sánchez-Cuadrado, de Lara, Guerra: Using ATL
Transformation Services in the MDEForge Collaborative Modeling Platform. ICMT 2016: 70-78
[Figure: the Datalyzer service — transformations are deployed to the cloud-based datalyzer and
exposed through APIs; end users retrieve the results.]
OB4: VERIFICATION AND
VALIDATION OF SYSTEMS
1. Analyse the resource consumption of distributed systems
• Cost attributed to different locations
• Level of parallelism, load balance, data transmissions…
• Bound non-cumulative resources (memory, connections, …)
• Parallel cost of a system (determined by the last object to finish)
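As a hedged illustration of these two cost notions (a toy model with invented names, not the project's static analysis, which infers such bounds from program source): given a hypothetical assignment of task costs to locations, the serial cost sums all work, while the parallel cost is fixed by the last location to finish.

```python
# Toy illustration of serial vs. parallel cost in a distributed system.
# Locations run independently; tasks at one location run serially.

def serial_cost(assignment):
    """Total work across all locations (the cost-centre view)."""
    return sum(sum(tasks) for tasks in assignment.values())

def parallel_cost(assignment):
    """Time at which the last location finishes, ignoring dependencies."""
    return max(sum(tasks) for tasks in assignment.values())

# Hypothetical task costs attributed to three locations.
assignment = {"loc1": [4, 2], "loc2": [5], "loc3": [1, 1, 1]}
print(serial_cost(assignment))    # 14
print(parallel_cost(assignment))  # 6
```

The gap between the two numbers (14 vs. 6) is what a parallel cost analysis makes visible: how much the system actually gains from distribution.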
Results:
• Resource analysis with cost centers [AACGGPG15]
• Performance indicators [ACPR15]
• Peak cost analysis [ACR15]
• Parallel cost analysis [ACJR15]
• Resource analysis in Software Product Lines [ZAV16]
[AACGGPG15] Object-Sensitive Cost Analysis for Concurrent Objects. STVR 25(3): 218-271 (2015)
[ACPR15] Quantified Abstract Configurations of Distributed Systems. FAoC 27(4): 665-699 (2015)
[ACR15] Non-Cumulative Resource Analysis. TACAS 2015, LNCS 9035: 85-100
[ACJR15] Parallel Cost Analysis of Distributed Systems. SAS 2015, LNCS 9291: 275-292
[ZAV16] Resource-Usage-Aware Configuration in Software Product Lines. JLAMP 85(1): 173-199 (2016)
OB5: MODEL-BASED SYSTEM
VALIDATION
Validation complements verification.
• The main validation technique is testing.
Initially, testing was mainly manual and informal
• Model-based Testing (MBT) is currently a very active research area.
MBT: given a specification, we assess, by applying test cases, whether
an implementation conforms to the specification.
We test not only what the implementation does but also how it performs
actions (time needed, probability of performing an action, resources
needed, etc.).
Merayo, Núñez: Passive testing of communicating systems with timeouts. Information &
Software Technology 64: 19-35 (2015)
Hierons, Merayo, Núñez: An extended framework for passive asynchronous testing. J. Log.
Algebr. Meth. Program. 86(1): 408-424 (2017)
Hierons, Núñez: Implementation relations and probabilistic schedulers in the distributed test
architecture. Journal of Systems and Software (in press, 2017)
[Figure: the MBT process — tests are generated from the specification and executed against the
implementation; the pass/fail verdict establishes whether the implementation conforms to the
specification.]
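A minimal sketch of this conformance loop, using a hypothetical coffee-machine specification given as a Mealy-style table (all names are illustrative; real MBT frameworks derive far richer test suites, covering timing and probabilities as noted above):

```python
# Specification as a table: (state, input) -> (next_state, expected output).
SPEC = {
    ("idle", "coin"): ("paid", "ok"),
    ("paid", "btn"):  ("idle", "coffee"),
}

def spec_run(inputs):
    """Outputs the specification prescribes for an input sequence."""
    state, outputs = "idle", []
    for i in inputs:
        state, out = SPEC[(state, i)]
        outputs.append(out)
    return outputs

def impl_run(inputs):
    """System under test: here a hand-coded (conforming) implementation."""
    state, outputs = 0, []
    for i in inputs:
        if state == 0 and i == "coin":
            state, out = 1, "ok"
        elif state == 1 and i == "btn":
            state, out = 0, "coffee"
        else:
            out = "error"
        outputs.append(out)
    return outputs

def conforms(test_suite):
    """pass/fail verdict: the implementation matches on every test."""
    return all(impl_run(t) == spec_run(t) for t in test_suite)

tests = [["coin", "btn"], ["coin", "btn", "coin", "btn"]]
print("pass" if conforms(tests) else "fail")  # pass
```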
OB6: VIRTUAL COLLABORATIVE
ENVIRONMENT
1. Infrastructure for the extensible combination of tools, and their
cloud-based execution
J. Domenech, S. Genaim, E. Broch Johnsen, R. Schlatte: EasyInterface: A Toolkit for Rapid
Development of GUIs for Research Prototype Tools. FASE 2017, LNCS 10202: 379-383.
Springer, 2017.
[Screenshot: the virtual collaborative environment.]
OB7: MODELLING SERVICE
OPERATIONS
[Figures: a series of tool screenshots on the modelling of service operations.]
OB8: CLOUD SYSTEMS: MODEL,
VERIFICATION AND VALIDATION
Apply our general frameworks to the cloud
Formally model the components of the cloud and combine
simulation and testing to assess the quality of the architecture
• Metamorphic testing: metamorphic relations express properties
that any correct system must fulfil
• Mutation testing: check whether our implementation performs
better than its mutants (mutants should be worse than the
developed system)
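As a hedged sketch of the metamorphic-testing idea on a toy "simulator" (the model and every name below are invented for illustration; they are not the project's tools): with a fixed workload, adding a server should never increase the simulated response time — a relation any correct simulator must preserve, even when no exact expected output is known.

```python
# Toy data-centre "simulator": work spread evenly over the servers.
def simulate_response_time(workload, servers):
    return workload / servers

def metamorphic_check(workload, servers):
    """Metamorphic relation: a follow-up run with one extra server
    must not report a higher response time than the source run."""
    source = simulate_response_time(workload, servers)
    follow_up = simulate_response_time(workload, servers + 1)
    return follow_up <= source

# Exercise the relation over several configurations.
print(all(metamorphic_check(100.0, s) for s in range(1, 10)))  # True
```

The point of the relation is that it acts as an oracle: a buggy simulator (or a mutant) that violates it is detected without ever computing the "true" response time.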
Núñez, Hierons: A Methodology for Validating Cloud Models Using Metamorphic
Testing. Annales des Télécommunications 70(3-4): 127-135 (2015)
Cañizares, Núñez, de Lara: MAGICIAN: Model-Based Design for Optimizing the
Configuration of Data-Centers. SEKE 2017: 602-607
THE ROAD AHEAD
Integration!
• Cloud-based
• Eclipse-based
Complete the work in the different WPs
• Add V&V support to the service operation tools
• …
Co-direction of PhD theses
Technology transfer to associated companies
• Field studies
Continuation of the project in the next call
THANKS!
Juan.deLara@uam.es
@miso_uam
http://sicomoro-cm.es/
