Semantic Model Differencing
Utilizing Behavioral Semantics Specifications
www.modelexecution.org

Philip Langer, Tanja Mayerhofer, Gerti Kappel
Business Informatics Group
Institute of Software Technology and Interactive Systems
Vienna University of Technology
Favoritenstraße 9-11/188-3, 1040 Vienna, Austria
phone: +43 (1) 58801-18804 (secretary), fax: +43 (1) 58801-18896
office@big.tuwien.ac.at, www.big.tuwien.ac.at
Motivation

Models are subject to change → Change management for models is required

Model differencing
- Goal: Identify differences among models
- Applications: Merging, versioning, conflict detection, incremental testing, etc.

Syntactic model differencing
- Procedure: Matching → Differencing
- Output: Syntactic differences (add, delete, update operations)

Semantic model differencing
- Takes the semantics of models into account
- Enables additional analyses of changes (e.g., semantics preservation)
- Provides a basis for comprehending the evolution of models
Motivating Example: Syntactic Model Differencing

[M1: Petri net p1 → t1 → p2 → t2 → p4, where t1 additionally outputs to p3 and t2 additionally consumes from p3]
[M2: Petri net p1 → t1 → p2 → t2 → p3 → t3 → p4]

1. Matching
- Identify corresponding elements based on
  - Identifiers
  - Signatures
  - Similarity
  - Custom matching (names)

2. Differencing
- Compare corresponding elements
- Identify non-corresponding elements

Syntactic differences:
- p3 removed as output (of t1)
- p3 removed as input (of t2)
- p4 removed as output (of t2)
- p3 added as output (of t2)
- t3 added

Which impact do the changes have on the model (the model's semantics)?
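The two phases above can be illustrated with a small sketch. The following Python code is ours, for exposition only (the dictionary-based model representation and the operation labels are assumptions, not the paper's implementation): it matches elements of M1 and M2 by name, as the custom matching criterion on the slide does, and then derives add/delete operations and updates to the input/output references of corresponding transitions.

```python
def match(left, right):
    """Phase 1 -- matching: pair up equally named elements of both models."""
    return {name: (left[name], right[name]) for name in left.keys() & right.keys()}

def diff(left, right):
    """Phase 2 -- differencing: report added/deleted elements and changes to
    the input/output references of corresponding transitions."""
    ops = [("delete", n) for n in left.keys() - right.keys()]
    ops += [("add", n) for n in right.keys() - left.keys()]
    for name, (l, r) in match(left, right).items():
        for feature in ("input", "output"):
            for p in l.get(feature, set()) - r.get(feature, set()):
                ops.append(("remove " + feature, name, p))
            for p in r.get(feature, set()) - l.get(feature, set()):
                ops.append(("add " + feature, name, p))
    return ops

# The example models as element dictionaries; transitions carry the names of
# their input and output places.
M1 = {"p1": {}, "p2": {}, "p3": {}, "p4": {},
      "t1": {"input": {"p1"}, "output": {"p2", "p3"}},
      "t2": {"input": {"p2", "p3"}, "output": {"p4"}}}
M2 = {"p1": {}, "p2": {}, "p3": {}, "p4": {},
      "t1": {"input": {"p1"}, "output": {"p2"}},
      "t2": {"input": {"p2"}, "output": {"p3"}},
      "t3": {"input": {"p3"}, "output": {"p4"}}}
```

Running `diff(M1, M2)` reproduces the syntactic differences listed on the slide: t3 added, p3 removed as output of t1 and as input of t2, p4 removed and p3 added as output of t2.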
Introduction to Semantic Model Differencing

Semantics of a modeling language¹
- Mapping: Language → Semantic domain
- Assigns meaning to each conformant model
- Provides (semantic) interpretations of models

Semantic model differencing
- Procedure:
  - Obtain semantic interpretations of models
  - Analyze semantic interpretations
- Output: Semantic differences
  - Semantic interpretations valid for one model but not for the other one
  - Witness semantic differences among models (diff witnesses²)

¹ D. Harel, B. Rumpe. Meaningful Modeling: What's the Semantics of "Semantics"? IEEE Computer, 37(10):64–72, 2004.
² S. Maoz et al. A Manifesto for Semantic Model Differencing. MODELS'10, volume 6627 of LNCS, pages 194–203. Springer, 2011.
Motivating Example: Semantic Model Differencing

Petri nets

Semantics
- Firing of transitions leading to markings (token distributions)

Semantic interpretation
- Traces (order of transition firings)
- Reachable markings

Semantic difference
- Traces only valid in one model
- Markings only reachable in one model
- Final markings only reached by one model
Motivating Example: Semantic Model Differencing

Petri nets

Semantic interpretation: reachable markings from the initial marking M0 (p1=1)

M1:
- M0,M1: p1=1, p2=0, p3=0, p4=0
- M1,M1: p1=0, p2=1, p3=1, p4=0
- M2,M1: p1=0, p2=0, p3=0, p4=1

M2:
- M0,M2: p1=1, p2=0, p3=0, p4=0
- M1,M2: p1=0, p2=1, p3=0, p4=0
- M2,M2: p1=0, p2=0, p3=1, p4=0
- M3,M2: p1=0, p2=0, p3=0, p4=1
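The reachable markings listed above can be computed mechanically. The following Python sketch is ours, for exposition (the encoding of nets as transition lists is an assumption, not part of the approach): it exhaustively explores the markings reachable from an initial marking.

```python
def fire(net, marking, t):
    """Fire transition t in the given marking; return the successor marking
    as a frozenset of (place, tokens) pairs for non-empty places."""
    ins, outs = net[t]
    m = dict(marking)
    for p, n in ins.items():
        m[p] = m.get(p, 0) - n
    for p, n in outs.items():
        m[p] = m.get(p, 0) + n
    return frozenset((p, n) for p, n in m.items() if n > 0)

def reachable(net, initial):
    """Exhaustively explore all markings reachable by firing transitions."""
    start = frozenset((p, n) for p, n in initial.items() if n > 0)
    seen, todo = {start}, [start]
    while todo:
        marking = dict(todo.pop())
        for t in range(len(net)):
            if all(marking.get(p, 0) >= n for p, n in net[t][0].items()):
                nxt = fire(net, marking, t)
                if nxt not in seen:
                    seen.add(nxt)
                    todo.append(nxt)
    return seen

# The example nets as lists of (input places, output places) per transition.
M1 = [({"p1": 1}, {"p2": 1, "p3": 1}), ({"p2": 1, "p3": 1}, {"p4": 1})]
M2 = [({"p1": 1}, {"p2": 1}), ({"p2": 1}, {"p3": 1}), ({"p3": 1}, {"p4": 1})]
```

For the initial marking p1=1 this yields the three markings of M1 and the four markings of M2 shown on the slide; the marking p2=1 is reachable only in M2, while the final marking p4=1 is reached by both.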
Motivating Example: Semantic Model Differencing

Petri nets

Semantic difference (final marking): For the initial marking p1=1, both models reach the same final marking, i.e., they are semantically equivalent.

M1:
- M0,M1: p1=1, p2=0, p3=0, p4=0
- M1,M1: p1=0, p2=1, p3=1, p4=0
- M2,M1: p1=0, p2=0, p3=0, p4=1

M2:
- M0,M2: p1=1, p2=0, p3=0, p4=0
- M1,M2: p1=0, p2=1, p3=0, p4=0
- M2,M2: p1=0, p2=0, p3=1, p4=0
- M3,M2: p1=0, p2=0, p3=0, p4=1
Motivating Example: Semantic Model Differencing

Petri nets

Semantic difference (final marking): For the initial marking p2=1, the models reach different final markings, i.e., this interpretation is a diff witness.

M1:
- M0,M1: p1=0, p2=1, p3=0, p4=0 (no transition enabled; this is the final marking)

M2:
- M0,M2: p1=0, p2=1, p3=0, p4=0
- M1,M2: p1=0, p2=0, p3=1, p4=0
- M2,M2: p1=0, p2=0, p3=0, p4=1
Generic Semantic Model Differencing: Motivation

Language-specific approach (Maoz et al.¹ and Fahrenberg et al.²)
- Procedure:
  - Translate models into the semantic domain
  - Perform semantic differencing in the semantic domain
  - Translate the result back into the modeling language
- Challenge: Complex translations and differencing algorithms

Generic approach
- Idea: Utilize behavioral semantics specifications for semantic differencing
- Goal:
  - Perform semantic differencing directly in the modeling language
  - Use existing behavioral semantics specifications for semantic differencing
  - Enable the application of custom semantic equivalence criteria

¹ S. Maoz et al. A Manifesto for Semantic Model Differencing. MODELS'10, volume 6627 of LNCS, pages 194–203. Springer, 2011.
² U. Fahrenberg et al. Vision Paper: Make a Difference! (Semantically). MODELS'11, volume 6981 of LNCS, pages 490–500. Springer, 2011.
Generic Semantic Model Differencing: Introduction

Generic framework for semantic model differencing
- Idea: Utilize behavioral semantics specifications for semantic differencing
- Procedure:
  - Execute models to obtain execution traces (semantic interpretations)
  - Identify execution traces valid for only one model by comparison (diff witnesses)
  - Customize the execution trace comparison to the modeling language and a suitable semantic equivalence criterion
- Benefits:
  - Implementing translations and algorithms specifically for semantic model differencing is avoided
  - Only the comparison of execution traces is specific to the modeling language and semantic equivalence criterion
Generic Semantic Model Differencing: Overview

[Framework overview diagram: Models M1 and M2, together with syntactic match rules, are fed into Syntactic Matching, producing syntactic correspondences (C^syn_M1,M2). The models, together with inputs I_M1 and I_M2, are fed into Model Execution, producing execution traces T_M1 and T_M2. The traces, together with semantic match rules, are fed into Semantic Matching, producing semantic correspondences (C^sem_M1,M2). Legend: C … Correspondence, I … Input, M … Model, T … Trace]
Generic Semantic Model Differencing: Overview

1. Syntactic matching: Identify syntactic correspondences
- Syntactic match rules define a custom language-specific matching algorithm
- Implemented with the Epsilon Comparison Language (ECL) of the Epsilon Platform¹

¹ D. Kolovos, L. Rose, A. García-Domínguez, R. Paige. The Epsilon Book. March 2014. http://www.eclipse.org/epsilon/doc/book.
Generic Semantic Model Differencing: Overview

2. Model execution: Obtain execution traces
- Execute models based on the behavioral semantics specification
- Obtain execution traces adhering to a generic execution trace format
Generic Execution Trace Format

[Trace metamodel: A Trace contains a sequence of States (the runtime states of the model during execution) and Transitions (the transitions between runtime states). Each State contains Objects (runtime representations of model elements). Each Transition has a source and a target State and refers to the Event (qualifiedName : EString) that caused the state transition.]

- The format serves as the interface to our semantic model differencing framework
- Execution traces adhering to this format are the input for identifying semantic differences among models
- The framework is generic with respect to the semantics specification approach and the model execution environment
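As an illustration, the trace format described above can be rendered as plain data classes. This Python sketch is ours, for exposition only; the original format is defined as a metamodel, and the class and feature names below follow the diagram, while the `final_state` helper is an assumption we add for convenience.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Object:
    """Runtime representation of a model element (e.g., a PlaceConfiguration)."""
    data: dict

@dataclass
class State:
    """A runtime state of the model during execution."""
    objects: List[Object] = field(default_factory=list)

@dataclass
class Event:
    """The occurrence that caused a state transition (e.g., firing t1)."""
    qualifiedName: str

@dataclass
class Transition:
    """A step between two runtime states, caused by an event."""
    source: State
    target: State
    event: Event

@dataclass
class Trace:
    """Sequence of runtime states plus the transitions between them."""
    states: List[State] = field(default_factory=list)
    transitions: List[Transition] = field(default_factory=list)

    def final_state(self) -> Optional[State]:
        """Hypothetical helper: the last state captures the final marking."""
        return self.states[-1] if self.states else None
```

A semantic differencing operator then only needs to inspect instances of these classes, independently of how the traces were produced.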
Generic Semantic Model Differencing: Overview

3. Semantic matching: Identify semantic correspondences
- Semantic match rules define whether two models are semantically equivalent
- Non-matching traces constitute diff witnesses
- Implemented with the Epsilon Comparison Language (ECL) of the Epsilon Platform
Generic Semantic Model Differencing: Overview

The complete procedure, driven by the behavioral semantics specification:
1. Syntactic matching: Identify syntactic correspondences
2. Model execution: Obtain execution traces
3. Semantic matching: Identify semantic correspondences
Example Syntactic Match Rules (ECL)

rule MatchPlace
    match left : Place with right : Place {
    compare : left.name = right.name
}

rule MatchTransition
    match left : Transition with right : Transition {
    compare : left.name = right.name
}

[Petri net metamodel: A Net contains Places (name : EString) and Transitions (name : EString); each Transition refers to its input and output Places.]

[Applied to the models M1 and M2, MatchPlace and MatchTransition establish correspondences between the equally named places and transitions.]
Example Behavioral Semantics (xMOF)

Behavioral semantics specification with xMOF¹
- Integrates the fUML action language with metamodeling languages (Ecore)
- Behavioral semantics is defined with UML activities (operational semantics)
- Model execution is performed by the fUML virtual machine

[Configuration classes extending the Petri net metamodel:
- NetConfiguration (extends Net): main(Token[*]), run()
- TransitionConfiguration (extends Transition): fire(), isEnabled() : EBoolean
- PlaceConfiguration (extends Place): addToken(), removeToken(); holds heldTokens : Token[*] as runtime concept
The operations constitute the computational steps for executing a model.]

[Activity PlaceConfiguration::addToken(): a ReadSelf action reads the place configuration, a CreateObject action creates a Token, and an AddStructuralFeatureValue action adds the created token to the heldTokens feature.]

¹ T. Mayerhofer, P. Langer, M. Wimmer, and G. Kappel. xMOF: Executable DSMLs Based on fUML. In Proc. of SLE'13, volume 8225 of LNCS, pages 56–75. Springer, 2013.
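To convey the intent of the configuration classes, here is an illustrative Python analogue (ours, for exposition; the real semantics is specified as UML activities and executed by the fUML virtual machine, and the simple run-to-quiescence loop below is an assumption about NetConfiguration::run()):

```python
class PlaceConfiguration:
    """Runtime counterpart of a Place: tracks the tokens it currently holds."""
    def __init__(self, name):
        self.name = name
        self.heldTokens = []

    def addToken(self):
        self.heldTokens.append(object())   # create a Token, add it to heldTokens

    def removeToken(self):
        self.heldTokens.pop()

class TransitionConfiguration:
    """Runtime counterpart of a Transition: can test enabledness and fire."""
    def __init__(self, inputs, outputs):
        self.inputs, self.outputs = inputs, outputs

    def isEnabled(self):
        return all(len(p.heldTokens) > 0 for p in self.inputs)

    def fire(self):
        for p in self.inputs:
            p.removeToken()
        for p in self.outputs:
            p.addToken()

class NetConfiguration:
    """Runtime counterpart of a Net: fires enabled transitions until quiescence."""
    def __init__(self, transitions):
        self.transitions = transitions

    def run(self):
        while True:
            enabled = next((t for t in self.transitions if t.isEnabled()), None)
            if enabled is None:
                break
            enabled.fire()
```

Running the example net M1 with one token on p1 moves that token through p2/p3 to p4, mirroring the trace shown on the following slides.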
Example Execution Traces

[Execution trace TM1,1 of M1 for the initial marking M0 (one token on p1):
t1 : Trace with states
- s1 : State, where PlaceConfiguration "p1" holds one Token
- s2 : State, where PlaceConfigurations "p2" and "p3" each hold one Token
- s3 : State, where PlaceConfiguration "p4" holds one Token]

[Execution trace TM2,1 of M2 for the same initial marking M0:
t2 : Trace with states
- s1 : State, where PlaceConfiguration "p1" holds one Token
- s2 : State, where PlaceConfiguration "p2" holds one Token
- s3 : State, where PlaceConfiguration "p3" holds one Token
- s4 : State, where PlaceConfiguration "p4" holds one Token]
Example Semantic Match Rules (ECL)

Final marking equivalence: For the same initial markings (input), the same final markings are reached.

rule MatchTrace
    match left : Trace with right : Trace {
    compare {
        var finalStateLeft : State = left.getFinalState();    // final state of left net
        var finalStateRight : State = right.getFinalState();  // final state of right net
        return finalStateLeft.matches(finalStateRight) and    // final states match
               finalStateRight.matches(finalStateLeft);
    }
}

rule MatchState
    match left : State with right : State {
    compare {
        var placeConfsLeft : Set = left.getPlaceConfigurations();    // final states of left places
        var placeConfsRight : Set = right.getPlaceConfigurations();  // final states of right places
        return placeConfsLeft.matches(placeConfsRight);              // final states of places match
    }
}

rule MatchPlaceConfiguration
    match left : PlaceConfiguration with right : PlaceConfiguration extends MatchPlace {
    compare : left.heldTokens.size() = right.heldTokens.size()  // places hold the same number of tokens
}

(getFinalState() returns the final state capturing the final marking; getPlaceConfigurations() returns the instances of PlaceConfiguration.)
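For readers unfamiliar with ECL, the logic of the three rules can be paraphrased in Python. This sketch is ours, for exposition; it assumes a simplified trace representation (a trace as a list of states, a state as a list of place-configuration dictionaries) rather than the generic trace format used by the framework.

```python
def match_place_configuration(left, right):
    """Analogue of MatchPlaceConfiguration (extending MatchPlace): equally
    named places must hold the same number of tokens."""
    return left["name"] == right["name"] and left["tokens"] == right["tokens"]

def match_state(left, right):
    """Analogue of MatchState: every place configuration of one state must
    have a matching counterpart in the other."""
    return (len(left) == len(right) and
            all(any(match_place_configuration(l, r) for r in right) for l in left))

def match_trace(left_states, right_states):
    """Analogue of MatchTrace: compare the final states of both traces; a
    mismatch makes the trace pair a diff witness."""
    return match_state(left_states[-1], right_states[-1])
```

For the traces of M1 and M2 with one token on p1 the final states match (both end with one token on p4); for the initial marking p2=1 they do not, so that pair of traces is a diff witness.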
Example Semantic Match Rules Application

[The rules are applied to the traces TM1,1 and TM2,1 obtained for the initial marking M0 (one token on p1):
- MatchTrace compares the traces t1 and t2 via their final states (s3 of TM1,1 and s4 of TM2,1)
- MatchState compares the place configurations of these final states
- MatchPlaceConfiguration matches the equally named PlaceConfigurations, each holding one Token on "p4"
All rules match, so for this input the traces, and thus the models, are semantically equivalent.]
Example Semantic Match Rules Application

Inputs I (initial markings) and resulting final markings MF:

M1:
- TM1,1: I = {}                  MF = {}
- TM1,2: I = {p1=1}              MF = {p4=1}
- TM1,3: I = {p2=1}              MF = {p2=1}        diff witness
- TM1,4: I = {p3=1}              MF = {p3=1}        diff witness
- TM1,5: I = {p1=1,p2=1}         MF = {p2=1,p4=1}   diff witness
- TM1,6: I = {p1=1,p3=1}         MF = {p3=1,p4=1}   diff witness
- TM1,7: I = {p2=1,p3=1}         MF = {p4=1}        diff witness
- TM1,8: I = {p1=1,p2=1,p3=1}    MF = {p4=2}        diff witness

M2:
- TM2,1: I = {}                  MF = {}
- TM2,2: I = {p1=1}              MF = {p4=1}
- TM2,3: I = {p2=1}              MF = {p4=1}        diff witness
- TM2,4: I = {p3=1}              MF = {p4=1}        diff witness
- TM2,5: I = {p1=1,p2=1}         MF = {p4=2}        diff witness
- TM2,6: I = {p1=1,p3=1}         MF = {p4=2}        diff witness
- TM2,7: I = {p2=1,p3=1}         MF = {p4=2}        diff witness
- TM2,8: I = {p1=1,p2=1,p3=1}    MF = {p4=3}        diff witness
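The whole table can be reproduced with a short simulation. The following Python sketch is ours, for exposition (the net encoding is an assumption, not the framework's trace format): since both example nets are conflict-free, each input determines a unique final marking, so a simple run-to-quiescence loop suffices, and enumerating all eight inputs flags exactly the diff witnesses listed above.

```python
from itertools import combinations

# Each net: list of (input places, output places) per transition;
# markings map place names to token counts.
M1 = [({"p1": 1}, {"p2": 1, "p3": 1}),   # t1
      ({"p2": 1, "p3": 1}, {"p4": 1})]   # t2
M2 = [({"p1": 1}, {"p2": 1}),            # t1
      ({"p2": 1}, {"p3": 1}),            # t2
      ({"p3": 1}, {"p4": 1})]            # t3

def final_marking(net, initial):
    """Fire enabled transitions until quiescence; both nets are conflict-free,
    so the resulting final marking is unique. Returns the non-empty places."""
    m = dict(initial)
    while True:
        t = next((t for t in net
                  if all(m.get(p, 0) >= n for p, n in t[0].items())), None)
        if t is None:
            return {p: n for p, n in m.items() if n > 0}
        for p, n in t[0].items():
            m[p] -= n
        for p, n in t[1].items():
            m[p] = m.get(p, 0) + n

# Enumerate the eight inputs (all subsets of one token on p1, p2, p3) and
# collect the diff witnesses, i.e. inputs for which the final markings differ.
witnesses = []
for k in range(4):
    for places in combinations(("p1", "p2", "p3"), k):
        initial = {p: 1 for p in places}
        if final_marking(M1, initial) != final_marking(M2, initial):
            witnesses.append(initial)
```

Only the empty input and the input p1=1 produce matching final markings; the remaining six inputs are diff witnesses, exactly as in the table.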
Evaluation

Case studies comparing our approach with CDDiff¹ and ADDiff² by Maoz et al.
- Behavioral semantics of both languages
- Semantic match rules for both languages
- Computation of diff witnesses for example models provided by Maoz et al.³

Expressive power
- Sufficient for defining non-trivial semantic model differencing operators
- Developing operators is a language engineering task
- Runtime states build the basis for semantic model differencing

Performance
- Model execution is the most expensive step, taking 95% of the overall execution time
- High performance of the execution environment and a reasonable number of model executions (inputs) are important

¹ S. Maoz et al. CDDiff: Semantic Differencing for Class Diagrams. ECOOP'11, volume 6813 of LNCS, pages 230–254. Springer, 2011.
² S. Maoz et al. ADDiff: Semantic Differencing for Activity Diagrams. ESEC/FSE'11, pages 179–189. ACM, 2011.
³ http://www.se-rwth.de/materials/semdiff
Summary

[Framework overview diagram as before: syntactic matching, model execution, and semantic matching produce syntactic correspondences, execution traces, and semantic correspondences. Legend: C … Correspondence, I … Input, M … Model, T … Trace]

Characteristics
- Generic w.r.t. modeling language (behavioral semantics)
- Generic w.r.t. semantics specification approach (generic trace format)
- Configurable w.r.t. semantic equivalence criterion (semantic match rules)
Outlook

Generation of inputs relevant to semantic differencing
- Relevant inputs are inputs that cause distinct execution traces
- Problem: Inputs currently have to be defined manually
- Symbolic execution¹

Calculation of semantic differences avoiding model execution
- Problem: Model execution for obtaining concrete execution traces is expensive
- Path conditions obtained from symbolic execution already capture differences among execution traces
- Only syntactic differences can lead to semantic differences
- Directed and differential symbolic execution²,³

¹ L. Clarke. A Program Testing System. In Proc. of ACM '76, pages 488–491. ACM, 1976.
² K.-K. Ma et al. Directed Symbolic Execution. In Proc. of SAS'11, volume 6887 of LNCS, pages 95–111. Springer, 2011.
³ S. Person et al. Differential Symbolic Execution. In Proc. of FSE'08, pages 226–237. ACM, 2008.
Thank you! 
Model Execution Based on fUML 
www.modelexecution.org
European Sustainable Phosphorus Platform
 
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
Abdul Wali Khan University Mardan,kP,Pakistan
 
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...
Travis Hills MN
 
ESR spectroscopy in liquid food and beverages.pptx
ESR spectroscopy in liquid food and beverages.pptxESR spectroscopy in liquid food and beverages.pptx
ESR spectroscopy in liquid food and beverages.pptx
PRIYANKA PATEL
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
terusbelajar5
 
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdfwaterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
LengamoLAppostilic
 
Shallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptxShallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptx
Gokturk Mehmet Dilci
 
在线办理(salfor毕业证书)索尔福德大学毕业证毕业完成信一模一样
在线办理(salfor毕业证书)索尔福德大学毕业证毕业完成信一模一样在线办理(salfor毕业证书)索尔福德大学毕业证毕业完成信一模一样
在线办理(salfor毕业证书)索尔福德大学毕业证毕业完成信一模一样
vluwdy49
 
Deep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless ReproducibilityDeep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless Reproducibility
University of Rennes, INSA Rennes, Inria/IRISA, CNRS
 
Randomised Optimisation Algorithms in DAPHNE
Randomised Optimisation Algorithms in DAPHNERandomised Optimisation Algorithms in DAPHNE
Randomised Optimisation Algorithms in DAPHNE
University of Maribor
 
Eukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptxEukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptx
RitabrataSarkar3
 
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
Sérgio Sacani
 
8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf
by6843629
 
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
yqqaatn0
 
Oedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptxOedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptx
muralinath2
 
Equivariant neural networks and representation theory
Equivariant neural networks and representation theoryEquivariant neural networks and representation theory
Equivariant neural networks and representation theory
Daniel Tubbenhauer
 
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptxThe use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
MAGOTI ERNEST
 
Micronuclei test.M.sc.zoology.fisheries.
Micronuclei test.M.sc.zoology.fisheries.Micronuclei test.M.sc.zoology.fisheries.
Micronuclei test.M.sc.zoology.fisheries.
Aditi Bajpai
 
The binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defectsThe binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defects
Sérgio Sacani
 

Recently uploaded (20)

Applied Science: Thermodynamics, Laws & Methodology.pdf
Applied Science: Thermodynamics, Laws & Methodology.pdfApplied Science: Thermodynamics, Laws & Methodology.pdf
Applied Science: Thermodynamics, Laws & Methodology.pdf
 
Thornton ESPP slides UK WW Network 4_6_24.pdf
Thornton ESPP slides UK WW Network 4_6_24.pdfThornton ESPP slides UK WW Network 4_6_24.pdf
Thornton ESPP slides UK WW Network 4_6_24.pdf
 
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...THEMATIC  APPERCEPTION  TEST(TAT) cognitive abilities, creativity, and critic...
THEMATIC APPERCEPTION TEST(TAT) cognitive abilities, creativity, and critic...
 
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...
 
ESR spectroscopy in liquid food and beverages.pptx
ESR spectroscopy in liquid food and beverages.pptxESR spectroscopy in liquid food and beverages.pptx
ESR spectroscopy in liquid food and beverages.pptx
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
 
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdfwaterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
 
Shallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptxShallowest Oil Discovery of Turkiye.pptx
Shallowest Oil Discovery of Turkiye.pptx
 
在线办理(salfor毕业证书)索尔福德大学毕业证毕业完成信一模一样
在线办理(salfor毕业证书)索尔福德大学毕业证毕业完成信一模一样在线办理(salfor毕业证书)索尔福德大学毕业证毕业完成信一模一样
在线办理(salfor毕业证书)索尔福德大学毕业证毕业完成信一模一样
 
Deep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless ReproducibilityDeep Software Variability and Frictionless Reproducibility
Deep Software Variability and Frictionless Reproducibility
 
Randomised Optimisation Algorithms in DAPHNE
Randomised Optimisation Algorithms in DAPHNERandomised Optimisation Algorithms in DAPHNE
Randomised Optimisation Algorithms in DAPHNE
 
Eukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptxEukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptx
 
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
 
8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf
 
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
原版制作(carleton毕业证书)卡尔顿大学毕业证硕士文凭原版一模一样
 
Oedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptxOedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptx
 
Equivariant neural networks and representation theory
Equivariant neural networks and representation theoryEquivariant neural networks and representation theory
Equivariant neural networks and representation theory
 
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptxThe use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
 
Micronuclei test.M.sc.zoology.fisheries.
Micronuclei test.M.sc.zoology.fisheries.Micronuclei test.M.sc.zoology.fisheries.
Micronuclei test.M.sc.zoology.fisheries.
 
The binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defectsThe binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defects
 

Semantic Model Differencing Utilizing Behavioral Semantics Specifications (Talk at MODELS 2014)

  • 1. Semantic Model Differencing Utilizing Behavioral Semantics Specifications www.modelexecution.org Philip Langer, Tanja Mayerhofer, Gerti Kappel Business Informatics Group, Institute of Software Technology and Interactive Systems, Vienna University of Technology, Favoritenstraße 9‐11/1883, 1040 Vienna, Austria phone: 43 (1) 5880118804 (secretary), fax: 43 (1) 5880118896 office@big.tuwien.ac.at, www.big.tuwien.ac.at
  • 2. Motivation Models are subject to change  Change management for models is required Model differencing  Goal: Identify differences among models  Applications: Merging, versioning, conflict detection, incremental testing, etc. Syntactic model differencing  Procedure: Matching ‐ Differencing  Output: Syntactic differences (add, delete, update operations) Semantic model differencing  Takes semantics of models into account  Enables additional analyses of changes (e.g., semantic preservation)  Provides basis for comprehending the evolution of models 2
  • 3. Motivating Example: Syntactic Model Differencing [Petri net diagrams: M1 with places p1–p4 and transitions t1, t2; M2 with places p1–p4 and transitions t1, t2, t3]
  • 4. Motivating Example: Syntactic Model Differencing. 1. Matching:  Identify corresponding elements based on  Identifiers  Signatures  Similarity  Custom matching [Petri net diagrams of M1 and M2]
  • 5. Motivating Example: Syntactic Model Differencing. 1. Matching:  Identify corresponding elements based on  Identifiers  Signatures  Similarity  Custom matching (names) [Petri net diagrams of M1 and M2]
  • 6. Motivating Example: Syntactic Model Differencing. 1. Matching:  Identify corresponding elements based on  Identifiers  Signatures  Similarity  Custom matching (names) 2. Differencing:  Compare corresponding elements  Identify non‐corresponding elements [Petri net diagrams of M1 and M2]
  • 7. Motivating Example: Syntactic Model Differencing. 1. Matching (as on the previous slide). 2. Differencing:  Compare corresponding elements  Identify non‐corresponding elements. Identified differences: p3 removed as output, p3 removed as input, p3 added as output, p4 removed as output, t3 added [annotated Petri net diagrams of M1 and M2]
  • 8. Motivating Example: Syntactic Model Differencing. What impact do the changes have on the model('s semantics)? Identified differences: p3 removed as output, p3 removed as input, p3 added as output, p4 removed as output, t3 added [annotated Petri net diagrams of M1 and M2]
  • 9. Introduction to Semantic Model Differencing Semantics of a modeling language1 Mapping : Language → Semantic domain  Assigns meaning to each conformant model  Provides (semantic) interpretations of models Semantic model differencing  Procedure:  Obtain semantic interpretations of models  Analyze semantic interpretations  Output: Semantic differences  Semantic interpretations valid for one model but not for the other one  Witness semantic differences among models (diff witnesses2) 9 1 D. Harel, B. Rumpe. Meaningful Modeling: What’s the Semantics of “Semantics”? IEEE Computer, 37(10):64–72, 2004. 2 S. Maoz et al. A Manifesto for Semantic Model Differencing. MODELS’10, volume 6627 of LNCS, pages 194–203. Springer, 2011.
  • 10. Motivating Example: Semantic Model Differencing. Petri nets Semantics:  Firing of transitions leading to markings (token distributions) Semantic interpretation:  Traces (order of transition firings)  Reachable markings Semantic difference:  Traces only valid in one model  Markings only reachable in one model  Final markings only reached by one model [Petri net diagrams of M1 and M2]
  • 11. Motivating Example: Semantic Model Differencing. Petri nets Semantics:  Firing of transitions leading to markings (token distributions) Semantic interpretation:  Traces (order of transition firings)  Reachable markings Semantic difference:  Traces only valid in one model  Markings only reachable in one model  Final markings only reached by one model [Petri net diagrams of M1 and M2]
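The reachable-marking interpretation of slides 10–11 can be sketched in plain Python. This is an illustrative sketch, not the paper's implementation: the net encoding (a transition as an (inputs, outputs) pair of place-name lists) and all names below are assumptions, and the structure of M1 is read off the diagrams (t1 consumes p1 and produces p2 and p3; t2 consumes p2 and p3 and produces p4).

```python
from collections import deque

def reachable_markings(places, transitions, initial):
    """All markings reachable from `initial` by firing transitions.

    `transitions` maps a transition name to (inputs, outputs), each a
    list of place names; a marking is a tuple of token counts per place.
    Terminates for bounded nets such as the 1-safe examples here.
    """
    index = {p: i for i, p in enumerate(places)}
    start = tuple(initial.get(p, 0) for p in places)
    seen, queue = {start}, deque([start])
    while queue:
        marking = queue.popleft()
        for inputs, outputs in transitions.values():
            # A transition is enabled if every input place holds a token.
            if all(marking[index[p]] >= 1 for p in inputs):
                nxt = list(marking)
                for p in inputs:
                    nxt[index[p]] -= 1
                for p in outputs:
                    nxt[index[p]] += 1
                nxt = tuple(nxt)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

# Hypothetical encoding of M1 from the slides.
PLACES = ["p1", "p2", "p3", "p4"]
M1 = {"t1": (["p1"], ["p2", "p3"]), "t2": (["p2", "p3"], ["p4"])}
print(sorted(reachable_markings(PLACES, M1, {"p1": 1})))
```

For the initial marking p1=1 this enumerates exactly the three markings shown on slide 14.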
  • 12. Motivating Example: Semantic Model Differencing. Petri nets Semantic interpretation of M1 (starting from initial marking M0): M0,M1: p1=1, p2=0, p3=0, p4=0 [Petri net diagrams of M1 and M2]
  • 13. Motivating Example: Semantic Model Differencing. Petri nets Semantic interpretation of M1: M0,M1: p1=1, p2=0, p3=0, p4=0; M1,M1: p1=0, p2=1, p3=1, p4=0 [Petri net diagrams of M1 and M2]
  • 14. Motivating Example: Semantic Model Differencing. Petri nets Semantic interpretation of M1: M0,M1: p1=1, p2=0, p3=0, p4=0; M1,M1: p1=0, p2=1, p3=1, p4=0; M2,M1: p1=0, p2=0, p3=0, p4=1 [Petri net diagrams of M1 and M2]
  • 15. Motivating Example: Semantic Model Differencing. Petri nets Semantic interpretation. M1: M0,M1: p1=1, p2=0, p3=0, p4=0; M1,M1: p1=0, p2=1, p3=1, p4=0; M2,M1: p1=0, p2=0, p3=0, p4=1. M2: M0,M2: p1=1, p2=0, p3=0, p4=0; M1,M2: p1=0, p2=1, p3=0, p4=0; M2,M2: p1=0, p2=0, p3=1, p4=0; M3,M2: p1=0, p2=0, p3=0, p4=1 [Petri net diagrams of M1 and M2]
  • 16. Motivating Example: Semantic Model Differencing. Petri nets Semantic difference (final marking). M1: M0,M1: p1=1, p2=0, p3=0, p4=0; M1,M1: p1=0, p2=1, p3=1, p4=0; M2,M1: p1=0, p2=0, p3=0, p4=1. M2: M0,M2: p1=1, p2=0, p3=0, p4=0; M1,M2: p1=0, p2=1, p3=0, p4=0; M2,M2: p1=0, p2=0, p3=1, p4=0; M3,M2: p1=0, p2=0, p3=0, p4=1. Both reach the same final marking: semantically equivalent [Petri net diagrams of M1 and M2]
  • 17. Motivating Example: Semantic Model Differencing. Petri nets Semantic difference (final marking). M1: M0,M1: p1=0, p2=1, p3=0, p4=0 (no transition enabled). M2: M0,M2: p1=0, p2=1, p3=0, p4=0; M1,M2: p1=0, p2=0, p3=1, p4=0; M2,M2: p1=0, p2=0, p3=0, p4=1. Different final markings: diff witness [Petri net diagrams of M1 and M2]
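The diff witness on slide 17 can be reproduced with a short sketch. Everything here is a hedged illustration, not the paper's tooling: the net encodings for M1 and M2 are inferred from the diagrams (M2 chains t1: p1→p2, t2: p2→p3, t3: p3→p4), and `final_markings`/`is_diff_witness` are hypothetical helper names.

```python
from collections import deque

PLACES = ["p1", "p2", "p3", "p4"]
# Hypothetical encodings inferred from the slide diagrams.
M1 = {"t1": (["p1"], ["p2", "p3"]), "t2": (["p2", "p3"], ["p4"])}
M2 = {"t1": (["p1"], ["p2"]), "t2": (["p2"], ["p3"]), "t3": (["p3"], ["p4"])}

def final_markings(transitions, initial):
    """Dead markings (no transition enabled) reachable from `initial`."""
    index = {p: i for i, p in enumerate(PLACES)}

    def successors(marking):
        for inputs, outputs in transitions.values():
            if all(marking[index[p]] >= 1 for p in inputs):
                nxt = list(marking)
                for p in inputs:
                    nxt[index[p]] -= 1
                for p in outputs:
                    nxt[index[p]] += 1
                yield tuple(nxt)

    start = tuple(initial.get(p, 0) for p in PLACES)
    seen, queue, finals = {start}, deque([start]), set()
    while queue:
        marking = queue.popleft()
        nexts = list(successors(marking))
        if not nexts:
            finals.add(marking)  # nothing enabled: a final marking
        for nxt in nexts:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return finals

def is_diff_witness(initial):
    """True if the two nets reach different final markings for `initial`."""
    return final_markings(M1, initial) != final_markings(M2, initial)

print(is_diff_witness({"p1": 1}))  # both nets end with a token in p4
print(is_diff_witness({"p2": 1}))  # M1 is stuck at p2=1, M2 reaches p4=1
```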
  • 18. Generic Semantic Model Differencing: Motivation. Language‐specific approach (Maoz et al.1 and Fahrenberg et al.2)  Procedure:  Translate models into semantic domain  Perform semantic differencing in semantic domain  Translate result into modeling language  Challenge: Complex translations and differencing algorithms Generic approach  Idea: Utilize behavioral semantics specifications for semantic differencing  Goal:  Perform semantic differencing directly in modeling language  Use existing behavioral semantics specifications for semantic differencing  Enable applying custom semantic equivalence criteria 1 S. Maoz et al. A Manifesto for Semantic Model Differencing. MODELS’10, volume 6627 of LNCS, pages 194–203. Springer, 2011. 2 U. Fahrenberg et al. Vision Paper: Make a Difference! (Semantically). MODELS’11, volume 6981 of LNCS, pages 490–500. Springer, 2011.
  • 19. Generic Semantic Model Differencing: Introduction Generic framework for semantic model differencing  Idea: Utilize behavioral semantics specifications for semantic differencing  Procedure:  Execute models to obtain execution traces (semantic interpretations)  Identify execution traces valid for only one model by comparison (diff witness)  Customize execution trace comparison to modeling language and suitable semantic equivalence criterion  Benefits:  Implementation of translations and algorithms specifically for semantic model differencing is avoided  Only comparison of execution traces is specific to modeling language and semantic equivalence criterion 19
  • 20. Generic Semantic Model Differencing: Overview [Overview diagram: models M1 and M2 with inputs IM1 and IM2; Syntactic Matching (syntactic match rules) yields syntactic correspondences CsynM1,M2; Model Execution yields execution traces TM1 and TM2; Semantic Matching (semantic match rules) yields semantic correspondences CsemM1,M2. Legend: C = Correspondence, I = Input, M = Model, T = Trace]
  • 21. Generic Semantic Model Differencing: Overview. 1. Syntactic matching: Identify syntactic correspondences  Syntactic match rules define a custom language‐specific matching algorithm, implemented on the Epsilon Platform1 with the Epsilon Comparison Language (ECL) [overview diagram] 1 D. Kolovos, L. Rose, A. García‐Domínguez, R. Paige. The Epsilon Book. March 2014. http://www.eclipse.org/epsilon/doc/book.
  • 22. Generic Semantic Model Differencing: Overview. 2. Model execution: Obtain execution traces  Execute models based on the behavioral semantics specification  Obtain execution traces adhering to the generic execution trace format [overview diagram]
  • 23. Generic Execution Trace Format [Metamodel: a Trace has states (State) and transitions (Transition); a State holds objects (Object states, the runtime states of model elements); a Transition connects a source and a target State and references the Event (qualifiedName : EString) causing it]  Execution trace: sequence of runtime states of the model during execution  The format serves as interface to our semantic model differencing framework  Execution traces adhering to this format are the input for identifying semantic differences among models  The framework is generic with respect to the semantics specification approach and the model execution environment
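A minimal Python rendering of this trace format may make the structure concrete. The class and field names mirror the metamodel figure; the dict-based object snapshots and the `final_state` helper are illustrative assumptions (the actual format is an Ecore metamodel):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Event:
    """Event causing a transition between runtime states."""
    qualifiedName: str

@dataclass
class State:
    """One runtime state: snapshots of the model elements' objects."""
    objects: List[dict] = field(default_factory=list)

@dataclass
class Transition:
    """Step between two runtime states, caused by an event."""
    source: State
    target: State
    event: Event

@dataclass
class Trace:
    """Sequence of runtime states plus the transitions between them."""
    states: List[State] = field(default_factory=list)
    transitions: List[Transition] = field(default_factory=list)

    def final_state(self) -> Optional[State]:
        return self.states[-1] if self.states else None

# Two-state trace of M1: firing "t1" leads from s1 to s2.
s1 = State(objects=[{"place": "p1", "tokens": 1}])
s2 = State(objects=[{"place": "p2", "tokens": 1}, {"place": "p3", "tokens": 1}])
trace = Trace(states=[s1, s2], transitions=[Transition(s1, s2, Event("t1"))])
```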
  • 24. Generic Semantic Model Differencing: Overview. 3. Semantic matching: Identify semantic correspondences (ECL, Epsilon Platform)  Semantic match rules define whether two models are semantically equivalent  Non‐matching traces constitute diff witnesses [overview diagram]
  • 25. Generic Semantic Model Differencing: Overview (behavioral semantics drives model execution). 1. Syntactic matching: Identify syntactic correspondences 2. Model execution: Obtain execution traces 3. Semantic matching: Identify semantic correspondences [overview diagram]
  • 26. Example Syntactic Match Rules (ECL)
rule MatchPlace
    match left : Place with right : Place {
    compare : left.name = right.name
}
rule MatchTransition
    match left : Transition with right : Transition {
    compare : left.name = right.name
}
Metamodel: a Net has places (Place, name : EString) and transitions (Transition, name : EString) connected by input/output references. [Diagrams of models M1 and M2 with MatchPlace/MatchTransition correspondences between equally named elements]
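The two ECL rules simply pair up equally named places and transitions. A Python sketch of the same idea follows; the element encoding as kind/name dicts and the function name `match_by_name` are hypothetical, chosen only for illustration:

```python
def match_by_name(left, right):
    """Pair elements of equal kind and name (MatchPlace/MatchTransition analogue).

    Returns (matches, elements only in left, elements only in right).
    """
    remaining = {(e["kind"], e["name"]): e for e in right}
    matches, left_only = [], []
    for e in left:
        partner = remaining.pop((e["kind"], e["name"]), None)
        if partner is not None:
            matches.append((e, partner))
        else:
            left_only.append(e)
    return matches, left_only, list(remaining.values())

# Element lists of M1 and M2 from the slides.
M1_ELEMENTS = ([{"kind": "Place", "name": n} for n in ["p1", "p2", "p3", "p4"]]
               + [{"kind": "Transition", "name": n} for n in ["t1", "t2"]])
M2_ELEMENTS = ([{"kind": "Place", "name": n} for n in ["p1", "p2", "p3", "p4"]]
               + [{"kind": "Transition", "name": n} for n in ["t1", "t2", "t3"]])
matches, only_m1, only_m2 = match_by_name(M1_ELEMENTS, M2_ELEMENTS)
```

Here all four places and t1, t2 correspond, and t3 is left unmatched on the M2 side, mirroring the "t3 added" difference from the motivating example.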
  • 27. Example Behavioral Semantics (xMOF) Behavioral semantics specification with xMOF1  Integrates fUML action language with metamodeling languages (Ecore)  Behavioral semantics is defined with UML activities (operational semantics)  Model execution is performed by fUML virtual machine 27 1 T. Mayerhofer, P. Langer, M. Wimmer, and G. Kappel. xMOF: Executable DSMLs Based on fUML. In Proc. of SLE’13, volume 8225 of LNCS, pages 56–75. Springer, 2013.
  • 28. Example Behavioral Semantics (xMOF). Behavioral semantics specification with xMOF1  Integrates fUML action language with metamodeling languages (Ecore)  Behavioral semantics is defined with UML activities (operational semantics)  Model execution is performed by the fUML virtual machine [Metamodel extension: NetConfiguration, TransitionConfiguration, and PlaceConfiguration extend Net, Transition, and Place; a PlaceConfiguration holds Tokens (heldTokens) as runtime concept] 1 T. Mayerhofer, P. Langer, M. Wimmer, and G. Kappel. xMOF: Executable DSMLs Based on fUML. In Proc. of SLE’13, volume 8225 of LNCS, pages 56–75. Springer, 2013.
  • 29. Example Behavioral Semantics (xMOF). The configuration classes define the computational steps for executing a model: NetConfiguration: run(), main(Token[*]); TransitionConfiguration: fire(), isEnabled() : EBoolean; PlaceConfiguration: addToken(), removeToken() [metamodel extension diagram as on the previous slide]
  • 30. Example Behavioral Semantics (xMOF). [Activity PlaceConfiguration::addToken(): ReadSelf (read self) and CreateObject (create Token) feed an AddStructuralFeatureValue action that adds the created Token to heldTokens]
  • 31. Example Execution Traces [Trace TM1,1 of M1, initial marking M0: state s1 with PlaceConfiguration "p1" holding a Token]
  • 32. Example Execution Traces [Trace TM1,1 of M1: state s1 (Token in "p1"), state s2 (Tokens in "p2" and "p3")]
  • 33. Example Execution Traces [Trace TM1,1 of M1: states s1 (Token in "p1"), s2 (Tokens in "p2" and "p3"), s3 (Token in "p4")]
  • 34. Example Execution Traces [Trace TM1,1 of M1 as before; trace TM2,1 of M2: states s1–s4 with a Token in "p1", "p2", "p3", and "p4" respectively]
  • 35. Example Semantic Match Rules (ECL). Final marking equivalence: For the same initial markings (input), the same final markings are reached
rule MatchTrace
    match left : Trace with right : Trace {
    compare {
        var finalStateLeft : State = left.getFinalState(); // final state of left net
        var finalStateRight : State = right.getFinalState(); // final state of right net
        return finalStateLeft.matches(finalStateRight) and // final states match
               finalStateRight.matches(finalStateLeft);
    }
}
rule MatchState
    match left : State with right : State {
    compare {
        var placeConfsLeft : Set = left.getPlaceConfigurations(); // final states of left places
        var placeConfsRight : Set = right.getPlaceConfigurations(); // final states of right places
        return placeConfsLeft.matches(placeConfsRight); // final states of places match
    }
}
rule MatchPlaceConfiguration
    match left : PlaceConfiguration with right : PlaceConfiguration extends MatchPlace {
    compare : left.heldTokens.size() = right.heldTokens.size() // places hold same amount of tokens
}
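The final-marking criterion expressed by these rules can be sketched in Python over flattened traces. This is an assumption-laden illustration: a trace is represented here as a list of states, each a place-to-token-count dict, so the MatchState/MatchPlaceConfiguration rules collapse into a single per-place comparison of the last states.

```python
def final_marking(trace):
    """Final state of a trace: the marking after the last firing."""
    return trace[-1]

def traces_match(left, right):
    """MatchTrace analogue: two traces are semantically equivalent under
    final-marking equivalence iff their final states hold the same number
    of tokens in every place."""
    return final_marking(left) == final_marking(right)

# Traces of M1 and M2 for initial marking p1=1, as shown on the slides.
t_m1 = [{"p1": 1, "p2": 0, "p3": 0, "p4": 0},
        {"p1": 0, "p2": 1, "p3": 1, "p4": 0},
        {"p1": 0, "p2": 0, "p3": 0, "p4": 1}]
t_m2 = [{"p1": 1, "p2": 0, "p3": 0, "p4": 0},
        {"p1": 0, "p2": 1, "p3": 0, "p4": 0},
        {"p1": 0, "p2": 0, "p3": 1, "p4": 0},
        {"p1": 0, "p2": 0, "p3": 0, "p4": 1}]
print(traces_match(t_m1, t_m2))  # True: semantically equivalent
```

Note that the intermediate states differ (M2 needs one more firing), but only the final markings matter under this equivalence criterion.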
  • 36. Example Semantic Match Rules Application M1 TM 1,1 objects heldTokens t1 : Trace : PlaceConfiguration name = “p1" s1 : State states : Token p3 p1 t1 p2 objects heldTokens t2 p4 objects p1 s2 : PlaceConfiguration name = “p2" : Token M0 p p p heldTokens : State : PlaceConfiguration name = "p3" : Token objects : PlaceConfiguration heldTokens s3 : State : Token name = “p4" M2 + TM 2,1 states objects heldTokens t2 : Trace s1 : State : PlaceConfiguration : Token name = “p1" M0 p1 t1 p2 t2 p3 t3 p4 objects : PlaceConfiguration heldTokens name = “p2" s2 : State objects : Token : PlaceConfiguration heldTokens name = “p3" s3 : State : Token j objects : PlaceConfiguration heldTokens s4 : State : Token name = “p4" 36
  • 37. Example Semantic Match Rules Application M1 TM 1,1 objects heldTokens t1 : Trace : PlaceConfiguration name = “p1" s1 : State states : Token p3 p1 t1 p2 objects heldTokens t2 p4 objects p1 s2 : PlaceConfiguration name = “p2" : Token M0 p p p heldTokens : State : PlaceConfiguration name = "p3" : Token objects : PlaceConfiguration heldTokens s3 : State : Token name = “p4" M2 + TM 2,1 states objects heldTokens t2 : Trace s1 : State : PlaceConfiguration : Token name = “p1" M0 p1 t1 p2 t2 p3 t3 p4 objects : PlaceConfiguration heldTokens name = “p2" s2 : State objects : Token : PlaceConfiguration heldTokens name = “p3" s3 : State : Token objects : PlaceConfiguration heldTokens MatchTrace j s4 : State : Token name = “p4" 37
  • 38. Example Semantic Match Rules Application M1 TM 1,1 objects heldTokens t1 : Trace : PlaceConfiguration name = “p1" s1 : State states : Token p3 p1 t1 p2 objects heldTokens t2 p4 objects p1 s2 : PlaceConfiguration name = “p2" : Token M0 p p p heldTokens : State : PlaceConfiguration name = "p3" : Token objects : PlaceConfiguration heldTokens s3 : State : Token name = “p4" final state M2 + TM 2,1 states objects heldTokens t2 : Trace s1 : State : PlaceConfiguration : Token name = “p1" M0 p1 t1 p2 t2 p3 t3 p4 objects : PlaceConfiguration heldTokens name = “p2" s2 : State objects : Token MatchTrace : PlaceConfiguration heldTokens name = “p3" s3 : State : Token j objects : PlaceConfiguration heldTokens s4 : State : Token name = “p4" final state
  • 39.–42. Example Semantic Match Rules Application
[Object diagrams, shown in four animation steps: the trace TM1,1 of M1 and the trace TM2,1 of M2 for the input M0 (one token in p1). TM1,1 consists of the states s1–s3, TM2,1 of the states s1–s4; each state holds PlaceConfigurations (name = "p1" … "p4") with their held Tokens, and the last state is marked as final state. The match rules MatchTrace, MatchState, and MatchPlaceConfiguration are applied step by step to relate the traces, their final states, and the final states' PlaceConfigurations; in the last step, the two traces are marked as semantically equivalent.]
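The match rules applied on these slides can be sketched in ordinary code. The following is a minimal Python sketch, not the actual implementation (the slides define the rules MatchTrace, MatchState, and MatchPlaceConfiguration over the trace metamodel; here a trace is simply a list of states, and a state maps place names to token counts):

```python
# Hedged sketch: traces as lists of states; a state maps place names to
# token counts (its PlaceConfigurations). Rule names follow the slides.

def match_place_configuration(pc1, pc2):
    # MatchPlaceConfiguration: same place name and same number of held tokens
    return pc1[0] == pc2[0] and pc1[1] == pc2[1]

def match_state(s1, s2):
    # MatchState: every non-empty PlaceConfiguration of one state has a
    # matching counterpart in the other
    pcs1 = [(n, t) for n, t in s1.items() if t > 0]
    pcs2 = [(n, t) for n, t in s2.items() if t > 0]
    return (len(pcs1) == len(pcs2) and
            all(any(match_place_configuration(a, b) for b in pcs2)
                for a in pcs1))

def match_trace(t1, t2):
    # MatchTrace: here, final-marking equivalence -- the traces' final
    # states must match; intermediate states and trace lengths may differ
    return match_state(t1[-1], t2[-1])

# T_M1,1 and T_M2,1 from the slides (input M0: one token in p1)
trace_m1 = [{"p1": 1}, {"p2": 1, "p3": 1}, {"p4": 1}]
trace_m2 = [{"p1": 1}, {"p2": 1}, {"p3": 1}, {"p4": 1}]
print(match_trace(trace_m1, trace_m2))  # -> True: semantically equivalent
```

Note that the equivalence criterion is entirely encoded in `match_trace`; swapping in a rule that compares all intermediate states would yield a stricter notion of equivalence, which is the configurability the approach advertises.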
  • 43. Example Semantic Match Rules Application
Execution traces for all inputs with at most one token per place (I … input marking, MF … final marking):
M1:
 T_M1,1: I = {}, MF = {}
 T_M1,2: I = {p1=1}, MF = {p4=1}
 T_M1,3: I = {p2=1}, MF = {p2=1}  diff witness
 T_M1,4: I = {p3=1}, MF = {p3=1}  diff witness
 T_M1,5: I = {p1=1, p2=1}, MF = {p2=1, p4=1}  diff witness
 T_M1,6: I = {p1=1, p3=1}, MF = {p3=1, p4=1}  diff witness
 T_M1,7: I = {p2=1, p3=1}, MF = {p4=1}  diff witness
 T_M1,8: I = {p1=1, p2=1, p3=1}, MF = {p4=2}  diff witness
M2:
 T_M2,1: I = {}, MF = {}
 T_M2,2: I = {p1=1}, MF = {p4=1}
 T_M2,3: I = {p2=1}, MF = {p4=1}  diff witness
 T_M2,4: I = {p3=1}, MF = {p4=1}  diff witness
 T_M2,5: I = {p1=1, p2=1}, MF = {p4=2}  diff witness
 T_M2,6: I = {p1=1, p3=1}, MF = {p4=2}  diff witness
 T_M2,7: I = {p2=1, p3=1}, MF = {p4=2}  diff witness
 T_M2,8: I = {p1=1, p2=1, p3=1}, MF = {p4=3}  diff witness
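This table can be reproduced with a small simulator. Below is a minimal Python sketch, assuming the net structures implied by the listed markings (M1: t1 consumes p1 and produces p2 and p3, t2 consumes p2 and p3 and produces p4; M2: the sequential chain p1 → t1 → p2 → t2 → p3 → t3 → p4); both nets are run to completion on every input with at most one token per place:

```python
from itertools import product
from collections import Counter

def run(transitions, marking):
    # Fire enabled transitions until quiescence and return the final
    # marking (both example nets are confluent, so order does not matter)
    m = Counter(marking)
    while True:
        for pre, post in transitions:
            if all(m[p] >= n for p, n in pre.items()):
                for p, n in pre.items():
                    m[p] -= n
                for p, n in post.items():
                    m[p] += n
                break
        else:
            return {p: n for p, n in m.items() if n > 0}

# Transitions as (preset, postset) pairs of place -> token-count maps
M1 = [({"p1": 1}, {"p2": 1, "p3": 1}),   # t1
      ({"p2": 1, "p3": 1}, {"p4": 1})]   # t2
M2 = [({"p1": 1}, {"p2": 1}),            # t1
      ({"p2": 1}, {"p3": 1}),            # t2
      ({"p3": 1}, {"p4": 1})]            # t3

# All inputs with at most one token in each of p1, p2, p3 (as on the slide)
inputs = [{p: 1 for p, on in zip(("p1", "p2", "p3"), bits) if on}
          for bits in product((0, 1), repeat=3)]

witnesses = [i for i in inputs if run(M1, i) != run(M2, i)]
print(len(witnesses))  # -> 6 of the 8 inputs are diff witnesses
```

Only the empty input and {p1=1} fail to distinguish the two models, matching the table: the remaining six inputs are diff witnesses.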
  • 44. Evaluation
 Case studies comparing our approach with CDDiff1 and ADDiff2 by Maoz et al.
 Behavioral semantics of both languages
 Semantic match rules for both languages
 Computation of diff witnesses for example models provided by Maoz et al.3
 Expressive power
 Sufficient for defining non‐trivial semantic model differencing operators
 Developing operators is a language engineering task
 Runtime states form the basis for semantic model differencing
 Performance
 Model execution is the most expensive step, taking 95% of the overall execution time
 A high-performance execution environment and a reasonable number of model executions (inputs) are important
1 S. Maoz et al. CDDiff: Semantic Differencing for Class Diagrams. ECOOP'11, volume 6813 of LNCS, pages 230–254. Springer, 2011.
2 S. Maoz et al. ADDiff: Semantic Differencing for Activity Diagrams. ESEC/FSE'11, pages 179–189. ACM, 2011.
3 http://www.se‐rwth.de/materials/semdiff
  • 45. Summary
[Overview diagram: syntactic matching of M1 and M2 yields the syntactic correspondences C_syn(M1,M2); executing M1 and M2 on the inputs I_M1 and I_M2 yields the traces T_M1 and T_M2; semantic matching of the traces using the match rules yields the semantic correspondences C_sem(M1,M2). Legend: C … Correspondence, I … Input, M … Model, T … Trace]
Characteristics:
 Generic w.r.t. modeling language (behavioral semantics)
 Generic w.r.t. semantics specification approach (generic trace format)
 Configurable w.r.t. semantic equivalence criterion (semantic match rules)
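The pipeline summarized on this slide condenses into one generic operator: execute both models on shared inputs and apply the configured match rules to the resulting traces. A toy Python sketch, purely illustrative (models, executor, and match rule are all parameters, mirroring the genericity and configurability claims):

```python
def semantic_diff(m1, m2, inputs, execute, match_trace):
    # Inputs whose execution traces the match rules cannot relate
    # are diff witnesses
    return [i for i in inputs
            if not match_trace(execute(m1, i), execute(m2, i))]

# Toy instantiation: "models" are plain functions, a trace is the list
# [input, result], and the match rule is final-state equality
double = lambda x: [x, 2 * x]
square = lambda x: [x, x * x]
witnesses = semantic_diff(double, square, range(4),
                          execute=lambda m, i: m(i),
                          match_trace=lambda t1, t2: t1[-1] == t2[-1])
print(witnesses)  # -> [1, 3]: 2*x == x*x only for x in {0, 2}
```

Exchanging `execute` adapts the operator to another modeling language, and exchanging `match_trace` adapts it to another semantic equivalence criterion, while `semantic_diff` itself stays untouched.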
  • 46. Outlook
 Generation of inputs relevant to semantic differencing
 Relevant inputs are inputs that cause distinct execution traces
 Problem: Inputs currently have to be defined manually
 Symbolic execution1
 Calculation of semantic differences avoiding model execution
 Problem: Model execution for obtaining concrete execution traces is expensive
 Path conditions obtained from symbolic execution already capture differences among execution traces
 Only syntactic differences can lead to semantic differences
 Directed and differential symbolic execution2,3
1 L. Clarke. A Program Testing System. In Proc. of ACM '76, pages 488–491. ACM, 1976.
2 K.‐K. Ma et al. Directed Symbolic Execution. In Proc. of SAS'11, volume 6887 of LNCS, pages 95–111. Springer, 2011.
3 S. Person et al. Differential Symbolic Execution. In Proc. of FSE'08, pages 226–237. ACM, 2008.
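The symbolic-execution idea can be illustrated on the running Petri net example (assuming, as before, M1 with t1: p1 → {p2, p3} and t2: {p2, p3} → p4, and M2 as the sequential chain p1 → p2 → p3 → p4). Instead of executing the nets on concrete inputs, one summarizes each net's final marking as a closed-form function of symbolic token counts n1, n2, n3 in p1, p2, p3; comparing the summaries yields a single condition characterizing all diff witnesses at once. A hand-derived Python sketch, not a generic symbolic executor:

```python
def final_m1(n1, n2, n3):
    # t1 turns each p1 token into one p2 and one p3 token; t2 then fires
    # as long as both p2 and p3 hold tokens
    fired = min(n1 + n2, n1 + n3)
    return {"p2": n1 + n2 - fired, "p3": n1 + n3 - fired, "p4": fired}

def final_m2(n1, n2, n3):
    # In the sequential chain, every token eventually reaches p4
    return {"p2": 0, "p3": 0, "p4": n1 + n2 + n3}

# "Path condition" separating the two nets: the final markings agree
# exactly when no token starts in p2 or p3
def equivalent(n1, n2, n3):
    return n2 == 0 and n3 == 0
```

This one condition covers the infinitely many possible inputs, whereas concrete model execution can only check finitely many of them, which is exactly the appeal of avoiding model execution noted above.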
  • 47. Thank you! Model Execution Based on fUML www.modelexecution.org