An Adapter-Based Approach to Co-evolve Generated SQL in M2T Transformations
Jokin García (1), Oscar Díaz (1) and Jordi Cabot (2)
(1) Onekin, University of the Basque Country, Spain
(2) AtlanMod, École des Mines de Nantes (France)
CAiSE, Thessaloniki, 19th of June, 2014
Index
• Problem statement
• Case study
• Solution
• Evaluation
• Conclusions and future work
Problem statement: context
• Software components are built on top of platforms
• Dependencies
• Platform evolution is a common situation
  • DB
  • API
[Diagram: an Application depending on an API]
Problem statement: context
• Perpetual beta state of platforms
• External dependency
Problem statement: context
[Diagram: Domain model → M2T Transformation → Code. The transformation code interleaves model references (dynamic) and embedded platform code (static).]
Problem statement: problem
• Different versions of the platform leave the code and the M2T transformation outdated
[Diagram: the DB evolves to DB' (Δ); what happens to the M2T transformation and the generated code?]
Problem statement: solution
• An adapter adapts the generated code to the new platform
[Diagram: MediaWiki evolves to MediaWiki' (Δ); the M2T transformation emits its code through the Adapter, which outputs code for the new version]
Case study: MediaWiki DB in WikiWhirl
Case study: MediaWiki
• Used by Wikipedia and more than 40,000 wikis
• In 4.5 years: 171 schema upgrades
Case study: MediaWiki DB in WikiWhirl
[Figure: transformation snippet, labelling the platform-dependent concepts and the references to the domain model]
Solution
• Synchronize the generated code with the platform
• Using adapters at runtime
Transformation code (written against the old schema):
print("INSERT into categorylinks (cl_from, cl_to, cl_sortkey, cl_timestamp) VALUES (@pageId, '" + categoryTitle + "','" + pageTitle)
Adapted output (the new schema adds the columns "cl_type", "cl_sortkey_prefix" and "cl_collation"):
INSERT INTO categorylinks (cl_from, cl_to, cl_sortkey, cl_timestamp, cl_type, cl_sortkey_prefix, cl_collation) VALUES (@pageId, 'Softwareproject', 'House_Testing', DATE_FORMAT(CURRENT_TIMESTAMP(), '%Y%m%d%k%i%s'), 'page', '', '0');
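The adaptation illustrated above can be sketched in a few lines. This is a minimal, hypothetical helper (not the authors' adapter API, which parses statements with ZQL rather than strings): it extends an INSERT statement with the NOT NULL columns a schema upgrade added, supplying default values for them.

```python
# Sketch (hypothetical helper, not the authors' adapter API): extend an
# INSERT statement with NOT NULL columns added by a schema upgrade,
# supplying default values for the new columns.

def add_columns_to_insert(statement, table, new_columns):
    """new_columns: list of (column_name, default_value_literal) pairs."""
    prefix = f"INSERT INTO {table} ("
    if not statement.startswith(prefix):
        return statement  # the statement does not target this table
    body = statement[len(prefix):]
    cols_part, values_part = body.split(") VALUES (", 1)
    values_part = values_part.rstrip(";").rstrip(")")
    names = ", ".join(name for name, _ in new_columns)
    defaults = ", ".join(default for _, default in new_columns)
    return (prefix + cols_part + ", " + names
            + ") VALUES (" + values_part + ", " + defaults + ");")

old = "INSERT INTO categorylinks (cl_from, cl_to) VALUES (@pageId, 'X');"
new = add_columns_to_insert(
    old, "categorylinks",
    [("cl_type", "'page'"), ("cl_sortkey_prefix", "''"), ("cl_collation", "'0'")])
```

Statements targeting other tables pass through unchanged, which mirrors the adapter's behaviour of only touching impacted statements.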
Process outline
[Diagram: the old and new MediaWiki schemas are injected as an old schema model and a new schema model; comparing them yields the Difference model which, together with the domain model, feeds the M2T transformation that generates the code (MediaWiki DB)]
Process outline
[Same diagram as the previous slide]
Don't worry: all in one click
Process: Differences between platforms: DB schema
Process: Schema Modification Operators (SMO)

SMO | % of usage | Change type | Adaptation
Create table | 8.9 | NBC | New comment in the transformation on the existence of this table in the new version
Drop table | 3.3 | BRC | Delete the statements associated to the table
Rename table | 1.1 | BRC | Update the name
Copy table | 2.2 | NBC | (None)
Add column | 38.7 | NBC/BRC | For insert statements: if the attribute is NOT NULL, add the new column to the statement with a default value (taken from the DB if available, otherwise derived from the type)
Drop column | 26.4 | BRC | Delete the column and its value from the statement
Rename column | 16 | BRC | Update the name
Copy column | 0.4 | BRC | Like the add-column case
Move column | 1.5 | BRC | Like the drop-column case followed by the add-column case
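The SMO-to-adaptation mapping in the table lends itself to a dispatch table. The sketch below is illustrative only: the handler names, the change-record fields ("smo", "table", "old", "new") and the substring-based matching are assumptions, not the paper's implementation.

```python
# Sketch: dispatch from SMO kind to adaptation action. Handler names and
# change-record fields are illustrative; matching is simplified to
# substring checks instead of real SQL parsing.

def drop_table(stmt, change):
    # BRC: delete the statements associated to the dropped table
    return None if change["table"] in stmt else stmt

def rename_table(stmt, change):
    # BRC: update the table name wherever it occurs in the statement
    return stmt.replace(change["old"], change["new"])

ADAPTATIONS = {
    "drop table": drop_table,
    "rename table": rename_table,
    # "add column", "drop column", "rename column", ... analogous
}

def adapt(stmt, changes):
    for change in changes:
        handler = ADAPTATIONS.get(change["smo"])
        if handler is None:
            continue  # NBC or unsupported change: leave the statement as-is
        stmt = handler(stmt, change)
        if stmt is None:
            return None  # the whole statement was deleted
    return stmt

removed = adapt("SELECT * FROM math;",
                [{"smo": "drop table", "table": "math"}])
renamed = adapt("SELECT * FROM page_old;",
                [{"smo": "rename table", "old": "page_old", "new": "page_new"}])
```

Returning `None` models the drop-table row of the table: the statement disappears from the generated script rather than being rewritten.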
Process: Adaptation
• Platform-specific, schema-independent
  • Replace all "println" instructions with "printSQL"
  • Import the "printSQL" library
  • ZQL extension
• For each printSQL invocation:
  • Iterate over the changes reported in the Difference model
  • Check whether any of the changes impacts the current statement
  • Retrieve the information needed to adapt the statement and add it to a list of parameters: the statement, the affected table, the column, …
  • Call a function that adapts the statement and print the new statement
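The printSQL behaviour listed above can be sketched as a wrapper around the plain print. All names below are illustrative, and the substring-based impact check stands in for the real ZQL-based parsing; the renamed column in the example is hypothetical.

```python
# Sketch of the printSQL entry point: intercept each statement, walk the
# Difference model, and emit the (possibly adapted) statement.

def impacts(change, statement):
    # Simplified: a change impacts a statement if the affected table
    # name appears in it (the real adapter parses the SQL with ZQL)
    return change["table"] in statement

def adapt_statement(change, params):
    # Placeholder for the per-SMO adaptation functions
    if change["kind"] == "rename column":
        old_name, new_name = change["column"]
        return params["statement"].replace(old_name, new_name)
    return params["statement"]

def print_sql(statement, difference_model, out):
    for change in difference_model:
        if impacts(change, statement):
            params = {"statement": statement,
                      "table": change["table"],
                      "column": change.get("column")}
            statement = adapt_statement(change, params)
    out.append(statement)  # print the adapted statement

adapted = []
print_sql("INSERT INTO user (user_options) VALUES ('');",
          [{"kind": "rename column", "table": "user",
            "column": ("user_options", "user_props")}],
          adapted)
```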
Process: Adaptation output
[Figure: adapted output, highlighting added columns, deleted tables and deleted columns]
Roles
• Producer
  • Injector for the target platform
  • Implement the adapter as a library for the transformation
• Consumer
  • Import the adapter library in the transformation
  • Execute the batch
Evaluation
• Manual Cost = D + P * #Impacts
  • D: Detection time
  • P: Propagation time
• Assisted Cost = C + V * #Impacts
  • C: Configuration time
  • V: Verification time
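Comparing the two cost models gives the break-even point at which assisted adaptation pays off. The figures in the example are illustrative, not the paper's measurements, and the sketch assumes V < P (verifying an impact is cheaper than propagating it by hand) and C > D, so that a break-even exists.

```python
# Break-even between Manual Cost = D + P * n and Assisted Cost = C + V * n,
# where n = #Impacts. Assumes V < P so the loop terminates.

def manual_cost(d, p, n):
    return d + p * n

def assisted_cost(c, v, n):
    return c + v * n

def break_even(d, p, c, v):
    """Smallest #Impacts at which the assisted approach is no more costly."""
    assert v < p, "verification must be cheaper per impact than propagation"
    n = 0
    while assisted_cost(c, v, n) > manual_cost(d, p, n):
        n += 1
    return n

# Illustrative figures (minutes): detection 5, propagation 10 per impact,
# configuration 30, verification 2 per impact.
n = break_even(d=5, p=10, c=30, v=2)
```

With these numbers the assisted approach wins from the fourth impact onward: the one-off configuration cost C is amortized because each further impact costs V instead of P.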
Evaluation
Dump changes from code to transformation
• Assist manual propagation
• Record generation with the change to be done and where (line and column in the transformation)
[Diagram: a HOT (higher-order transformation) turns the M2T transformation into M2T transformation', rewriting print("select * from …") as printSQL("select * from …", line, column)]
RECORD:
#Added columns cl_type, cl_sortkey_prefix and cl_collation
#transformation line: 12, column: 11
INSERT INTO categorylinks (cl_from, cl_to, cl_sortkey, cl_timestamp, cl_type, cl_sortkey_prefix, cl_collation) VALUES (@pageId, 'Softwareproject', 'House_Testing', DATE_FORMAT(CURRENT_TIMESTAMP(), '%Y%m%d%k%i%s'), 'page', '', '0');
Conclusions
• Mechanism to adapt code generated by M2T transformations to platform evolution
• Applied in a specific case study
• Premises: platform instability and transformation coupling
Issues and future work
• Generalization: other platforms
• Methodology for adapter development
Questions
jokin.garcia@ehu.es
http://www.onekin.org
Implementation: www.onekin.org/downloads/public/Batch_MofscriptAdaptation.rar
Screencasts: www.onekin.org/downloads/public/screencasts/MOFScript/
Process: Adaptation
1. Iterate over the changes reported in the Difference model
2. Check that the deleted column's table corresponds to the table name of the statement
3. Add the statement, the table name and the removed column to a list of parameters
4. Output an SQL statement without the removed column, using a function that takes the list of parameters and modifies the expression
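The four steps above can be sketched for an INSERT statement. This is a simplified string-level stand-in for the real ZQL-based implementation: the regex and the assumption that values contain no commas are limitations of the sketch, not of the approach.

```python
# Sketch of the drop-column adaptation (steps 1-4 above) for INSERT
# statements. Simplified: a regex stands in for ZQL parsing, and values
# must not contain commas.
import re

def drop_column_from_insert(statement, table, column):
    m = re.match(r"INSERT INTO (\w+) \(([^)]*)\) VALUES \((.*)\);", statement)
    if not m or m.group(1) != table:       # step 2: the table must match
        return statement
    cols = [c.strip() for c in m.group(2).split(",")]
    vals = [v.strip() for v in m.group(3).split(",")]
    if column not in cols:
        return statement
    i = cols.index(column)                 # step 3: locate column and value
    del cols[i]                            # step 4: rebuild the statement
    del vals[i]                            #         without them
    return (f"INSERT INTO {table} ({', '.join(cols)}) "
            f"VALUES ({', '.join(vals)});")

adapted = drop_column_from_insert(
    "INSERT INTO user (user_id, user_options) VALUES (1, 'x');",
    "user", "user_options")
```

The example drops the "user_options" column, which the case study reports as removed from the "user" table.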
Generate traceability model
[Diagram: the domain model and the Difference model feed MOFScript + Adapter, which also emits the traceability model]
printSQL(statement, line, column, print_type)
<Transformation: line, column> -> <Code: line, column>
Visualize traceability model
[Figure: the traceability model visualized with HandyMOF]
Editor's Notes
  1. Good afternoon. It's an honor for me to present my work, entitled "An Adapter-Based…".
  2. The structure of the presentation is as follows. First, I will focus on the problem statement, putting the problem into context. Second, I will motivate the problem with a specific scenario that will be used throughout the presentation. Then I will explain the solution I propose for the problem. Later, I will show the results of the evaluation I carried out. Finally, I will end with the conclusions and future work.
  3. Context: broadly speaking, software components often do not work in isolation but are built on top of platforms that provide some functionality. While this offers numerous advantages, it also creates dependencies. The evolution of these platforms is a common situation. One paradigmatic platform is a database: in the database world, the evolution of schemas has always been a concern. This is the platform used in this work. Another example would be APIs, whose evolution can leave client applications outdated.
  4. Two characteristics of platforms make them problematic. On the one hand, the perpetual beta phenomenon: developers have to work with software components that are in a beta version as if they were at production level. This increases the frequency of releases, and therefore the number of required co-evolution actions. On the other hand, the platform is often an external dependency, i.e., it belongs to a different organization. Changes in these components are usually out of the control of the rest of the partners, and might be accompanied by poor documentation, lost communication with the partner responsible for the change, and so on. This rules out the possibility of tracking platform upgrades to be replicated later.
  5. M2T transformations are composed of static and dynamic parts. They interleave target platform code, instructions from the transformation language (conditional and iteration instructions) and references to the input model. Platform-specific code (information about the tables, columns, ...) is embedded in the transformation. Therefore, there is domain variability but not platform variability. In a database scenario, transformations do not specify but construct SQL scripts. The SQL script is dynamically generated once references to the input model are resolved. In the transformation, there are references to a model whose metamodel is unknown a priori and will be resolved at runtime.
  6. Problem: Forward Engineering advocates for code to be generated dynamically through M2T transformations that target a specific platform. In this setting, platform evolution can leave the transformation, and hence the generated code, outdated. Where is the platform-dependent information? In the transformation. MDA guide: 10 years later http://modeling-languages.com/anybody-using-both-mda-platform-independent-and-platform-specific-models/
  7. Solution: to make transformations more resilient to changes, we propose adding an adaptability mechanism for the most vulnerable parts of the transformation, those which depend on the platform. The solution proposed for this problem is to use the well-known adapter pattern from object orientation with M2T transformations.
  8. We actually suffered this problem in a project. At CAiSE 2012 a tool called WikiWhirl was presented. WikiWhirl abstracts wiki structure in terms of mindmaps, where refactoring operations on the wiki (WikiWhirl expressions) are expressed as reshapes of mindmaps. Since wikis end up being supported by DBs, WikiWhirl expressions are transformed into SQL scripts. WikiWhirl is a Domain-Specific Language (DSL) built on top of MediaWiki. WikiWhirl is interpreted, i.e. a WikiWhirl model (an expression described in the WikiWhirl syntax) delivers an SQL script that is enacted. The problem is that this SQL script follows the MediaWiki DB schema: if this schema changes, the script might break apart. Since MediaWiki is part of the Wikimedia Foundation, we have no control over what MediaWiki releases are delivered, or when. And the release frequency can be high, which introduces a heavy maintenance burden upon WikiWhirl. In other words, the mindmap (upper part of the figure) is manipulated by the user in order to refactor the wiki, and finally the changes are propagated to the actual wiki, at the bottom of the figure. This begs the question of how to make WikiWhirl resilient to MediaWiki upgrades. This database scenario is used as a paradigmatic example of platform evolution. A well-known database has been used: the MediaWiki database, which is used by Wikipedia.
  9. MediaWiki is a wiki engine, currently used by almost 40,000 wikis. In a 4½-year period, the MediaWiki DB had 171 schema upgrades. This gives us an idea of the importance and frequency of changes.
  10. This slide shows a snippet of the transformation from the mindmap to the wiki. These statements are built upon the DB schema of MediaWiki and, in so doing, create an external dependency of WikiWhirl w.r.t. MediaWiki. We can see the table and column names in the print statements.
  11. To tackle the mentioned problem, data manipulation requests (i.e. insert, delete, update, select) are redirected to the adapter during the transformation. The adapter outputs the code according to the latest schema release. In this way, the main source of instability (i.e. schema upgrades) is isolated in the adapter, and the transformation itself does not need to change.
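The redirection idea can be sketched as follows; a minimal Java sketch, assuming a hypothetical `printSQL` helper and a plain set of dropped tables standing in for the difference model (the real adapter works on the EMFCompare output, not on a `Set`):

```java
import java.util.Set;

// Minimal sketch of the printSQL redirection: the M2T transformation calls
// printSQL instead of print, and the adapter decides how each statement is
// emitted. Names (printSQL, droppedTables) are illustrative, not the actual
// WikiWhirl adapter API.
public class PrintSqlAdapter {

    // If the statement targets a table that no longer exists in the new
    // schema, emit it as an SQL comment instead of executable code.
    static String printSQL(String sql, Set<String> droppedTables) {
        for (String table : droppedTables) {
            if (sql.matches("(?i).*\\b" + table + "\\b.*")) {
                return "-- table '" + table + "' was dropped: " + sql;
            }
        }
        return sql; // unaffected statements pass through unchanged
    }
}
```

Statements touching dropped tables are thus neutralized at generation time, while all other statements are emitted exactly as the transformation wrote them.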
  12. The process overview is the following. First, the DB schemas (i.e. the new schema and the old schema) are injected as Ecore models with the Schemol tool (step 1); next, the schema difference (i.e. the Difference model) is computed with EMFCompare (step 2); finally, this schema difference feeds the adapter used by the transformation (i.e. the MOFScript program).
  13. Since everything is implemented in Java and Ant, the whole process can be executed with a batch file.
  14. Inject-and-compare example: both schema versions are transformed into models with Schemol and, after comparison with EMFCompare, a difference model is retrieved. In this example, the “trackbacks” and “math” tables have been removed, the “user_options” column has been removed from the “user” table, and three new columns have been added to the “categorylinks” table.
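The compare step can be sketched like this; a minimal Java sketch, assuming the injected schemas are represented as plain table-to-columns maps (the real process uses Schemol-injected Ecore models and EMFCompare, and produces a model rather than strings):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Sketch of the "compare" step: given the old and new schema as simple
// table -> columns maps, derive the kind of difference entries that feed
// the adapter. Entry format is illustrative.
public class SchemaDiff {

    static List<String> diff(Map<String, List<String>> oldS,
                             Map<String, List<String>> newS) {
        List<String> changes = new ArrayList<>();
        // Tables present before but not after: dropped tables.
        for (String t : oldS.keySet())
            if (!newS.containsKey(t)) changes.add("DROP TABLE " + t);
        // Columns present before but not after: dropped columns.
        for (var e : oldS.entrySet())
            if (newS.containsKey(e.getKey()))
                for (String c : e.getValue())
                    if (!newS.get(e.getKey()).contains(c))
                        changes.add("DROP COLUMN " + e.getKey() + "." + c);
        // Columns present after but not before: added columns.
        for (var e : newS.entrySet())
            if (oldS.containsKey(e.getKey()))
                for (String c : e.getValue())
                    if (!oldS.get(e.getKey()).contains(c))
                        changes.add("ADD COLUMN " + e.getKey() + "." + c);
        return changes;
    }
}
```

Running it on a miniature version of the MediaWiki example reports the dropped “math” table, the dropped “user_options” column and the added “categorylinks” columns.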
  15. The Difference model is described as a set of DB operators. Curino et al. proved that a set of eleven Schema Modification Operators (SMOs) can completely describe a complex schema evolution scenario (in fact, they ran the experiment on Wikipedia). The table indicates the frequency of these changes for the MediaWiki case. The most frequent changes (e.g. 'create table', 'add column', 'drop column' or 'rename column') can be identified from schema differences. Complex changes (e.g. 'distribute table' or 'merge table') are a sequence of simple changes. Fortunately, as the “change type” column shows, most of the changes are NBC or BRC, which means that no human intervention is required for their adaptation. For each SMO, there is an adaptation action that restores consistency; for instance, if a column is removed, that column is removed from the statements as well. NBC: Non-Breaking Changes, which do not affect the code or transformation. BRC: Breaking Resolvable Changes, which can be automatically propagated. BUC: Breaking Unresolvable Changes, which require human intervention to propagate. In [Model transformation co-evolution: a semi-automatic approach] we propose some rules (implemented as an M2M transformation) to combine simple changes into complex ones. For instance, there is a 'move column' case when the same column is deleted from one table and added to another. Unfortunately, the 'distribute table' and 'merge table' cases cannot be automatically detected and are therefore not included in the table. This kind of change tends to be scarce: for MediaWiki, 'distribute table' never occurred, while 'merge table' accounts for 1.5% of the total changes.
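As an illustration, the SMO-to-change-type mapping could be encoded as follows; the per-operator classification below is our reading of the description above, not a verbatim copy of the paper's table:

```java
import java.util.Map;

// Sketch of the SMO -> change-type mapping (operators after Curino et al.).
// NBC needs no action, BRC is propagated automatically, BUC needs a human.
// The concrete classification of each operator here is an assumption.
public class SmoClassifier {
    enum ChangeType { NBC, BRC, BUC }

    static final Map<String, ChangeType> CLASSIFICATION = Map.of(
        "CREATE TABLE",  ChangeType.NBC,  // new tables do not break old code
        "ADD COLUMN",    ChangeType.BRC,  // Not-Null columns get defaults
        "DROP COLUMN",   ChangeType.BRC,  // column removed from statements
        "RENAME COLUMN", ChangeType.BRC,  // name substituted in statements
        "DROP TABLE",    ChangeType.BRC,  // affected statements commented out
        "MERGE TABLE",   ChangeType.BUC   // cannot be detected automatically
    );

    // Unknown or undetectable operators conservatively fall back to BUC.
    static boolean needsHuman(String smo) {
        return CLASSIFICATION.getOrDefault(smo, ChangeType.BUC) == ChangeType.BUC;
    }
}
```

The adapter only automates the NBC/BRC rows; anything classified BUC is surfaced to the developer instead.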
  16. The adapter is platform-specific and will only adapt SQL code, but it is schema-agnostic (it does not matter what DB schema has to be managed). The approach mainly consists of replacing the “print” statements with invocations to the adapter (e.g. a printSQL function). On each invocation, the adapter checks whether the SQL statement acts upon a table that is subject to change. If so, the adapter returns a piece of SQL code compliant with the new DB schema. The adaptations are implemented in a library that has to be imported into the M2T transformation. This library contains functions that adapt the statements to the latest version of the platform, leaving the transformation untouched. The adapter is implicitly called by the transformation at runtime, adapting the statements to the changes. Implementation-wise, the adapter has two inputs: the Difference model and the model of the new schema (to obtain the full description of new attributes, if applicable). The ZQL open-source SQL parser is used to parse SQL statements into Java structures. This parser is extended with adaptation functions that modify the statements (e.g. removeColumn). The snippet provides a glimpse of the adapter for the “remove column” case; the structure is similar for the other adaptations. It starts by iterating over the changes reported in the Difference model (line 5). Next, it checks (line 6) that the deleted column's table corresponds to the table name of the statement (retrieved in lines 3-4). Then the statement, the table name and the removed column are all added to a list of parameters (lines 7-10). Finally, the adapter outputs an SQL statement without the removed column, using a function over the list of parameters that modifies the expression (lines 12-13).
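A simplified version of the “remove column” adaptation can be sketched without ZQL; this sketch handles only plain single-row INSERT statements with comma-free values (the real adapter parses arbitrary statements with ZQL), and the helper name `removeColumn` mirrors the function mentioned above:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Simplified sketch of the "remove column" adaptation: drop a column and
// its matching value from an INSERT of the form
//   INSERT INTO t (c1, c2, ...) VALUES (v1, v2, ...)
public class RemoveColumnAdapter {

    static String removeColumn(String insert, String column) {
        int colsStart = insert.indexOf('(');
        int colsEnd   = insert.indexOf(')');
        int valsStart = insert.indexOf('(', colsEnd);
        int valsEnd   = insert.lastIndexOf(')');
        List<String> cols = new ArrayList<>(Arrays.asList(
            insert.substring(colsStart + 1, colsEnd).split("\\s*,\\s*")));
        List<String> vals = new ArrayList<>(Arrays.asList(
            insert.substring(valsStart + 1, valsEnd).split("\\s*,\\s*")));
        int i = cols.indexOf(column);
        if (i < 0) return insert;  // column not used: statement unaffected
        cols.remove(i);            // drop the column name...
        vals.remove(i);            // ...and its value at the same position
        return insert.substring(0, colsStart + 1) + String.join(", ", cols)
             + ") VALUES (" + String.join(", ", vals) + ")";
    }
}
```

For example, dropping “user_options” from an INSERT into “user” removes both the column name and its corresponding value, which is exactly the consistency-restoring action described for this SMO.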
  17. Going back to our scenario, this is an example of the output: (1) the introduction of three new attributes in the “categorylinks” table, namely cl_type, cl_sortkey_prefix and cl_collation; accordingly, the adapter modifies SQL insert/update statements so that new 'Not Null' columns are initialized with their default values; (2) the deletion of tables “math” and “trackback”, which causes the affected printSQL statements to be left as comments; (3) the deletion of column “user_options” in the “user” table; consequently, the affected printSQL statements output the SQL without the affected column. In addition, a comment is introduced to note this fact (lines 8-13 below).
  18. Note that there are two roles. On the one hand, adapter producers are those who implement the adapter for a specific platform; they have to implement both the injector and the adapter. On the other hand, consumers are transformation developers who simply use the adapter.
  19. An evaluation was conducted comparing the performance of the approach with manual adaptation. The experiment was carried out by 8 PhD students, from the point of view of the consumer. There were two groups: one had to do the adaptation manually and the other used the adapter. - Manual: participants had to check the MediaWiki website, navigate through the hyperlinks, and collect those changes that might impact the code. The experiment yielded an average of 38' for D_Mediawiki. We found this to be the most cumbersome task, but it will depend on the scenario. Next, the designer peers at the code, updates it, and checks the generated code; on average, this accounts for 4' for a single update (i.e. PBR). - Assisted: participants conducted two tasks: (1) configuration of the batch that launches the assisted adaptation, and (2) verification of the generated SQL script (some developers check what the transformation has generated and others do not). To compute the profitability of the approach for another platform, we suggest applying platform-specific constant values (D, P, C, V) to the cost equations: D, the time estimated for Detecting whether a new release impacts the transformation (this very much depends on the documentation available); P, the time needed to Propagate a single change to the MOFScript code; #Impacts, the number of instructions in the transformation Impacted by the upgrade; C, the time needed to Configure the batch; and V, the time needed to Verify that a single automatically adapted instruction is correct and to alter it, if applicable.
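The cost equations can be sketched as follows; the equation shapes (detect or configure once, then propagate or verify per impacted instruction) are a plausible reading of the constants described above, not the paper's verbatim model:

```java
// Sketch of the evaluation's cost model. Times are in minutes; the constants
// D, P, C, V and #Impacts are those defined above. The exact formulas here
// are an assumption based on their descriptions.
public class CostModel {

    // Manual adaptation: detect the impact once (D), then propagate each
    // impacted instruction by hand (P per impact).
    static double manual(double d, double p, int impacts) {
        return d + p * impacts;
    }

    // Assisted adaptation: configure the batch once (C), then verify each
    // automatically adapted instruction (V per impact).
    static double assisted(double c, double v, int impacts) {
        return c + v * impacts;
    }
}
```

With the reported averages (D ≈ 38', P ≈ 4' per update), a release with five impacted instructions would cost around 58' manually; the assisted cost grows far more slowly with #Impacts, since verification is cheaper than manual propagation.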
  20. The cost reduction rests on the existence of an infrastructure, namely the adapter and the batch. The adapter is domain-agnostic and hence can be reused in other domains. On these grounds, we do not consider the adapter part of the development effort. As said, the evaluation is from the point of view of the consumer. However, there is a cost of familiarizing oneself with the tool, which includes the configuration of the batch (e.g. DB settings, file paths and the like) and, above all, the learning time. We estimate this accounts for 120' (reflected as the upfront investment of the assisted approach in the figure). On these grounds, the breakeven is reached after the third release.
  21. After some evolution iterations, the developer may decide to transfer the changes made by the adapter into the transformation itself. How can the adaptations made in the code be dumped into the transformation? We propose a semi-automatic solution: an impact analysis of the platform changes, so that the developer can adapt the transformation. In the last step of the process (3), apart from adapting the generated code, a record of the changes made is created, in case the developer wants to update the transformation itself; it serves as an aid. This record contains the platform change, the transformation position affected by it (line and column) and the new statement. In order to perform this impact analysis, the line and column of each print in the transformation first need to be added as parameters to the prints. This is done automatically using a Higher-Order Transformation. [Upload code and screencast of this]
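The change record could look like the following; a minimal sketch in which the field names are illustrative, but the three pieces of information are the ones listed above (platform change, affected position, new statement):

```java
// Sketch of one entry of the change record that supports the impact
// analysis. Field and class names are illustrative, not the tool's API.
public class AdaptationRecord {
    final String platformChange;  // e.g. "DROP COLUMN user.user_options"
    final int line, column;       // position of the print in the M2T code,
                                  // injected by the Higher-Order Transformation
    final String newStatement;    // the statement after adaptation

    AdaptationRecord(String change, int line, int column, String stmt) {
        this.platformChange = change;
        this.line = line;
        this.column = column;
        this.newStatement = stmt;
    }

    @Override public String toString() {
        return platformChange + " @ " + line + ":" + column
             + " -> " + newStatement;
    }
}
```

Reading such records, the developer can jump to the exact print in the MOFScript code and decide whether to fold the adaptation into the transformation itself.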
  22. - We advocate a preventive approach, where the transformation is engineered for schema instability. In this sense, a mechanism has been presented to adapt generated code to platform evolution in a Forward Engineering scenario. - It has been applied to a specific case study (MediaWiki and WikiWhirl). - The suitability of the approach boils down to two main factors: the DB schema instability and the transformation coupling.
  23. - The main issue is that an adaptability technique has been proposed and tried on a single platform. The question that arises is whether it could be used with other platforms, for instance API evolution, XML configuration files and so on. I am more given to synthetic thinking than to analytic thinking, so I start from an example and then try to abstract. - Related to the previous issue: for the producer role, a generic methodology is needed that defines the steps required to develop an adapter for any domain. - The evaluation has limitations: the number of participants is smaller than recommended, and there are few DB iterations. Regarding the number of participants (8), I could not find more people with the required knowledge.
  24. Thank you for listening. Now it's question time; I give the floor to you.
  26. More information on this traceability model and its visualization will be given at ICMT next month.