In Model-Driven Engineering bidirectionality in transformations is regarded as a key mechanism. Recent approaches to non-deterministic transformations have been proposed for dealing with non-bijectivity. Among them, the JTL language is based on a relational model transformation engine which restores consistency by returning all admissible models. This can be regarded as an uncertainty-reducing process: the unknown uncertainty at design time is translated into known uncertainty at run time by generating multiple choices. Unfortunately, small changes in a model usually correspond to a combinatorial explosion of the solution space. In this paper, we propose to represent the multiple solutions in an intensional manner by adopting a model for uncertainty. The technique is applied to JTL, demonstrating the advantages of the proposal.
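The intensional idea can be sketched in a few lines (a hedged illustration with invented structures; JTL's actual uncertainty model is richer): one base model plus independent variation points stands for the whole solution space, and the extensional set of admissible models is expanded only on demand.

```python
# Toy sketch (names and structures invented): an intensional
# representation of many admissible models as one base model plus
# independent variation points.
from itertools import product

base = {"name": "Person", "attrs": ["id"]}
variation_points = {                       # independent open choices
    "id_type": ["int", "string"],
    "extra_attr": [None, "email", "phone"],
}

def expand(base, vps):
    """Materialize the extensional solution set from the intensional one."""
    keys = list(vps)
    for choice in product(*(vps[k] for k in keys)):
        yield {**base, **dict(zip(keys, choice))}

models = list(expand(base, variation_points))
print(len(models))  # 2 * 3 = 6 models from one compact representation
```

Here two variation points compactly stand for six admissible models; each further variation point multiplies the extensional set without growing the intensional representation.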
Supporting Users to Manage Breaking and Unresolvable Changes in Coupled Evolu... (Alfonso Pierantonio)
In Model-Driven Engineering (MDE) metamodels play a key role since they underpin the specification of different kinds of modeling artifacts and the development of a wide range of model management tools. Consequently, when a metamodel is changed, modelers and developers have to deal with the induced coupled evolutions, i.e., adapting all those artifacts that might have been affected by the operated metamodel changes. Over the last years, several approaches have been proposed to deal with the coupled evolution problem, even though the treatment of changes is still a time-consuming and error-prone activity. In this paper we propose an approach supporting users during the adaptation steps that cannot be fully automated. The approach has been implemented by extending the EMFMigrate language and by exploiting the user input facility of the Epsilon Object Language. The approach has been applied to cope with the coupled evolution of metamodels and model-to-text transformations.
Nonlinear Programming: Theories and Algorithms of Some Unconstrained Optimiza... (Dr. Amarjeet Singh)
The nonlinear programming problem (NPP) has become an important branch of operations research; it is mathematical programming in which the objective function or the constraints are nonlinear. A variety of traditional methods exist for solving nonlinear programming problems, such as the bisection method, the gradient projection method, the penalty function method, the feasible direction method, and the multiplier method. These methods, however, have specific scopes and limitations: the objective function and constraints are generally required to be continuous and differentiable, and the traditional techniques become difficult to apply as the optimized object grows more complicated. In this paper, mathematical programming techniques commonly used to extremize nonlinear functions of single and multiple (n) design variables subject to no constraints are used to address this challenge. Although most structural optimization problems involve constraints that bound the design space, the study of unconstrained optimization methods is important for several reasons. The Steepest Descent and Newton's methods are employed in this paper to solve an optimization problem.
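As a hedged illustration of the two methods named above (the test functions and step sizes are invented for this sketch, not taken from the paper):

```python
# Sketch: steepest descent and Newton's method on simple invented
# test functions with known minima.

def steepest_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10000):
    """Minimize by stepping against the gradient with a fixed step size."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

def newton_1d(df, d2f, x0, tol=1e-10, max_iter=100):
    """Newton's method in one variable: x <- x - f'(x)/f''(x)."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x, y) = (x - 3)^2 + 2*(y + 1)^2, minimum at (3, -1)
grad = lambda v: [2 * (v[0] - 3), 4 * (v[1] + 1)]
print(steepest_descent(grad, [0.0, 0.0]))  # approx [3.0, -1.0]

# g(x) = (x - 2)^2, minimum at x = 2; Newton converges in one step
print(newton_1d(lambda x: 2 * (x - 2), lambda x: 2.0, 10.0))  # 2.0
```

On a quadratic, Newton's method lands on the minimum in a single step because the quadratic model it builds is exact, while steepest descent approaches it geometrically.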
This document provides an introduction to design patterns. It begins by explaining what design patterns are, their benefits, and common elements of design patterns like name, problem, solution, and consequences. It then discusses different types of design patterns classified by purpose (creational, structural, behavioral) and scope (class, object). An example of applying the strategy pattern to a duck simulation is used to illustrate how patterns can solve common object-oriented design problems by separating variable aspects from those that remain the same. The document advocates programming to interfaces rather than implementations to avoid tight coupling and allow independent extension of behavior.
This document discusses design patterns, including their definition, elements, and relationship to frameworks. It provides an overview of design patterns, describing them as general and reusable solutions to common software problems. The document outlines the core elements of patterns, such as their name, problem, context, solution, and examples. It also discusses different categories of patterns, such as creational, structural, and behavioral patterns.
The document discusses the GRASP (General Responsibility Assignment Software Principles) patterns and principles for assigning responsibilities in object-oriented design. It defines GRASP as helping to clearly outline which objects are responsible for which actions. There are nine GRASP principles covered: Creator, Controller, Information Expert, Low Coupling, High Cohesion, Indirection, Polymorphism, Protected Variations, and Pure Fabrication. These principles provide guidelines for assigning responsibilities to classes to achieve well-structured and maintainable code. The document then explains each principle in more detail using a chess game as an example domain.
I explore ways to combine complex network science with the Rubin model of causal inference. In broad strokes, I discuss the difference between exogenous shocks and endogenous processes, and how granularity in time can be used to tease causality out of a complex system.
A constraint programming based approach to... (abirISECS)
This paper proposes using constraint programming to check for inconsistencies in ontologies. Constraint programming allows constraints to be expressed across different domains and uses existing techniques to process constraints. The paper translates common ontology axioms into constraint satisfaction problems that can then be solved using constraint programming techniques like filtering to reduce the search space and find consistent solutions. Evaluating ontologies for inconsistencies is important as they evolve over time, and constraint programming provides an effective way to formally check complex ontologies with user-defined constraints.
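A minimal sketch of the translation idea (the class names, individual, and Boolean encoding are invented for illustration; the paper's encoding and filtering techniques are more sophisticated): class memberships become Boolean variables, axioms and assertions become constraints, and the fragment is inconsistent exactly when no assignment satisfies them all.

```python
# Toy encoding (invented names): an ontology fragment as a
# constraint satisfaction problem over Boolean membership variables.
from itertools import product

def first_solution(variables, constraints):
    """Enumerate Boolean assignments; return the first that satisfies
    every constraint, or None when the constraint set is unsatisfiable."""
    for values in product([True, False], repeat=len(variables)):
        assign = dict(zip(variables, values))
        if all(c(assign) for c in constraints):
            return assign
    return None

vars_ = ["felix_Cat", "felix_Dog"]
axioms = [
    lambda a: not (a["felix_Cat"] and a["felix_Dog"]),  # Disjoint(Cat, Dog)
    lambda a: a["felix_Cat"],                           # assertion Cat(felix)
]
print(first_solution(vars_, axioms))  # {'felix_Cat': True, 'felix_Dog': False}

# Adding the assertion Dog(felix) makes the fragment inconsistent:
axioms.append(lambda a: a["felix_Dog"])
print(first_solution(vars_, axioms))  # None
```

A real constraint solver would prune the search space with filtering instead of enumerating all assignments, but the consistency question it answers is the same.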
The document summarizes topics related to real-time software engineering including embedded system design, architectural patterns for real-time software, timing analysis, and real-time operating systems. It discusses key characteristics of embedded systems like responsiveness, the need to respond to stimuli within specified time constraints, and how real-time systems are often modeled as cooperating processes controlled by a real-time executive. The document also outlines common architectural patterns for real-time systems including observe and react, environmental control, and process pipeline.
The document discusses issues with bidirectionality in model transformations. It notes that while bidirectionality is relevant, it rarely produces anticipated benefits due to several factors: transformations are often non-deterministic, existing bidirectional languages introduce opaque semantics by enforcing consistency through unknown update policies, and developers lack control over the transformation behavior. The document proposes that bidirectional transformations should generate all possible results to manage uncertainty, define update policies at design time to give developers control, and allow manual selection when automatic policies cannot be determined.
1. Process mapping is a critical part of the Define and Measure pha... (abhi353063)
1. Process mapping is a critical part of the Define and Measure phases of continuous improvement projects. It provides a visual representation of how a process works which allows teams to identify areas for improvement.
2. Statistics and probability play an important role in continuous improvement projects through data analysis. They help establish models to determine the capabilities of processes and make informed decisions with limited data. Statistical evaluation is used to identify improvements based on sample data and probability distributions.
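One common statistical evaluation in such projects is a process capability index; a minimal sketch, with invented sample data:

```python
# Sketch (sample data invented): process capability index Cpk from
# measured widths against specification limits.
from statistics import mean, stdev

def cpk(samples, lsl, usl):
    """Cpk = min(USL - mean, mean - LSL) / (3 * sigma): how well the
    process fits inside its specification limits."""
    mu, sigma = mean(samples), stdev(samples)
    return min(usl - mu, mu - lsl) / (3 * sigma)

widths = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
print(round(cpk(widths, lsl=9.5, usl=10.5), 2))  # -> 1.27
```

A Cpk above roughly 1.33 is conventionally read as a capable process; here the sample suggests the process is close to, but not quite at, that threshold.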
"I don't trust AI": the role of explainability in responsible AIErika Agostinelli
The document discusses the importance of explainability in responsible AI. It outlines different types of explanations like global vs local and direct vs post-hoc explanations. It also describes who explanations are needed for, such as data scientists, end users, and regulators. Open-source explanation tools are presented, including AIX360 and What-If Tool. An example using AIX360 to explain a loan approval model with different techniques is described in detail.
Requirements compliance analysis - S. Vuotto (Università di Sassari), Sardegna Ricerche
Presentation by Simone Vuotto (Università di Sassari) at the meeting of 7 June 2019 on the PROSSIMO project (PROgettazione, Sviluppo e ottimizzazione di Sistemi Intelligenti MultiOggetto).
The document discusses Tom Brimeyer's Hypothyroidism Revolution program, which is a comprehensive guide for reversing hypothyroidism naturally and permanently in three phases. The first phase focuses on eliminating food sensitivities and toxins. The second phase introduces a thyroid-supporting diet. The third phase incorporates a healthy lifestyle including special exercises. The program contains over 160 pages explaining the three phases in detail. It aims to help sufferers of hypothyroidism achieve optimal health through natural means.
This document summarizes a research study on complex problem solving using a computer simulation called Syntex. The study involved 54 students from Zhejiang University divided into 18 groups. The groups made management decisions for a simulated company each month. The researchers found differences between the "best" and "worst" groups based on company performance. Best groups increased capital and hired more employees over time compared to worst groups. The researchers also analyzed decision making processes and information gathering between the groups. They explored perspectives including the problem solving process, group interactions, and cross-cultural differences. The document discusses research design, validity, variables measured, and limitations of generalizing the results.
IntroductionThe TJF Company is an organization that is beginni.docx (mariuse18nolet)
Introduction
The TJF Company is an organization that is beginning to see a real need for Information Technology (IT) support. The unavailability of systems and delayed access to information has made it increasingly difficult for TJF employees to complete their work and support their valued customers. Because of decreased customer satisfaction levels, revenues are down.
Originally the company was able to perform with a small IT staff; however, due to company growth, the IT department has outsourced its Tier 1 helpdesk duties. These duties are limited to simple tasks such as resetting passwords, installing or updating programs, and general low-level assignments. The outsourcing was put in place to free up the time of the network engineers and system administrators for larger projects. Since instituting the change, there has been some feedback surrounding general dissatisfaction with the Tier 1 service. In order to properly gauge the level of satisfaction and to determine whether the company will continue to use the vendor, we need to ask a research question.
Research Question:
1. Is there a direct correlation between the satisfaction level of IT service the employees receive and their ability to properly support their customers?
The Variables in Question Are:
1. The quality of the assistance from IT to the employees;
2. The satisfaction level of external customers.
Satisfaction is an abstract quality, so it would need to be determined in a survey as Very Satisfied – Satisfied – Indifferent – Dissatisfied – Very Dissatisfied. Because the frequency of repeated calls is quantifiable, we will be able to look at the Help Desk tickets to determine the number of tickets called in over a period of time.
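Once the ordinal survey answers are coded numerically, the two variables can be correlated; a hedged sketch with invented data (the study itself would use the real survey responses and ticket counts):

```python
# Sketch (data invented): code the five-point satisfaction scale and
# correlate it with repeat-ticket counts per employee.
from statistics import mean

SCALE = {"Very Dissatisfied": 1, "Dissatisfied": 2, "Indifferent": 3,
         "Satisfied": 4, "Very Satisfied": 5}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

responses = ["Very Satisfied", "Satisfied", "Indifferent",
             "Dissatisfied", "Very Dissatisfied"]
tickets = [1, 2, 4, 6, 9]  # repeat calls per employee over the period
r = pearson([SCALE[s] for s in responses], tickets)
print(round(r, 2))  # strong negative correlation in this toy data
```

With ordinal survey data, a rank-based coefficient such as Spearman's would arguably be the more defensible choice; Pearson is shown here only because it is the simplest to compute by hand.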
Hypothesis Statements:
1. Is there a direct correlation between the satisfaction level of IT service the employees receive and their ability to properly support their customers?
H10: There is no direct correlation between the satisfaction level of IT service the employees receive and their ability to properly support their customers.
H1a: There is a direct correlation between the satisfaction level of IT service the employees receive and their ability to properly support their customers.
Business Research Project Part 1: Formulation of the Research Problem
With rising gas prices, consumers tend to conduct research when shopping for a new vehicle. Team B has been tasked with conducting research for Automotive Trends. The team is looking into the correlation between vehicle weight and highway fuel economy. The purpose of this paper is to review the dependent and independent variables and to come up with a hypothesis. Team B will generate a research question to get this started.
Research Question
When looking at the research question, ...
This document presents a quality issue analysis and alerting system developed for Lenovo to more proactively identify potential product issues 1-5 weeks in advance of their current reactive approach. The authors analyzed multiple data metrics provided by Lenovo and determined the Negative Comment Ratio (NCR) was best for modeling. NCR was converted to a weekly rate change and categorized. Transition probabilities between categories were determined and steady state probabilities calculated. Finally, a "Threat Level" scoring system from 1-10 was developed to track past rate changes over time and issue alerts when threat levels reached 8-10, successfully identifying issues for all products within the desired timeframes while limiting false positives.
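The transition-probability and steady-state step the summary describes can be sketched with power iteration (the matrix below is invented for illustration, not Lenovo's data):

```python
# Sketch (matrix invented): steady-state distribution of a Markov
# chain over NCR rate-change categories, via power iteration.

def steady_state(P, iters=1000):
    """Iterate pi <- pi P until the category distribution stops moving."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Rows: P(next category | current category) for Low / Medium / High.
P = [[0.8, 0.15, 0.05],
     [0.3, 0.5, 0.2],
     [0.1, 0.3, 0.6]]
pi = steady_state(P)
print([round(p, 3) for p in pi])  # long-run share of each category
```

A threat-level score like the one described could then compare an observed run of rate changes against these long-run shares to decide when a product's behavior is unusually bad.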
MiSE'14 @ ICSE'14: Uncertainty in Bidirectional Transformations (Alfonso Pierantonio)
This document discusses challenges with bidirectional transformations between models. It notes that while bidirectionality is important, existing approaches have not achieved anticipated benefits due to issues with non-determinism and unclear semantics. The document proposes handling uncertainty in bidirectional transformations by generating a model with uncertainty rather than a set of models. This represents the solution space and allows traversal. It extends the semantics of the Janus Transformation Language to directly generate the uncertainty model corresponding to the solution space. Managing uncertainty in this way is intended to help address the challenges with bidirectional transformations.
Here are the key points about hyperlactatemia in pediatric patients:
- Hyperlactatemia occurs when there is an imbalance between tissue oxygen supply and demand, leading to increased anaerobic glycolysis and lactate production.
- It is commonly seen in pediatric ICU patients, especially following surgery, trauma, or septic shock which cause multiple organ dysfunction.
- Higher lactate levels are associated with worse clinical outcomes and prognosis in critically ill children.
- The PRISM III score, which evaluates the risk of mortality in pediatric ICU patients, was calculated for the patients in this study.
- Treatment aims to support organ function, optimize tissue oxygen delivery, and address any underlying causes contributing to the hyperlactatemia.
What We Learned from Three Years of Sciencing the Crap Out of DevOps (SeniorStoryteller)
This document summarizes research from three years of studying DevOps practices. Some key findings include:
- Continuous delivery practices like reducing lead time and increasing release frequency are correlated with higher IT performance. However, tools like configuration management tools are not correlated.
- Ineffective testing practices include developers not creating tests or environments being difficult to reproduce. But having QA primarily create tests is not ineffective.
- While managing work-in-progress is thought to be important, the correlation between WIP and IT performance is actually negligible.
- DevOps culture and practices around information sharing and collaboration are valid constructs that are predictive of both IT and organizational performance. But data testing is needed to validate assumptions.
The document discusses panel data models and issues related to missing data and measurement errors. It provides a general specification for panel data models that includes time effects, individual effects, and time-varying and time-invariant explanatory variables. It also discusses assumptions required for different estimation methods and how to address endogeneity and heterogeneity. The document outlines challenges related to unbalanced panels, attrition, and missing data, distinguishing between missing at random and missing completely at random.
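One standard way to remove the time-invariant individual effects mentioned above is the within (fixed-effects) transformation; a minimal sketch with invented panel data:

```python
# Sketch (data invented): the within estimator demeans each
# individual's series, eliminating additive individual effects.
from statistics import mean

panel = {  # individual -> list of (x, y) observations over time
    "firm_a": [(1.0, 2.1), (2.0, 4.2), (3.0, 5.9)],
    "firm_b": [(1.0, 5.0), (2.0, 7.1), (3.0, 8.9)],
}

def within_estimator(panel):
    """Slope from pooled demeaned data: any additive individual
    effect cancels out before the regression."""
    num = den = 0.0
    for obs in panel.values():
        mx = mean(x for x, _ in obs)
        my = mean(y for _, y in obs)
        for x, y in obs:
            num += (x - mx) * (y - my)
            den += (x - mx) ** 2
    return num / den

print(round(within_estimator(panel), 2))
```

Note that demeaning also wipes out any time-invariant explanatory variable, which is exactly why the general specification in the summary distinguishes time-varying from time-invariant regressors.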
Towards a framework for making applications provenance aware: UML2PROV (Universidad de La Rioja)
The document presents a systematic review of provenance systems. It defines a taxonomy for characterizing provenance systems based on general aspects, provenance characteristics, and non-functional requirements. The taxonomy is used to analyze and compare 25 provenance systems. Key findings include identifying open research problems in provenance systems. The review aims to provide a unified understanding of provenance systems and motivate the definition of the UML2PROV framework to bridge application design and provenance design.
Running head CRITICAL THINKING IN PSYCHOLOGY 2 CRITICAL THI.docx (healdkathaleen)
Running head: CRITICAL THINKING IN PSYCHOLOGY
Dear Brittany,
Thank you for sending your work to The Writing Center. I have provided written feedback in the margins of the paper to highlight some specific areas for you to address. I have also provided feedback and direction through a video that you can access by clicking on the link below:
https://drive.google.com/file/d/1zUdojw7FxBWT0T-1F-MnOJ2jqteJ7VOx/view?usp=sharing
The link may take a few minutes to load, and, in the meantime, you will see a blank screen. Please wait for the video to appear.
Please see the Paper Review and QA in Cranium Cafe Guide for details about scheduled paper review.
Please note: You should be able to see the tutor's written comments in the right margin of your paper. If you cannot, please make sure your Microsoft Word Review settings are set to Final Show Markup. If you are still unable to see the marginal comments, please contact the Writing Center. If you have any questions or trouble accessing your feedback, please email me at [email protected]. You are also welcome to join us during Live Tutoring hours: https://campus2.purdueglobal.edu/page/writing-center
We want to hear from you! Please take the time to read the comments on your paper and watch your review. Then let us know what you think. Please click here (https://form.jotform.com/72916589616168) to complete the Paper Review Survey and provide your feedback.
Thank you for your time, and I wish you well with your writing!
Warmly,
Allen, tutor
Critical Thinking in Psychology
Brittany Salley
February 9, 2020
Critical Thinking in Psychology
Introduction
Introducing change in an organization is more often than not met with confrontation, especially if such a change comprises a redesign of the elementary work process and the introduction of advanced technology. The article scrutinizes the introduction of intricate software systems into standard work processes within a business.
Summary of the Article
Literature review.
Organizational retooling calls for cautious planning and communication to achieve effective execution. Risk acceptance at some level may provide an advantage to a business. Flexibility in an organization can be amplified when the management is enthusiastic to integrate lessons learned as part of the transformation management process. The majority of the participants lack a high level of technical knowledge (Long, 2010). The designers of social technology more often overlook the social knowledge fundamentals of change and fail to contemplate the degree of stakeholder learning related to complete comfort with the technology. The degree of learning mandatory for effective application of technology can serve as a transformation management guide. Advanced, technologically motivated transformation management and transformational restructuring necessitate equivalent changes in learning and organizational culture. Operative technology practice m ...
Target Reference Model for Enterprise Business Architecture. 1) Front-line Support, 2) Collaboration among Experts, 3) Ecosystems for Sharing Value (shared reality), 4) Efficiency in Coordination, 5) Flexible Structures.
The document presents a changeability evaluation model for object-oriented software. It begins with an introduction to changeability and its importance. It then reviews existing literature on measuring changeability. A relationship is established between changeability and object-oriented design properties like coupling, inheritance, and polymorphism. The paper then develops a changeability evaluation model using multiple linear regression. The model relates changeability as the dependent variable to design properties as independent variables. The model is validated using experimental tests on data from class diagrams, which show the model is highly significant.
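The regression step can be sketched with the normal equations (the metric values below are invented; the paper fits its model to measurements from real class diagrams):

```python
# Sketch (data invented): multiple linear regression of a
# changeability score on design metrics, via the normal equations
# X'X b = X'y solved with Gaussian elimination.

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * p for x, p in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit(X, y):
    """Least-squares coefficients b = (X'X)^-1 X'y."""
    Xt = list(zip(*X))
    XtX = [[sum(u * v for u, v in zip(r, c)) for c in Xt] for r in Xt]
    Xty = [sum(u * v for u, v in zip(r, y)) for r in Xt]
    return solve(XtX, Xty)

# rows: [1 (intercept), coupling, inheritance depth]
X = [[1, 2, 1], [1, 4, 2], [1, 6, 2], [1, 8, 3]]
y = [9.0, 7.0, 5.0, 3.0]  # changeability scores
print([round(b, 2) for b in fit(X, y)])  # recovers b ~ (11, -1, 0)
```

In this contrived data, changeability falls by one unit per unit of coupling and is unaffected by inheritance depth; a real study would also test the significance of each coefficient, as the paper's validation does.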
Open Access (OA) is a mechanism that allows for free and immediate access to research results and data. It aims to enhance global dissemination, reduce research duplication, and increase the use of scientific contributions in teaching programs, among others. However, a survey has revealed that many researchers lack adequate knowledge about OA and the transition to it. While making research products openly available is a great idea for communicating science and knowledge, shifting the costs from readers to authors induces risks that must be identified, understood, and analyzed. It is worth noting that OA does not eliminate publishing costs. The move to OA can lead to financial bias if publishers take advantage of the opportunity to publish more or engage in unethical practices. This could create an unequal playing field, where some researchers have an advantage over others due to their access to resources. The talk describes the scientific publishing market, the problems emerging from the current transition to OA, and potential countermeasures to mitigate the current difficulties.
Uncertainty and variability in industry-scale projects: Pearls, perils and p... (Alfonso Pierantonio)
The state of the art in software abstraction is model-driven engineering. It provides system architects with abstract representations of complex system functionality, complementary views of a given system (e.g., behavioral versus structural), and vertical refinement of high-level system requirements models into design models and eventually down to (automatically generated) executable code. However, the complexity caused by the many models used in large-scale projects may give rise to significant sources of uncertainty due to (implicit and explicit) dependencies, consistencies, and correlations among the modeling artifacts. Keeping such models consistent during the development process requires spelling out the change requirements that enforce well-thought-out change propagation and co-evolution plans.
In this talk, I will survey threats, challenges, and misconceptions that occurred in the context of an industry-scale project in the domain of computer-based interlocking systems. In particular, the different kinds of model relations required for managing several forms of (epistemic) uncertainty emerged in various scenarios, including roundtripping among modeling notations and several forms of co-evolution involving metamodels, models, and transformations. To this end, a megamodel is given to better characterize the identified solutions, which required devising specialized tools and notations for leveraging automation and translating uncertainty into variability models.
Similar to Managing Uncertainty in Bidirectional Model Transformations
The document discusses issues with bidirectionality in model transformations. It notes that while bidirectionality is relevant, it rarely produces anticipated benefits due to several factors: transformations are often non-deterministic, existing bidirectional languages introduce opaque semantics by enforcing consistency through unknown update policies, and developers lack control over the transformation behavior. The document proposes that bidirectional transformations should generate all possible results to manage uncertainty, define update policies at design time to give developers control, and allow manual selection when automatic policies cannot be determined.
MiSE'14 @ ICSE 2014: Uncertainty in Bidirectional Transformations (Alfonso Pierantonio)
This document discusses challenges with bidirectional transformations between models. It notes that while bidirectionality is important, existing approaches have not achieved anticipated benefits due to issues with non-determinism and unclear semantics. The document proposes handling uncertainty in bidirectional transformations by generating a model with uncertainty rather than a set of models. This represents the solution space and allows traversal. It extends the semantics of the Janus Transformation Language to directly generate the uncertainty model corresponding to the solution space. Managing uncertainty in this way is intended to help address the challenges with bidirectional transformations.
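The intensional representation described above can be sketched as follows (illustrative Python only, not JTL syntax; the element names are invented): a single model carries "uncertainty points" that each list their admissible alternatives, so the whole solution space is stored compactly, while expanding the points reproduces the extensional, combinatorial set of concrete models.

```python
from itertools import product

# One model with uncertainty points (lists of alternatives) stands in
# for the full set of admissible models.
uncertain_model = {
    "name": "Course",                     # certain element
    "visibility": ["public", "private"],  # uncertainty point
    "kind": ["class", "interface"],       # uncertainty point
}

def expand(model):
    """Enumerate the extensional solution space encoded intensionally."""
    points = {k: v for k, v in model.items() if isinstance(v, list)}
    fixed = {k: v for k, v in model.items() if not isinstance(v, list)}
    for combo in product(*points.values()):
        yield {**fixed, **dict(zip(points.keys(), combo))}

solutions = list(expand(uncertain_model))
print(len(solutions))  # 2 points with 2 options each -> 4 concrete models
```

The intensional model grows linearly with the number of uncertainty points, whereas the expanded set grows as their product, which is exactly the combinatorial explosion the approach avoids materializing.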
The document presents a changeability evaluation model for object-oriented software. It begins with an introduction to changeability and its importance. It then reviews existing literature on measuring changeability. A relationship is established between changeability and object-oriented design properties like coupling, inheritance, and polymorphism. The paper then develops a changeability evaluation model using multiple linear regression. The model relates changeability as the dependent variable to design properties as independent variables. The model is validated using experimental tests on data from class diagrams, which show the model is highly significant.
Open Access (OA) is a mechanism that allows for free and immediate access to research results and data. It aims to enhance global dissemination, reduce research duplication, and increase the use of scientific contributions in teaching programs, among others. However, a survey has revealed that many researchers need more adequate knowledge about OA and the transition to it. While making research products openly available is a great idea for communicating science and knowledge, shifting the costs from readers to authors induces risks that must be identified, understood, and analyzed. It is worth noting that OA does not eliminate publishing costs. The move to OA can lead to financial bias if publishers take advantage of the opportunity to publish more or engage in unethical practices. This could create an unequal playing field, where some researchers have an advantage over others due to their access to resources. The talk describes the scientific publishing market, the problems emerging from the current transition to OA, and potential countermeasures to mitigate the current difficulties.
Uncertainty and variability in industry-scale projects: Pearls, perils and p...Alfonso Pierantonio
The state-of-the-art in software abstraction is model-driven engineering. It provides system architects with abstract representations of complex system functionality, complementary views of a given system (e.g., behavioral versus structural), and vertical refinement of high-level system requirements models into design models and eventually down to (automatically-generated) executable code. However, the complexity caused by the many models used in large-scale projects might give place to significant sources of uncertainty due to (implicit and explicit) dependencies, consistencies, and correlations among the modeling artifacts. Keeping such models consistent during the development process requires spelling out the change requirements that enforce well-thought-out change propagation and co-evolution plans.
Alfonso Pierantonio presents a viewpoint-based approach to fixing limitations in traditional models of object classification. Traditional models use static, binary classification that cannot accommodate dynamic reclassification or overlapping categories. The proposed approach classifies objects based on their properties over time within different viewpoints, allowing dynamic classification as an object's state changes. Viewpoints can overlap and transients can be modeled with fuzzy logic. This brings classification in line with objects' natural changes and addresses issues like dynamic reclassification and in-between categories.
Starting a career in research is one of the most uncertain professional ambitions in modern societies. Besides the technical obstacle of becoming a world-class expert in a specific topic (you have to!), it presents a diversity of daunting psycho-social difficulties that might be conducive to harmful consequences. The talk is informal in nature and reflects the speaker’s experience (as a computer scientist) at the beginning of his career and later as a mentor of students and postdocs. Besides the expected definitions of what research is or should be, it discusses how students often tend to adopt the irrational idea of having ‘perfect reasoning.’ It will also consider empiricism, as a democratic tool for entering research, and language as a barrier for those who do not speak English as a first language. The final remark will be about ‘silence’ as a beneficial or pathological aspect of both researchers and mentors.
Adoption of MDE technologies (and techniques) could be discussed within the context of existing technology acceptance models (TAMs). For instance, Davis’ basic TAM model [4] emphasizes (perceived) usefulness and ease of use. While these factors are clearly relevant, we aim at a more refined view by paying special attention to how MDE, at this stage, is driven by research and university teaching. That is, we describe the challenge of improving chances of MDE adoption (i.e., improved ‘adoptability’) in terms of maturing three legs of an ‘adoption chair’: i) reproducibility of research results; ii) reusability of essential technologies; iii) teachability of the underlying techniques.
Keynote at Educators Symposium, ACM/IEEE 19th Intl. Conference on Model Drive...Alfonso Pierantonio
This document discusses teaching modeling and model-driven engineering (MDE). It describes how modeling and MDE concepts are taught at the University of L'Aquila, including various courses covering topics like software engineering, architecture, and MDE. It notes challenges in teaching abstraction and automation skills to students. The document also expresses some disillusionment with MDE, noting that code generation is not widely adopted in practice due to high costs and skills required. To investigate this, the author queries a database of EU projects to analyze how many involved MDE concepts.
This document provides an overview of model management techniques investigated by the author, including coupled evolution, semantic issues in bidirectional model transformations, and the MDE Forge collaborative modeling platform. It introduces model-driven engineering (MDE) and discusses how models provide abstraction and can be automated through model transformations to perform complex tasks like incremental changes and traceability management. The challenges of metamodel and model co-evolution are described, as well as approaches to manage changes across an entire MDE ecosystem. Uncertainty in bidirectional model transformations is also covered.
Automated chaining of model transformations with incompatible metamodelsAlfonso Pierantonio
The document discusses automating the chaining of model transformations between incompatible metamodels. It proposes enhancing composability by using co-evolution techniques to determine when metamodels that are technically incompatible may still allow transformations to be chained. When the incompatibility is due to resolvable changes between metamodel versions, an adapter transformation can be automatically generated to migrate models between the metamodels. This approach allows more transformations in a repository to be discoverable and composable when chaining is needed between seemingly incompatible metamodels.
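A toy sketch of the idea (all structures invented for illustration): two transformations chain only if the first one's output metamodel matches the second one's input metamodel; when the mismatch stems from resolvable changes between metamodel versions, such as known renames, an adapter transformation can be derived to migrate models in between.

```python
# Assumed known from co-evolution analysis of the two metamodel versions:
# the old metamodel's "Clazz" was renamed to "Class".
resolvable_renames = {"Clazz": "Class"}

def adapter(model):
    """Automatically derived migration: apply the resolvable renames."""
    return {resolvable_renames.get(k, k): v for k, v in model.items()}

def chain(t1, t2, model):
    """Chain two transformations across the metamodel mismatch."""
    return t2(adapter(t1(model)))

t1 = lambda m: {"Clazz": m["name"]}        # emits the old metamodel
t2 = lambda m: f"class {m['Class']} {{}}"  # expects the new metamodel
print(chain(t1, t2, {"name": "Person"}))
```

Without the derived adapter, `t2` would fail on `t1`'s output even though the two metamodels differ only by a rename; making that resolvability explicit is what lets seemingly incompatible transformations become composable.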
Evolutionary Togetherness: How to Manage Coupled Evolution in Metamodeling Ec...Alfonso Pierantonio
The document discusses model-driven engineering and metamodeling ecosystems. It notes that in MDE, metamodels are cornerstones that define related artifacts like models, transformations, and editors. When a metamodel changes, it can invalidate these other artifacts in the ecosystem. The document examines challenges in co-evolving all artifacts when a metamodel changes, such as manually adapting models which is tedious and error-prone. It proposes that an ecosystem needs infrastructure to consistently co-evolve artifacts, such as by defining relationships between elements and detecting change impacts to determine necessary adaptations. A megamodel is proposed as a way to formally specify an ecosystem and the dependencies between its elements.
Managing the evolution of F/OSS with Model Driven TechniquesAlfonso Pierantonio
The document discusses challenges in managing the evolution of complex free and open source software (FOSS) distributions, which can comprise thousands of interdependent packages. It proposes addressing these challenges through a model-driven approach involving the use of models and simulations to represent FOSS distributions, analyze package dependencies and scripts, and predict potential failures from package upgrades before deployment. A demonstration is provided of tools developed to harvest information from a Linux system into models and simulate package upgrades to detect errors.
This document discusses managing co-evolution in model-driven engineering (MDE). It observes that metamodels are living entities that change over time, and that when metamodels change, related artifacts like models, transformations, and generic tools must be adapted to remain valid. It identifies the different relations between metamodels and artifacts that can be affected by metamodel changes, such as the conformance of models to metamodels, and classifies the types of changes and adaptations that may be needed. The document then introduces EMFMigrate, a programmatic approach using a domain-specific language to specify migration strategies as rules that adapt affected artifacts in a consistent, reusable way when metamodels change.
This document discusses model differencing, which is the ability to detect and represent changes between versions of a model. It begins by outlining the key challenges of model differencing and proposes decomposing the problem into calculation of differences, representation of differences, and applications of differences. It then examines approaches for representing differences, such as edit scripts and coloring, and proposes a difference metamodel for abstractly representing differences. The document concludes by discussing how difference models can be used for model patching and composition of differences.
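The decomposition described above can be sketched in a few lines (a minimal invented representation, not the actual difference metamodel): a difference model records added, deleted, and changed elements between two model versions, and can later be applied as a patch to the initial version.

```python
def diff(initial, final):
    """Calculate a difference model between two model versions,
    each represented here as a flat name->value mapping."""
    return {
        "added":   {k: final[k] for k in final.keys() - initial.keys()},
        "deleted": set(initial.keys() - final.keys()),
        "changed": {k: final[k] for k in initial.keys() & final.keys()
                    if initial[k] != final[k]},
    }

def patch(model, delta):
    """Apply a difference model to reconstruct the final version."""
    out = {k: v for k, v in model.items() if k not in delta["deleted"]}
    out.update(delta["changed"])
    out.update(delta["added"])
    return out

v1 = {"Person": "class", "age": "int"}
v2 = {"Person": "class", "age": "float", "name": "string"}
assert patch(v1, diff(v1, v2)) == v2  # patching v1 yields v2
```

Separating calculation (`diff`) from representation (the delta mapping) and application (`patch`) mirrors the decomposition the document proposes, and is what makes uses like model patching and composition of differences possible.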
Evolution in the Large and in the Small in Model-Driven DevelopmentAlfonso Pierantonio
Model Driven Engineering (MDE) is increasingly gaining acceptance in the development of software systems as a means to leverage abstraction and render business logic resilient to technological changes. Coordinated collections of models and modeling languages are used to describe applications on different abstraction levels and from different perspectives. In general, neither models nor metamodels are preserved from the evolutionary pressure which inevitably affects almost any artifact, possibly causing a cascade of adaptations which severely affects the modeling languages or the model population.
This talk analyzes the different kinds of co-adaptations which are required, distinguishing between co-evolution in the large and in the small. In particular, the coupling between models and metamodels implies that when a metamodel undergoes a modification, the conforming models must be co-adapted accordingly. Analogously, whenever a new version of a model is produced, the generated application may require an explicit adaptation of the generated artifacts, especially when specific assets are not directly reflected by the models and transformations, as for instance when dealing with serialized objects or with page content which is persistently stored in a database.
among stars.
Aims. In this paper we present the Extended Westerlund 1 and 2 Open Clusters Survey (EWOCS) project, which aims to investigate
the influence of the starburst environment on the formation of stars and planets, and on the evolution of both low and high mass stars.
The primary targets of this project are Westerlund 1 and 2, the closest supermassive star clusters to the Sun.
Methods. The project is based primarily on recent observations conducted with the Chandra and JWST observatories. Specifically,
the Chandra survey of Westerlund 1 consists of 36 new ACIS-I observations, nearly co-pointed, for a total exposure time of 1 Msec.
Additionally, we included 8 archival Chandra/ACIS-S observations. This paper presents the resulting catalog of X-ray sources within
and around Westerlund 1. Sources were detected by combining various existing methods, and photon extraction and source validation
were carried out using the ACIS-Extract software.
Results. The EWOCS X-ray catalog comprises 5963 validated sources out of the 9420 initially provided to ACIS-Extract, reaching a
photon flux threshold of approximately 2 × 10−8 photons cm−2
s
−1
. The X-ray sources exhibit a highly concentrated spatial distribution,
with 1075 sources located within the central 1 arcmin. We have successfully detected X-ray emissions from 126 out of the 166 known
massive stars of the cluster, and we have collected over 71 000 photons from the magnetar CXO J164710.20-455217.
Managing Uncertainty in Bidirectional Model Transformations
1. Managing Uncertainty in Bidirectional Model Transformations
Alfonso Pierantonio
Dipartimento di Ingegneria e Scienze dell'Informazione e Matematica
Università degli Studi dell'Aquila
Joint work with Romina Eramo and Gianni Rosa
2. Alfonso Pierantonio – SLE 2015, Pittsburgh (USA)
Bidirectionality
Bidirectionality is about keeping a set of related
models synchronized or in a consistent state:
updates to a source entail updates to the others.
3.
Bidirectionality
Bidirectionality is about keeping a set of related
models synchronized or in a consistent state:
updates to a source entail updates to the others.
Many semantic assumptions and misconceptions can
be explored by comparing
synchronization and consistency management
4.
Bidirectionality
Bidirectionality is about keeping a set of related
models synchronized or in a consistent state:
updates to a source entail updates to the others.
Many semantic assumptions and misconceptions can
be explored by comparing
synchronization and consistency management
Can we really use these terms interchangeably? Do they represent the same concept?
5. As an example, let us consider
Hierarchical State Machines
(HSM)
6. [Figure — Model m1: HSM. States: Out of Service, Off, Idle, Self Test, Maintenance, Active, Authentication, Selecting transaction, Transaction. Transitions: startup, shutdown, failed, done, in(card), cancel, error, fixed.]
As an example, let us consider
Hierarchical State Machines
(HSM)
7. As an example, let us consider
Hierarchical State Machines
(HSM)
and State Machines (SM)
8. As an example, let us consider
Hierarchical State Machines
(HSM)
and State Machines (SM)
[Figure — Model m2: SM. States: Out of Service, Off, Idle, Active. Transitions: in(card), shutdown, cancel, fixed, startup, done, error.]
9. As an example, let us consider
Hierarchical State Machines
(HSM)
and State Machines (SM)
A possible consistency relationship
between them is the following
10. [Figure — models m1 (HSM) and m2 (SM) side by side, related by the consistency relationship.]
11.
A model for bidirectional transformation
Let M and N be two metamodels; a relation R ⊆ M × N can be defined by means of the following directional mappings:
→R : M × N → N
←R : M × N → M
Stevens, Perdita. "Bidirectional model transformations
in QVT: semantic issues and open questions." Software
& Systems Modeling 9.1 (2010): 7-20.
12.
A model for bidirectional transformation
Correctness
The transformations →R and ←R must enforce the relation R, i.e., they are said to be correct if
R(m, →R(m, n)) and R(←R(m, n), n)
Hippocraticness
In case the models are already consistent, the following must hold:
R(m, n) implies →R(m, n) = n and ←R(m, n) = m
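These two laws are easy to state executably. The following is an illustrative Python sketch, not code from the talk: R, fwd, and bwd are hypothetical toy mappings standing in for the relation and the directional transformations.

```python
# Illustrative sketch of Stevens' laws; R, fwd, bwd are hypothetical
# toy mappings, not part of any bx framework.

def correct(R, fwd, bwd, m, n):
    """Correctness: both restored pairs must satisfy R."""
    return R(m, fwd(m, n)) and R(bwd(m, n), n)

def hippocratic(R, fwd, bwd, m, n):
    """Hippocraticness: already-consistent models are left untouched."""
    if not R(m, n):
        return True  # the law only constrains consistent pairs
    return fwd(m, n) == n and bwd(m, n) == m

# Toy consistency relation: the target must be the double of the source.
R = lambda m, n: n == 2 * m
fwd = lambda m, n: 2 * m      # restore the target from the source
bwd = lambda m, n: n // 2     # restore the source from the target

assert correct(R, fwd, bwd, 3, 10)      # fwd yields 6, bwd yields 5
assert hippocratic(R, fwd, bwd, 3, 6)   # (3, 6) is consistent and untouched
```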
13. Let us come back to our
State Machine example …
14. [Figure — Model m1 (HSM).]
15. [Figure — Model m1 (HSM), repeated.]
16. [Figure sequence, slides 16–21 — the forward transformation step by step: starting from m1 (HSM), the target m2 (SM) is progressively built up with states Out of Service, Off, Idle, Active and transitions in(card), shutdown, cancel, error, fixed, startup, done.]
22. [Figure, slides 22–23 — m1 (HSM) side by side with the resulting m2 (SM).]
24. [Figure — m1 (HSM) and m2 (SM) after a manual change to the target model.]
How can this manual change be propagated back?
25. [Figure — m1 (HSM) and the manually changed m2 (SM).]
The update entails
multiple choices
27.
Why non-determinism?
The transformation is non-deterministic because
1) The two metamodels are not isomorphic
2) Multiple update strategies are possible and the designer did not provide a general consistency-restoration strategy at design-time
Unknown uncertainty: the designer does not have enough information to decide a priori which solution is the "wanted" one
Zan, Tao, Hugo Pacheco, and Zhenjiang Hu. "Writing
bidirectional model transformations as intentional
updates." Procs. 36th International Conference on
Software Engineering, 2014.
28. [Figure — m1 (HSM) and the manually changed m2 (SM).]
An update strategy (among
the valid ones) can make
the transformation
deterministic
30.
Explicit management of non-determinism
While determinism is a desirable quality
– There is no way to statically detect whether a bidirectional transformation is deterministic
– Languages like QVT Relational consider only one strategy out of the many possible alternatives
Developers have little or no control: a main reason why bidirectionality has not lived up to its promise
An explicit management of non-determinism in model
transformations has been investigated
31.
Languages with non-determinism
Recently a number of languages have been proposed
with an explicit management of non-determinism:
[1] A. Cicchetti, D. Di Ruscio, R. Eramo, and A. Pierantonio. JTL: a
bidirectional and change propagating transformation language. In SLE10, pages
183–202, 2010.
[2] N. Macedo and A. Cunha. Implementing QVT-R Bidirectional Model
Transformations Using Alloy. In FASE, pages 297–311, 2013.
[3] G. Callow and R. Kalawsky. A Satisficing Bi-Directional Model Transformation Engine using Mixed Integer Linear Programming. JOT, 12(1):1–43, 2013.
32.
JTL
The JTL language provides explicit support for non-determinism, intended as
“. . . programmers need only specify a consistency relation,
allowing the bx engine to resolve the under-specification
non-deterministically” [1]
All valid solutions are generated at once
[1] F. Abou-Saleh, J. Cheney, J. Gibbons, J. McKinna,
and P. Stevens. Notions of Bidirectional Computation
and Entangled State Monads. MPC, 187–214, 2015.
33.
A model for non-deterministic transformations
Let M and N be two metamodels; the relation R ⊆ M × N can then be defined through the directional mappings
→R : M × N → 2^N
←R : M × N → 2^M
where →R and ←R are multivalued functions, i.e., left-total relations in which inputs may be associated with multiple outputs.
34.
A model for non-deterministic transformations
Correctness
The transformations →R and ←R must enforce the relation R, i.e., they are correct if
R(m, n′) for every n′ ∈ →R(m, n), and R(m′, n) for every m′ ∈ ←R(m, n)
Hippocraticness
In case the models are already consistent, the following must hold:
R(m, n) implies n ∈ →R(m, n) and m ∈ ←R(m, n)
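The multivalued variant can be sketched the same way. Again a hypothetical Python illustration (R, fwd, and bwd are toy stand-ins): fwd and bwd now return the set of all admissible restored models, and the laws quantify over every element of that set.

```python
# Illustrative sketch of the non-deterministic laws; fwd and bwd return
# SETS of admissible models (hypothetical toy mappings).

def correct(R, fwd, bwd, m, n):
    """Every restored model must satisfy R."""
    return (all(R(m, n2) for n2 in fwd(m, n)) and
            all(R(m2, n) for m2 in bwd(m, n)))

def hippocratic(R, fwd, bwd, m, n):
    """A consistent pair must be among the returned solutions."""
    return (not R(m, n)) or (n in fwd(m, n) and m in bwd(m, n))

# Toy consistency relation: the target must be a multiple of the source.
R = lambda m, n: m != 0 and n % m == 0
fwd = lambda m, n: {m * k for k in range(1, 5)}              # several targets
bwd = lambda m, n: {d for d in range(1, n + 1) if n % d == 0}  # all divisors

assert correct(R, fwd, bwd, 3, 7)       # every returned model satisfies R
assert hippocratic(R, fwd, bwd, 3, 6)   # 6 in fwd(3, 6), 3 in bwd(3, 6)
```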
36. [Figure — backward propagation: the error transition admits three alternative placements in m1.]
#error = 3
37. [Figure — backward propagation: the added Printing state, its print transitions, and done admit multiple placements in m1.]
#error = 3
#done = 1
#print = 4
38. [Figure — backward propagation: the completed transitions likewise admit multiple placements in m1.]
#error = 3
#done = 1
#print = 4
#completed = 4
39.
Little changes, big impact
Little changes in one of the models cause a combinatorial explosion of alternatives in the other one
For instance, the overall number of models in the State Machine example is
3 x 4 x 4 x 1 = 48
Dealing with the myriad of generated models is intrinsically difficult, if not impractical
#error = 3
#done = 1
#print = 4
#completed = 4
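The arithmetic behind the explosion is simply the Cartesian product of the independent choices. A small Python sketch (the choice counts are the ones on the slide; the enumeration at the end is exactly what an intensional representation avoids):

```python
# The solution space is the Cartesian product of the independent choices,
# so its size is the product of the per-choice counts (figures from the slide).
from itertools import product
from math import prod

choices = {"error": 3, "print": 4, "completed": 4, "done": 1}

assert prod(choices.values()) == 48   # 3 x 4 x 4 x 1 = 48

# The extensional enumeration we would like to avoid materializing:
space = list(product(*(range(k) for k in choices.values())))
assert len(space) == 48
```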
40.
Intensional representation of the solution space
We need a construction that permits an intensional representation of the solution space
Therefore, instead of generating a multitude of models, we want to represent the solution space of a non-deterministic transformation with a single model with uncertainty.
R. Eramo, A. Pierantonio, and G. Rosa. Uncertainty in bidirectional transformations. In Procs. of MiSE 2014, 2014.
41.
Models with Uncertainty
Given a metamodel M it is possible to derive its
metamodel with uncertainty U(M) by means of an
automated construction.
A model with uncertainty is used to intensionally
represent all the alternative models
generated by a non-deterministic transformation.
R. Eramo, A. Pierantonio, and G. Rosa. Uncertainty in bidirectional transformations. In Procs. of MiSE 2014, 2014.
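The idea can be sketched as a tiny data structure. A hypothetical Python illustration (UModel, shared, and points are invented names, not JTL's API): a model with uncertainty stores the part common to all solutions plus one list of alternatives per uncertainty point, and can be expanded extensionally on demand.

```python
# Hypothetical sketch of a model with uncertainty (invented names, not
# JTL's API): the shared part plus one set of alternatives per
# uncertainty point, expandable into concrete models on demand.
from dataclasses import dataclass, field
from itertools import product

@dataclass
class UModel:
    shared: dict = field(default_factory=dict)   # elements common to all solutions
    points: dict = field(default_factory=dict)   # uncertainty point -> alternatives

    def expand(self):
        """Extensionally enumerate every concrete model represented."""
        names = list(self.points)
        for combo in product(*(self.points[n] for n in names)):
            yield {**self.shared, **dict(zip(names, combo))}

u = UModel(shared={"states": ("Off", "Idle", "Active")},
           points={"error_target": ["Off", "Idle", "Active"],
                   "print_source": ["A", "B", "C", "D"]})

assert sum(1 for _ in u.expand()) == 12   # 3 x 4 concrete models
```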
43.
A model for transformations with uncertainty
The directional mappings →R and ←R can be reformulated as
→R̃ : M × N → U(N)
←R̃ : M × N → U(M)
such that →R̃(m, n) = u, where u ∈ U(N) is the model with uncertainty whose represented models are exactly those in →R(m, n) (and symmetrically for ←R̃).
44.
JTL extension to uncertainty
In order to let JTL generate models with uncertainty
for the solution space
1. the “intersection” shared among all choices and
2. the uncertain parts
must be identified.
45.
JTL extension to uncertainty
[Figure — the factorized solution space for m1: the shared state machine structure together with the alternative placements of the error, print, done, and completed elements.]
46.
JTL extension to uncertainty
The JTL engine is a logic program written in ASP and non-deterministically solved by the DLV solver; it can derive how models are related by means of a deductive process
Traceability management offers enough information
to understand how the models can be factorized
Tracing information stores relevant details about the
linkage between source and target model elements
at run-time.
47.
JTL extension to uncertainty
In essence, given the application →R(m, n), the complete tracing domain T is determined as the collection of all tracing links r(a) = b
Def. An uncertainty point is a set P of trace links in T such that if r(a) = b and r′(a′) = b′ are two distinguished links in P, then a = a′ and b ≠ b′ (the same source element is mapped to alternative targets)
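This factorization can be sketched by grouping trace links on their source element. A hypothetical Python illustration (the link data is invented): groups with a single target form the shared part of all solutions, while groups with several targets become uncertainty points.

```python
# Hypothetical sketch (invented trace data): derive uncertainty points by
# grouping trace links r(a) = b on their source element a.
from collections import defaultdict

trace = [("error", "Off"), ("error", "Idle"), ("error", "Active"),
         ("done", "Idle"),
         ("startup", "Active")]

by_source = defaultdict(set)
for a, b in trace:
    by_source[a].add(b)

# Sources mapped to several targets yield uncertainty points; the rest
# belong to the shared "intersection" of all solutions.
uncertainty_points = {a: bs for a, bs in by_source.items() if len(bs) > 1}
certain = {a: next(iter(bs)) for a, bs in by_source.items() if len(bs) == 1}

assert uncertainty_points == {"error": {"Off", "Idle", "Active"}}
assert certain == {"done": "Idle", "startup": "Active"}
```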
48.
JTL extension to uncertainty
[Figure — the resulting model with uncertainty for m1: the shared structure annotated with its uncertainty points.]
49. This is the EMF model with uncertainty representing all 48 solutions.
The designer can easily inspect it, instead of navigating the solution space and computing the differences for each pair of models!
50.
Experiment
We considered a JTL HSM2SM transformation
consisting of 10 relations
1. The transformation has been (forward) applied to a model with 20 elements, 29 associations, and 30 attributes
2. The target model has been increasingly modified
and the transformation (backward) reapplied
The program has been executed on a machine with an Intel Core i7-4790K 4.00 GHz processor and 8 GB of RAM, running Windows 8.1
51.
Experiment
[Charts — CPU time and memory consumption vs. number of model elements.]
52.
Conclusions
The explicit management of non-determinism in bidirectional transformations is an uncertainty-reducing process
The unknown uncertainty at design-time translates into known (or bounded) uncertainty at run-time
Managing the non-determinism is not enough
because of the combinatorial explosion of
alternatives