The document discusses extending the existing SRML metamodel to implement additional properties of interaction protocols and external policies. The main tasks are to define grammars for coordination and external policies, and extend the metamodel using the Xtext grammar. This will allow users to edit models and view changes in the SRML editor. The document also discusses transforming SRML models to BPEL for business processes, and improving the graphical representation of SRML models in Eclipse.
This document provides an introduction to business modeling. It discusses that software includes more than just code and architecture, encompassing all artifacts that contribute to a usable system. It notes that Business Process Model and Notation (BPMN) and the Unified Modeling Language (UML) are examples of modeling languages used. The document focuses on domain models and requirements models. It provides an overview of how a domain model incorporates feature models, functional models, entity-relationship diagrams, and business process models. It indicates that domain models can be used to understand dependencies and the impacts of changes.
THE UNIFIED APPROACH FOR ORGANIZATIONAL NETWORK VULNERABILITY ASSESSMENT (ijseajournal)
Today's business network infrastructure changes rapidly, with new servers, services, connections,
and ports added frequently, at times daily, and with an uncontrolled influx of laptops, storage media,
and wireless networks. With the growing number of vulnerabilities and exploits, coupled with the
continual evolution of IT infrastructure, organizations now require more frequent vulnerability
assessments. This paper proposes a new approach to network vulnerability assessment, the Unified
Process for Network Vulnerability Assessment (hereafter called unified NVA), derived from the
Unified Software Development Process (Unified Process), a popular iterative and incremental
software development process framework.
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISON (ijseajournal)
Performance, responsiveness, and scalability are make-or-break qualities for software; nearly everyone runs into performance problems at one time or another. This paper discusses the performance issues faced during the Pre-Examination Process Automation System (PEPAS), implemented in Java technology, the challenges faced during the life cycle of the project, and the mitigation actions performed. It compares three Java technologies and shows how improvements in the application's response time were achieved through statistical analysis. The paper concludes with a result analysis.
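The kind of statistical comparison the abstract describes can be sketched with plain summary statistics. The sample data and the two technology stacks below are invented for illustration; the idea is simply to quantify a response-time improvement and check that the difference is not noise.

```python
from statistics import mean, stdev
from math import sqrt

def welch_t(a, b):
    """Welch's t-statistic for two independent samples of response times."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / sqrt(stdev(a) ** 2 / na + stdev(b) ** 2 / nb)

def pct_improvement(before, after):
    """Percentage reduction in mean response time."""
    return 100.0 * (mean(before) - mean(after)) / mean(before)

# Hypothetical response-time samples (ms) for two technology stacks
tech_a = [820, 790, 845, 810, 805]
tech_b = [610, 595, 640, 620, 615]

print(round(pct_improvement(tech_a, tech_b), 1))  # ~24% faster on average
print(round(welch_t(tech_a, tech_b), 2))          # large t => difference is real
```

A large t-statistic relative to the sample size indicates the improvement is statistically meaningful rather than run-to-run variation.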
Evolution of Modelling Techniques for Service Oriented Architecture (IJERA Editor)
Service-oriented architecture (SOA) is a software design and architectural pattern based on independent pieces of software providing functionality as services to other applications. The benefit of SOA in the IT infrastructure is to allow parallel use of, and data exchange between, programs that act as services to the enterprise. The Unified Modelling Language (UML) is a standardized general-purpose modelling language in the field of software engineering; it includes a set of graphic notation techniques to create visual models of object-oriented software systems. We want to make UML available for SOA as well. SoaML (Service oriented architecture Modelling Language) is a specification from the Object Management Group (OMG) describing a UML profile and meta-model for the modelling and design of services within a service-oriented architecture. BPMN was also extended for SOA, but with a few pitfalls. There is a need for a modelling framework dedicated to SOA; Michael Bell authored such a framework, the Service Oriented Modelling Framework (SOMF).
Presentation - "A comparison of component-based software engineering and mode..." (Nikolay Grozev)
The document is a master's thesis that compares component-based software engineering (CBSE) and model-driven development (MDD). It includes an introduction, background on CBSE and MDD, an overview of the ProCom component model, a comparison of CBSE and MDD with respect to ProCom, and a conclusion. The thesis aims to systematically compare CBSE and MDD in general and enrich the comparison with a case study of ProCom to analyze the outcomes.
This document compares the J2EE and .NET platforms using a separation continuum model. It defines key terms related to J2EE, .NET, and distributed application architectures. The document outlines a logical tier model and a service-based architecture model for conceptualizing large distributed solutions. It aims to map the technologies provided by J2EE and .NET to the separation continuum for analysis and comparison.
FRAMEWORKS BETWEEN COMPONENTS AND OBJECTS (acijjournal)
Before the emergence of component-based frameworks, similar issues were addressed by other
software development paradigms, including Object-Oriented Programming (OOP), Component-Based
Development (CBD), and object-oriented frameworks. In this study, these approaches, especially
object-oriented frameworks, are compared to component-based frameworks and their relationships are
discussed, along with the impact of different software reuse methods on architectural patterns and
on support for application extension and versioning. It is concluded that many of the mechanisms
provided by component-based frameworks can be enabled by software elements at a lower level; the
main contribution of component-based frameworks is the focus on component development. All of these
approaches can be built on each other in a layered manner by adopting suitable design patterns.
Some questions remain open, such as which method to use to develop and upgrade an existing
application to another approach.
A Review of Feature Model Position in the Software Product Line and Its Extra... (CSCJournals)
Software has become a modern asset and competitive product. The product-line approach, long used in the manufacturing and construction industries, has now attracted a lot of attention in the software industry; the main importance of the product line engineering approach lies in the cost and time involved in bringing products to market. The feature model is one of the most important methods for documenting variability in a product line, showing product features and their dependencies. Because of the magnitude and complexity of product lines, building and maintaining feature models is complex and time-consuming work. This article discusses the importance and position of the feature model in the product line, and reviews and compares feature model extraction methods.
Rhapsody and mechatronics, multi-domain simulation (Graham Bleakley)
This document discusses mechatronics and its application with Rational Rhapsody Design Manager. [1] Mechatronics involves the integration of mechanical, electrical, and software engineering, requiring a systems engineering approach. [2] Mechatronic modeling requires mathematical modeling tools that can be integrated into logical behavior models. [3] Rhapsody provides a way to work with mathematical modeling tools like Simulink and Modelica to model both logical and physical behavior.
This document discusses various techniques for business process modeling, including flow charts, Gantt charts, PERT diagrams, data flow diagrams, control flow diagrams, functional flow block diagrams, Petri nets, IDEF, UML, BPMN, XPDL, Wf-XML, and BPEL. It provides brief descriptions of each technique and notes their purposes and applications in business process modeling and systems development.
Feature Model Configuration Based on Two-Layer Modelling in Software Product ... (IJECEIAES)
The aim of the Software Product Line (SPL) approach is to improve the software development process by producing software products that match the stakeholders’ requirements. One of the important topics in SPLs is the feature model (FM) configuration process. The purpose of configuration here is to select and remove specific features from the FM in order to produce the required software product. At the same time, detecting differences between the application’s requirements and the available capabilities of the implementation platform is a major concern of application requirements engineering. It is possible that implementing the selected features of the FM needs certain software and hardware infrastructure, such as a database, operating system, or hardware platform, that cannot be made available by the stakeholders. We address the FM configuration problem by proposing a method that employs a two-layer FM comprising the application and infrastructure layers. We also show this method in the context of a case study on the SPL of a sample E-Shop website. The results demonstrate that this method can support both functional and non-functional requirements and can solve the problems arising from a lack of attention to implementation requirements in the SPL FM selection phase.
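The two-layer idea can be sketched as a mapping from application-layer features to the infrastructure they need, with configuration rejecting features whose needs cannot be met. Feature and infrastructure names below are invented for illustration; the paper's actual metamodel is richer.

```python
# Hypothetical two-layer feature model for an e-shop SPL:
# application-layer features map to infrastructure-layer requirements.
APP_FEATURES = {
    "catalog":        {"requires": {"database"}},
    "online_payment": {"requires": {"database", "payment_gateway"}},
    "search":         {"requires": {"search_index"}},
}

def configure(selected, available_infra):
    """Keep only the selected features whose infrastructure needs can be
    met by the stakeholders; report the rest as unsatisfiable."""
    ok, rejected = [], []
    for f in selected:
        needs = APP_FEATURES[f]["requires"]
        (ok if needs <= available_infra else rejected).append(f)
    return ok, rejected

ok, rejected = configure(
    ["catalog", "online_payment", "search"],
    available_infra={"database", "search_index"},
)
print(ok)        # ['catalog', 'search']
print(rejected)  # ['online_payment'] - no payment gateway available
```

Rejected features surface the mismatch between requirements and platform capabilities before implementation starts, which is the problem the paper targets.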
IRJET- Design Automation of Cam Lobe Modeling in Creo using C# (IRJET Journal)
This document discusses automating the design of cam lobes in Creo using C#. Cam lobes control valve opening and closing in engines and typically take 15-20 minutes each to model in Creo. The objective is to develop a C# application to automatically model any number of cam lobes in under 5 minutes. Conventionally, modeling cam lobes in Creo is an iterative process that takes time, especially for modifying previously modeled camshafts. The automated approach using C# aims to reduce modeling time and facilitate design changes.
Cbt component based technology architectures (Saransh Garg)
This document discusses various component models and technologies, including ACME, Java Beans, COM/DCOM/MTS, CORBA, .NET, and OSGi. It covers topics such as component interfaces, connections, interactions, lifecycles, architectures, and specific models like Java Beans. The Java Bean component model is described in more detail, including its key features around reusable software components, interfaces focusing on methods, properties, events, and implementations as simple Java objects or more complex wrappers.
This document provides an overview of a five-day course on architecting and designing J2EE applications. The course objectives are to understand the process of developing an architecture from requirements to implementation using the J2EE framework. It will cover business and technical architecture design, mapping components to J2EE, and include hands-on labs. The agenda includes sections on business architecture, applying component modeling, J2EE technical overview, and mapping to the technical architecture.
PHP modernization approach generating KDM models from PHP legacy code (journalBEEI)
With the rise of new web technologies such as Web 2.0, jQuery, and Bootstrap, modernizing legacy web systems to benefit from the advantages of the new technologies has become more and more relevant. The migration of a system from one environment to another is a time- and effort-consuming process; it involves a complete rewrite of the application adapted to the target platform. To realize this migration in an automated and standardized way, many approaches have tried to define standardized engineering processes. Architecture Driven Modernization (ADM) defines an approach to standardize and automate the reengineering process. We defined an ADM approach to represent PHP web applications at the highest level of abstraction, using software artifacts as an entry point. This paper describes the extraction process, which permits discovery and understanding of the legacy system and generates models that represent the system in an abstract way.
EARLY PERFORMANCE PREDICTION OF WEB SERVICES (ijwscjournal)
The document describes a methodology for early performance prediction of web services. It involves modeling web services using UML diagrams, simulating the model using a tool called SMTQA, and analyzing performance metrics. The methodology was applied to model a general web services system using use case and sequence diagrams. The model was simulated and found that internet connections and the service broker disk were bottlenecks based on high average waiting times and request dropping probabilities.
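The bottleneck reasoning in the abstract (high waiting times at the service broker disk and internet connection) is the kind of result a simple analytical queueing model can also produce. The sketch below uses a textbook M/M/1 queue rather than the paper's SMTQA tool, and the arrival/service rates are invented numbers.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Analytical M/M/1 metrics: utilisation (rho), mean time waiting in
    queue (Wq), and mean response time (W). Rates in requests per second."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        raise ValueError("queue is unstable (utilisation >= 1)")
    wq = rho / (service_rate - arrival_rate)  # waiting before service starts
    w = 1 / (service_rate - arrival_rate)     # waiting plus service
    return rho, wq, w

# A hypothetical service-broker disk: 40 req/s arriving, capacity 50 req/s
rho, wq, w = mm1_metrics(40, 50)
print(rho)  # 0.8 utilisation - a likely bottleneck
print(wq, w)
```

Resources whose utilisation approaches 1 show rapidly growing waiting times, which is how a component gets flagged as a bottleneck early, before implementation.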
MODIGEN: MODEL-DRIVEN GENERATION OF GRAPHICAL EDITORS IN ECLIPSE (ijcsit)
The document describes MoDiGen, a model-driven approach for generating graphical editors in Eclipse. MoDiGen uses three domain-specific languages - MoDiGen Core, Shape, and Style - as inputs to a generator. The generator produces Java code, XML files, and properties to implement a graphical editor as an Eclipse plugin. The languages allow defining node and edge diagrams, complex shapes, and styles. When the DSL models are saved, code generation is triggered to build the graphical editor. The approach aims to reduce the effort of developing graphical modeling tools compared to manually coding editors in Eclipse frameworks.
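The save-triggers-generation workflow can be illustrated with a toy generator: a declarative model of node types is turned into source code. The model structure and names below are invented; MoDiGen's real generator emits Java code, XML, and properties files for an Eclipse plugin.

```python
# A toy model in the spirit of MoDiGen's node/edge DSL (names invented).
MODEL = {
    "nodes": [
        {"name": "Task",     "shape": "rectangle"},
        {"name": "Decision", "shape": "diamond"},
    ],
}

def generate(model):
    """Emit one figure-class stub per node type; a real generator would
    emit Java source plus Eclipse plugin XML instead of Python stubs."""
    out = []
    for node in model["nodes"]:
        out.append(
            f"class {node['name']}Figure:\n"
            f"    shape = {node['shape']!r}\n"
        )
    return "\n".join(out)

print(generate(MODEL))
```

Because the editor code is derived entirely from the model, regenerating after every model save keeps the tool and its specification in sync, which is the effort reduction the approach aims at.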
STRUCTURAL VALIDATION OF SOFTWARE PRODUCT LINE VARIANTS: A GRAPH TRANSFORMATI... (IJSEA)
This document discusses an approach to structurally validating software product line variants using graph transformations. The authors propose using model transformations to automatically validate products according to dependencies defined in the feature diagram. They introduce necessary meta-models and present graph grammars to perform validation using the AToM3 tool. The approach is illustrated through examples.
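The validation idea, checking a product variant against the dependencies of the feature diagram, can be sketched without graph transformations as a plain constraint check. The feature names and constraints below are invented; the paper expresses the same rules as graph grammars executed in AToM3.

```python
# Invented feature-diagram constraints: "requires" and "excludes" relations.
REQUIRES = {"gps": {"screen"}, "camera": {"storage"}}
EXCLUDES = {("basic_ui", "touch_ui")}

def valid_variant(features):
    """A variant (set of selected features) is structurally valid when
    every required feature is present and no excluded pair co-occurs."""
    for f, deps in REQUIRES.items():
        if f in features and not deps <= features:
            return False
    for a, b in EXCLUDES:
        if a in features and b in features:
            return False
    return True

print(valid_variant({"gps", "screen", "storage"}))  # True
print(valid_variant({"gps"}))                       # False: gps requires screen
```

Encoding the constraints once and checking every derived product automatically is what makes the approach scale to large product lines.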
- The document provides a summary of the candidate's work experience including 7 projects as a Java developer working on online billing services and dashboard applications for United Health Group and Cognizant over 6 years and 3 months.
- Technical skills include Java, J2EE, Spring Framework, Hibernate, HTML, JSP, Servlet, JSF, Oracle, and MS SQL.
- Educational qualifications include a B.E. in Mechanical Engineering and an M.B.A. in Operations Management.
This document discusses various proposed software development methodologies that are based on model-driven architecture (MDA). It first provides background on MDA and its key concepts. It then examines how MDA can be mapped to the Rational Unified Process (RUP) software development lifecycle framework. The rest of the document describes several specific MDA-based methodologies: MODA-TEL, MASTER, MIDAS, C3, ODAC, and DREAM. It compares these methodologies based on which phases of the software development lifecycle they cover in detail. The document concludes that while many have invested in MDA, a standardized methodology for developing model-based systems is still lacking.
SUPER applied in a Sitecore migration project (dodoshelu)
This document describes using a business process model and ontology for a Sitecore migration project. It discusses key Sitecore concepts like items, templates, fields and presentation components. An ontology is proposed to model Sitecore concepts and help answer competency questions during content creation. Integrating the business process model with SUPER could provide semantic tools and services to help manage the migration, increasing flexibility and reducing costs and time. Potential disadvantages include challenges developing an accurate ontology and hiding complex IT implementation details.
ARCHITECTURAL FRAMEWORK FOR DEVELOPING COMPONENT BASED GIS SYSTEM (ijfcstjournal)
Component-Based Software Engineering has one main motive: making the development process of
software systems as easy as possible. To achieve this objective, work needs to be done on previous
systems to identify the concerns and limitations that can be overcome using this software
engineering approach. In this paper, to support the concept of component-based systems, a domain
is chosen that covers GIS systems. GIS (Geographic Information Systems) are commonly used for the
development of map-based applications; these systems are widely used across the web and in various
organizations. With the development and deepening of GIS, traditional GIS systems have shown
challenges in isolation, sealing, and interoperability, along with other limitations, thereby
hindering further development and application of GIS technology. In this paper, a framework for a
component-based GIS system is proposed. The framework has a rich graphical interface, and user
data can be easily retrieved from the connected database and displayed in the browser.
Improving Consistency of UML Diagrams and Its Implementation Using Reverse En... (journalBEEI)
This document summarizes a research paper that describes the development of the UML-Code Consistency Checker Tool (UCCCT), which improves consistency between UML design models and their implementation in C# source code using reverse engineering. The tool detects both vertical inconsistencies between UML diagrams (e.g. class diagrams) and the implemented code, and horizontal inconsistencies between different UML diagrams. It extracts information from UML diagrams in XMI format and from compiled C# code to generate tree views; predefined consistency rules are then used to check for inconsistencies between the UML models and the code, and any inconsistencies found are highlighted in the tree views. An evaluation of UCCCT found it
A natural language requirements engineering approach for MDA (IJCSEA Journal)
A software system for any information system can be developed following a model driven paradigm, in particular MDA (Model Driven Architecture). In this way, models that represent the organizational work are used to produce models that represent the information system. Current software development methods are starting to provide guidelines for the construction of conceptual models, taking as input requirements models. In MDA the CIM (Computation Independent Model) can be used to define the business process model. Though a complete automatic construction of the CIM is not possible, we have proposed in other papers the integration of some natural language requirements models and we have defined a strategy to
derive a CIM from these models. In this paper, we present an improved version of our ATL transformation
that implements a strategy to obtain a UML class diagram representing a preliminary CIM from requirements models allowing traceability between the source and the target models.
EDGE Detection Filter for Gray Image and Observing Performances (IOSR Journals)
This paper presents an edge detection filter for gray images and analyzes the performance of different edge detection operators. The paper studies common edge detection operators like Sobel, Prewitt, Laplacian, Robert, and Canny. It applies these operators to detect edges in a gray test image and calculates statistical metrics like PSNR and MSE to compare the performance of each operator. The results show that the Canny edge detector produces better results for edge detection in gray images compared to other operators based on the statistical measurements. In conclusion, the paper demonstrates edge detection in gray images using common operators and analyzes their relative performances.
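The MSE and PSNR metrics the paper uses to compare operators are standard and easy to state precisely. The tiny 2x2 images below are invented for illustration; real comparisons run over full grey-scale test images.

```python
from math import log10

def mse(original, processed):
    """Mean squared error between two equally sized grey images
    (nested lists of pixel values)."""
    n, total = 0, 0.0
    for row_o, row_p in zip(original, processed):
        for o, p in zip(row_o, row_p):
            total += (o - p) ** 2
            n += 1
    return total / n

def psnr(original, processed, max_val=255):
    """Peak signal-to-noise ratio in dB; higher means closer to the original."""
    m = mse(original, processed)
    return float("inf") if m == 0 else 10 * log10(max_val ** 2 / m)

img  = [[100, 100], [100, 100]]   # toy reference image
edge = [[100, 110], [ 90, 100]]   # toy filter output
print(mse(img, edge))             # 50.0
print(round(psnr(img, edge), 2))
```

Ranking operators by PSNR over the same test image is how the paper concludes that the Canny detector performs best among those studied.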
This document discusses how the addition of antioxidants dibenzyl disulfide (DBDS) and 2,6-di-tert-butyl-p-cresol (DBPC) to transformer oil can increase the formation of copper sulfide on insulation paper. Copper sulfide deposition degrades the insulation strength of paper and can lead to transformer failures. The document presents an experimental setup to study the effect of varying concentrations of DBDS and DBPC on the leakage currents in oil, as increased leakage currents indicate degradation of the insulating properties of the oil. Test results showing the effect of DBDS and DBPC concentration on leakage current are presented and discussed.
A generic randomization framework architecture is proposed for automated testing of system on chips (SoCs). The framework generates random test configurations to improve coverage and prevent bias. It captures the device under test's state at different parameter settings. The framework consists of a random number generator, parameter constraints, and interfaces to configure the test and get results. It was validated using a test that generates single edge nibble transmission frames with randomized frequencies and data. The framework successfully selected random parameters and passed them to the test, demonstrating its ability to automate testing and improve coverage of SoCs.
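The core of such a framework, a seeded random generator drawing parameter values under constraints, can be sketched briefly. The parameter names, ranges, and the example constraint below are invented; the paper's framework additionally wires the chosen configuration into the device under test and collects results.

```python
import random

# Invented parameter spaces for a hypothetical SoC test.
PARAMS = {
    "frequency_khz": range(10, 101, 10),
    "data_nibble":   range(0, 16),
}

def random_config(seed=None):
    """Draw one test configuration, retrying until the constraints hold.
    Seeding makes a failing configuration reproducible."""
    rng = random.Random(seed)
    while True:
        cfg = {name: rng.choice(list(space)) for name, space in PARAMS.items()}
        # example constraint: high frequencies only with non-zero data
        if cfg["frequency_khz"] < 80 or cfg["data_nibble"] != 0:
            return cfg

print(random_config(seed=42))
```

Drawing configurations randomly rather than enumerating a hand-picked list is what removes bias and improves coverage; the seed preserves reproducibility when a random configuration exposes a bug.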
The document discusses the optimization of the design of a brake drum for a two-wheeler through reverse engineering using ANSYS software. The authors create CAD models of an existing brake drum and analyze it using finite element analysis to determine stresses and temperature variations under different braking conditions and materials. Their results show that a CE alloy material produces less deformation and lower maximum temperatures than aluminum. They conclude that CE alloys can improve braking performance and are a better candidate material for brake drums compared to aluminum.
Rhapsody and mechatronics, multi-domain simulationGraham Bleakley
This document discusses mechatronics and its application with Rational Rhapsody Design Manager. [1] Mechatronics involves the integration of mechanical, electrical, and software engineering, requiring a systems engineering approach. [2] Mechatronic modeling requires mathematical modeling tools that can be integrated into logical behavior models. [3] Rhapsody provides a way to work with mathematical modeling tools like Simulink and Modelica to model both logical and physical behavior.
This document discusses various techniques for business process modeling, including flow charts, Gantt charts, PERT diagrams, data flow diagrams, control flow diagrams, functional flow block diagrams, Petri nets, IDEF, UML, BPMN, XPDL, Wf-XML, and BPEL. It provides brief descriptions of each technique and notes their purposes and applications in business process modeling and systems development.
Feature Model Configuration Based on Two-Layer Modelling in Software Product ...IJECEIAES
The aim of the Software Product Line (SPL) approach is to improve the software development process by producing software products that match the stakeholders’ requirements. One of the important topics in SPLs is the feature model (FM) configuration process. The purpose of configuration here is to select and remove specific features from the FM in order to produce the required software product. At the same time, detection of differences between application’s requirements and the available capabilities of the implementation platform is a major concern of application requirements engineering. It is possible that the implementation of the selected features of FM needs certain software and hardware infrastructures such as database, operating system and hardware that cannot be made available by stakeholders. We address the FM configuration problem by proposing a method, which employs a two-layer FM comprising the application and infrastructure layers. We also show this method in the context of a case study in the SPL of a sample E-Shop website. The results demonstrate that this method can support both functional and non-functional requirements and can solve the problems arising from lack of attention to implementation requirements in SPL FM selection phase.
IRJET- Design Automation of Cam Lobe Modeling in Creo using C#IRJET Journal
This document discusses automating the design of cam lobes in Creo using C#. Cam lobes control valve opening and closing in engines and typically take 15-20 minutes each to model in Creo. The objective is to develop a C# application to automatically model any number of cam lobes in under 5 minutes. Conventionally, modeling cam lobes in Creo is an iterative process that takes time, especially for modifying previously modeled camshafts. The automated approach using C# aims to reduce modeling time and facilitate design changes.
Cbt component based technology architecturesSaransh Garg
This document discusses various component models and technologies, including ACME, Java Beans, COM/DCOM/MTS, CORBA, .NET, and OSGi. It covers topics such as component interfaces, connections, interactions, lifecycles, architectures, and specific models like Java Beans. The Java Bean component model is described in more detail, including its key features around reusable software components, interfaces focusing on methods, properties, events, and implementations as simple Java objects or more complex wrappers.
This document provides an overview of a five-day course on architecting and designing J2EE applications. The course objectives are to understand the process of developing an architecture from requirements to implementation using the J2EE framework. It will cover business and technical architecture design, mapping components to J2EE, and include hands-on labs. The agenda includes sections on business architecture, applying component modeling, J2EE technical overview, and mapping to the technical architecture.
PHP modernization approach generating KDM models from PHP legacy code (journalBEEI)
With the rise of new web technologies such as Web 2.0, jQuery, and Bootstrap, modernizing legacy web systems to benefit from the advantages of these technologies is increasingly relevant. Migrating a system from one environment to another is a time- and effort-consuming process that involves a complete rewrite of the application adapted to the target platform. To carry out this migration in an automated and standardized way, many approaches have tried to define standardized engineering processes. Architecture Driven Modernization (ADM) defines an approach to standardize and automate the reengineering process. We defined an ADM approach to represent PHP web applications as models at the highest level of abstraction, using software artifacts as the entry point. This paper describes the extraction process, which permits discovery and understanding of the legacy system, and generates models that represent the system in an abstract way.
EARLY PERFORMANCE PREDICTION OF WEB SERVICES (ijwscjournal)
The document describes a methodology for early performance prediction of web services. It involves modeling web services using UML diagrams, simulating the model using a tool called SMTQA, and analyzing performance metrics. The methodology was applied to model a general web services system using use case and sequence diagrams. The model was simulated and found that internet connections and the service broker disk were bottlenecks based on high average waiting times and request dropping probabilities.
MODIGEN: MODEL-DRIVEN GENERATION OF GRAPHICAL EDITORS IN ECLIPSE (ijcsit)
The document describes MoDiGen, a model-driven approach for generating graphical editors in Eclipse. MoDiGen uses three domain-specific languages - MoDiGen Core, Shape, and Style - as inputs to a generator. The generator produces Java code, XML files, and properties to implement a graphical editor as an Eclipse plugin. The languages allow defining node and edge diagrams, complex shapes, and styles. When the DSL models are saved, code generation is triggered to build the graphical editor. The approach aims to reduce the effort of developing graphical modeling tools compared to manually coding editors in Eclipse frameworks.
STRUCTURAL VALIDATION OF SOFTWARE PRODUCT LINE VARIANTS: A GRAPH TRANSFORMATI... (IJSEA)
This document discusses an approach to structurally validating software product line variants using graph transformations. The authors propose using model transformations to automatically validate products according to dependencies defined in the feature diagram. They introduce necessary meta-models and present graph grammars to perform validation using the AToM3 tool. The approach is illustrated through examples.
- The document provides a summary of the candidate's work experience including 7 projects as a Java developer working on online billing services and dashboard applications for United Health Group and Cognizant over 6 years and 3 months.
- Technical skills include Java, J2EE, Spring Framework, Hibernate, HTML, JSP, Servlet, JSF, Oracle, and MS SQL.
- Educational qualifications include a B.E. in Mechanical Engineering and an M.B.A. in Operations Management.
This document discusses various proposed software development methodologies that are based on model-driven architecture (MDA). It first provides background on MDA and its key concepts. It then examines how MDA can be mapped to the Rational Unified Process (RUP) software development lifecycle framework. The rest of the document describes several specific MDA-based methodologies: MODA-TEL, MASTER, MIDAS, C3, ODAC, and DREAM. It compares these methodologies based on which phases of the software development lifecycle they cover in detail. The document concludes that while many have invested in MDA, a standardized methodology for developing model-based systems is still lacking.
SUPER applied in a Sitecore migration project (dodoshelu)
This document describes using a business process model and ontology for a Sitecore migration project. It discusses key Sitecore concepts like items, templates, fields and presentation components. An ontology is proposed to model Sitecore concepts and help answer competency questions during content creation. Integrating the business process model with SUPER could provide semantic tools and services to help manage the migration, increasing flexibility and reducing costs and time. Potential disadvantages include challenges developing an accurate ontology and hiding complex IT implementation details.
ARCHITECTURAL FRAMEWORK FOR DEVELOPING COMPONENT BASED GIS SYSTEM (ijfcstjournal)
Component-Based Software Engineering has one main motive: to make the development of software systems as easy as possible. To achieve this objective, work needs to be done on previous systems to identify the concerns and limitations that can be overcome using this software engineering approach. In this paper, to support the concept of component-based systems, a domain is chosen that covers GIS systems. GIS (Geographic Information Systems) are commonly used for the development of map-based applications; these systems are widely used across the web and in various organizations. With the development and deepening of GIS, traditional GIS systems have shown challenges in isolation, sealing, and interoperability, thereby hindering further development and application of GIS technology. In this paper, a framework for a component-based GIS system is proposed. The framework has a rich graphical interface, and user data can easily be retrieved from the connected database and displayed in the browser.
Improving Consistency of UML Diagrams and Its Implementation Using Reverse En... (journalBEEI)
This document summarizes a research paper that describes the development of a tool called the UML-Code Consistency Checker Tool (UCCCT) to improve consistency between UML design models and their implementation in C# source code using reverse engineering. The tool detects both vertical inconsistencies between UML diagrams (e.g. class diagrams) and the implemented code, and horizontal inconsistencies between different UML diagrams. It extracts information from UML diagrams in XMI format and from compiled C# code to generate tree views. Predefined consistency rules are then used to check for inconsistencies between the UML models and the code. Any inconsistencies found are highlighted in the tree views. An evaluation of UCCCT found it
A Natural Language Requirements Engineering Approach for MDA (IJCSEA Journal)
A software system for any information system can be developed following a model-driven paradigm, in particular MDA (Model Driven Architecture). In this way, models that represent the organizational work are used to produce models that represent the information system. Current software development methods are starting to provide guidelines for the construction of conceptual models, taking requirements models as input. In MDA, the CIM (Computation Independent Model) can be used to define the business process model. Though a complete automatic construction of the CIM is not possible, we have proposed in other papers the integration of some natural language requirements models and have defined a strategy to derive a CIM from these models. In this paper, we present an improved version of our ATL transformation that implements a strategy to obtain a UML class diagram representing a preliminary CIM from requirements models, allowing traceability between the source and the target models.
Edge Detection Filter for Gray Image and Observing Performances (IOSR Journals)
This paper presents an edge detection filter for gray images and analyzes the performance of different edge detection operators. The paper studies common edge detection operators like Sobel, Prewitt, Laplacian, Robert, and Canny. It applies these operators to detect edges in a gray test image and calculates statistical metrics like PSNR and MSE to compare the performance of each operator. The results show that the Canny edge detector produces better results for edge detection in gray images compared to other operators based on the statistical measurements. In conclusion, the paper demonstrates edge detection in gray images using common operators and analyzes their relative performances.
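The comparison pipeline described above can be sketched in a few lines: convolve a gray image with an operator's kernels, then score the result with statistical metrics such as MSE and PSNR. A minimal NumPy sketch of the Sobel case on a toy image, assuming nothing about the paper's actual implementation or test image:

```python
import numpy as np

def convolve2d(img, k):
    """Naive 'valid' 2-D correlation (no SciPy dependency)."""
    kh, kw = k.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

# Sobel kernels for horizontal and vertical gradients
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
KY = KX.T

def sobel_edges(img):
    gx, gy = convolve2d(img, KX), convolve2d(img, KY)
    return np.hypot(gx, gy)        # gradient magnitude

def mse(a, b):
    return float(np.mean((a - b) ** 2))

def psnr(a, b, peak=255.0):
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * np.log10(peak ** 2 / m)

# Toy gray image: a bright square on a dark background
img = np.zeros((16, 16))
img[4:12, 4:12] = 200.0
edges = sobel_edges(img)
print(round(mse(img[1:-1, 1:-1], edges), 2), round(psnr(img[1:-1, 1:-1], edges), 2))
```

Swapping `KX`/`KY` for the Prewitt, Robert, or Laplacian kernels and recomputing the same metrics is essentially the comparison the paper performs.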
This document discusses how the addition of antioxidants dibenzyl disulfide (DBDS) and 2,6-di-tert-butyl-p-cresol (DBPC) to transformer oil can increase the formation of copper sulfide on insulation paper. Copper sulfide deposition degrades the insulation strength of paper and can lead to transformer failures. The document presents an experimental setup to study the effect of varying concentrations of DBDS and DBPC on the leakage currents in oil, as increased leakage currents indicate degradation of the insulating properties of the oil. Test results showing the effect of DBDS and DBPC concentration on leakage current are presented and discussed.
A generic randomization framework architecture is proposed for automated testing of system on chips (SoCs). The framework generates random test configurations to improve coverage and prevent bias. It captures the device under test's state at different parameter settings. The framework consists of a random number generator, parameter constraints, and interfaces to configure the test and get results. It was validated using a test that generates single edge nibble transmission frames with randomized frequencies and data. The framework successfully selected random parameters and passed them to the test, demonstrating its ability to automate testing and improve coverage of SoCs.
The document discusses the optimization of the design of a brake drum for a two-wheeler through reverse engineering using ANSYS software. The authors create CAD models of an existing brake drum and analyze it using finite element analysis to determine stresses and temperature variations under different braking conditions and materials. Their results show that a CE alloy material produces less deformation and lower maximum temperatures than aluminum. They conclude that CE alloys can improve braking performance and are a better candidate material for brake drums compared to aluminum.
The document describes a wireless data acquisition system using an ARM Cortex M-3 processor. The system collects data from gas, humidity, and temperature sensors and transmits the data wirelessly via Bluetooth to a host computer. The ARM processor samples the sensor data using its analog-to-digital converter and transmits the digital values to the host computer in real-time. The host computer receives the data using a Bluetooth receiver and displays the measurements graphically using MATLAB for monitoring and analysis purposes. The system provides wireless short-range data collection and is suitable for applications like industrial monitoring and mobile meter reading.
This document proposes an approach for interactively smoothing experimental data using a Savitzky-Golay (S-G) filter. An optimization problem is formulated to minimize two criteria: total absolute error and integral smoothness. Pareto optimal solutions are determined for the filter parameters of polynomial degree and number of supporting points. The μ-selection method is used to find subsets of Pareto optimal solutions ranked by compromised efficiency. The highest ranked solution is the Salukvadze optimum that balances the criteria most evenly. The approach allows choosing S-G filter parameters that adequately represent and smooth the data. An example applies this to smooth measured acceleration data from a vibrating mechanism.
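The S-G filter's two tuning parameters named above (polynomial degree and number of supporting points) are exactly the knobs in the least-squares view of the filter: fit a degree-d polynomial to each window and keep its centre value. A self-contained NumPy sketch of that idea, with illustrative data rather than the paper's acceleration measurements:

```python
import numpy as np

def savgol_smooth(y, window=7, degree=2):
    """Least-squares polynomial smoothing (the Savitzky-Golay idea):
    fit a degree-d polynomial to each odd-length window, keep its centre value."""
    assert window % 2 == 1 and degree < window
    half = window // 2
    x = np.arange(-half, half + 1)
    # Design matrix with columns 1, x, x^2, ...; the fitted value at the window
    # centre (x = 0) is just c0, so one row of the pseudo-inverse suffices.
    A = np.vander(x, degree + 1, increasing=True)
    coeffs = np.linalg.pinv(A)[0]
    ypad = np.pad(y, half, mode="edge")
    return np.array([coeffs @ ypad[i:i + window] for i in range(len(y))])

# Noisy samples of a smooth signal
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
noisy = np.sin(t) + rng.normal(0, 0.2, t.size)
smooth = savgol_smooth(noisy, window=11, degree=3)
```

The trade-off the paper optimizes is visible here: a larger `window` lowers the total error to the underlying signal but risks over-smoothing, while a higher `degree` preserves features at the cost of smoothness.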
The document describes the implementation of a natural sounding speech synthesizer for the Marathi language using English text input. It discusses concatenative speech synthesis using a unit selection approach. Over 28,580 syllables, words and sentences recorded from a female speaker were used to create an inventory of speech units. The synthesizer was tested and able to generate natural sounding output and waveforms. Formant frequencies were analyzed using MATLAB and PRAAT tools to evaluate the quality of the synthesized speech.
This document summarizes the evolution of wireless mobile communication systems from 1G to 5G. It discusses the key technologies and features of each generation. 1G systems used analog signals for voice calls. 2G introduced digital encryption and SMS. 3G enabled faster speeds and services like video calls and internet access. 4G provides further increased speeds up to 1Gbps and is based on LTE. 5G is expected to offer much higher speeds and bandwidth, near unlimited connectivity, low latency, and new applications through convergence of technologies like cloud computing and nanotechnology. It is predicted 5G will revolutionize wireless communications and be a major driver of social and economic development.
The document summarizes an algorithm for object detection and tracking in moving backgrounds under different environmental conditions. The algorithm uses a discriminative learning approach to develop a more robust way of updating an adaptive appearance model. It aims to handle partial occlusions without significant drift and work well with minimal parameter tuning. The algorithm divides each frame into blocks and extracts features using a random Gaussian matrix method. A Gaussian classifier is used to get the tracking location with the highest response. The classifier is incrementally learned and updated using positive and negative samples to predict the object location in the next frame. The proposed algorithm is shown to outperform existing L1-tracker algorithms in terms of accuracy, computational efficiency, and robustness to appearance changes.
Car Dynamics Using Quarter Model and Passive Suspension, Part III: A Novel Po... (IOSR Journals)
This document presents research on using a novel polynomial speed hump and analyzing car dynamics when crossing it using a quarter-car model. The study varied car crossing speed from 5-30 km/h, hump length from 3-9m, and hump height from 60-120mm. MATLAB simulation was used to analyze sprung mass displacement, maximum/minimum displacement, acceleration, and maximum speed for ride comfort. Graphs presented show the effect of hump dimensions and speed on these dynamics. The polynomial hump provided smooth kinematics for the sprung mass, meeting ride comfort standards of less than 0.8m/s^2 acceleration.
This document discusses the modeling of electric and magnetic fields under high voltage AC transmission lines. It begins by introducing the importance of modeling these fields and some of the factors that influence field strength. It then provides equations to calculate the electric field based on the method of images technique, accounting for parameters like conductor voltage, geometry, and distance from measuring point. Similar equations are given for calculating the magnetic field based on conductor current. The document emphasizes that these fields can induce currents and charges in nearby objects, and discusses how to calculate the induced short circuit current. The goal of the modeling is to understand safer operating practices near transmission lines.
This document describes a computational model called the Vehicle Dynamic Model (VDM) that was developed to analyze the dynamic behavior of vehicles. The VDM allows users to define vehicle parameters and evaluate the vehicle's vertical response when traversing different track profiles. It provides four types of results: 1) steady state response, 2) frequency response curves, 3) animation of the vehicle running on a track profile, and 4) natural frequencies and vibration modes. The model accounts for components like tires, springs, dampers and vehicle geometry. It was tested using literature data and allows analyzing ride performance by changing parameters and checking the vehicle's response over different tracks.
The document summarizes research conducted on utilizing foundry waste sand as a masonry mortar. Three types of foundry waste sand were tested: burnt black sand, weathered sand, and currently used sand. Their physical properties were compared to local construction sand. Specimens were cast to test the compressive strength of masonry mortar mixes made with the different sands. Compressive strengths for local sand were 5.10, 3.70, and 3.80 N/mm2, while strengths for weathered sand were 4.60, 2.95, and 3.23 N/mm2, meeting the required minimum of 3-5 N/mm2. This indicates foundry waste sand can be used as a masonry mortar.
This document presents a case study on rooftop rainwater harvesting at the Shivajirao S. Jondhale College of Engineering and Technology campus in Asangaon, India. It calculates the annual water demand for the campus based on student and staff population. It also calculates the potential water collection from the rooftops of two buildings based on their catchment area, average rainfall height, and runoff coefficient. The results show that the amount of water that could be collected from rooftop rainwater harvesting would meet and exceed the total annual campus water demand. Rooftop rainwater harvesting is presented as a low-cost and effective solution to meet the college's water needs while conserving water resources.
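The supply-side calculation described above reduces to one formula: harvestable volume = catchment area x rainfall depth x runoff coefficient. A sketch with hypothetical numbers, since the campus's actual figures are not reproduced here:

```python
def rooftop_yield_m3(catchment_m2, rainfall_m, runoff_coeff):
    """Annual harvestable volume (m^3): area x rainfall depth x runoff coefficient."""
    return catchment_m2 * rainfall_m * runoff_coeff

# Hypothetical building: 1200 m^2 roof, 2.5 m average annual rainfall,
# runoff coefficient 0.85 (illustrative values only, not the case study's data)
volume = rooftop_yield_m3(1200.0, 2.5, 0.85)
print(volume)  # 2550.0 m^3, i.e. 2,550,000 litres
```

Comparing the summed yield of all rooftops against the population-based annual demand is then a single subtraction, which is how the case study concludes that harvesting meets and exceeds demand.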
This document summarizes previous literature on traffic analysis and congestion modeling in mobile networks. It reviews works that have evaluated network performance at different elements like the BTS, BSC and MSC. However, none addressed congestion at all three basic elements (BTS, BSC, MSC) to characterize end-to-end connections, or used busy hour traffic data to adequately dimension network elements. The document also identifies gaps in the existing research, such as not establishing the statistical causes of congestion or using sufficient data. It proposes to analyze traffic at the access and core networks using live network data over two years to help dimension elements and identify congestion causes to develop an accurate congestion prediction model.
The document describes a proposed modification to the conventional Booth multiplier that aims to increase its speed by applying concepts from Vedic mathematics. Specifically, it utilizes the Urdhva Tiryakbhyam formula to generate all partial products concurrently rather than sequentially. The proposed 8x8 bit multiplier was coded in VHDL, simulated, and found to have a path delay 44.35% lower than a conventional Booth multiplier, demonstrating its potential for higher speed.
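The "vertically and crosswise" (Urdhva Tiryakbhyam) idea is that every output column's partial products can be formed independently, hence concurrently in hardware, leaving only carry propagation sequential. A small Python model of the column arithmetic, as a software sketch of the concept rather than the paper's VHDL:

```python
def urdhva_multiply(a_digits, b_digits, base=2):
    """Urdhva Tiryakbhyam: form all cross-product columns independently
    (concurrently, in hardware), then propagate carries.
    Digits are least-significant first."""
    n = len(a_digits)
    cols = [0] * (2 * n - 1)
    for i in range(n):                 # in hardware, these column sums
        for j in range(n):             # are generated in parallel
            cols[i + j] += a_digits[i] * b_digits[j]
    out, carry = [], 0
    for c in cols:                     # the only sequential step
        carry, digit = divmod(c + carry, base)
        out.append(digit)
    while carry:
        carry, digit = divmod(carry, base)
        out.append(digit)
    return out

def to_bits(x, width=8):
    return [(x >> i) & 1 for i in range(width)]

def to_int(digits, base=2):
    return sum(d * base ** i for i, d in enumerate(digits))

print(to_int(urdhva_multiply(to_bits(181), to_bits(110))) == 181 * 110)
```

In the proposed hardware, each `cols[k]` corresponds to an adder tree fed by concurrently generated partial products, which is where the reported path-delay reduction over the sequential Booth recoding comes from.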
A Power Flow Analysis of the Nigerian 330 kV Electric Power System (IOSR Journals)
This document analyzes the power flow of Nigeria's 330 kV electric power system. It describes the system, which includes 30 buses and 9 generating stations connected by 330 kV transmission lines totaling 5000 km. The document aims to determine causes of power failures in Nigeria and to evaluate individual bus voltages to ensure they remain within statutory limits. A power flow analysis is conducted using the Newton-Raphson method in MATLAB. Results show several bus voltages outside limits. Capacitive compensation is implemented on the problem buses, improving voltages to acceptable levels and enhancing system efficiency.
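The core of such a study is solving the nonlinear power-balance equations by Newton-Raphson iteration. A minimal two-bus sketch of that computation in Python; the per-unit values are illustrative and are not Nigerian network data:

```python
import numpy as np

# Two-bus toy system: a slack bus (V1 = 1.0 pu) supplies a load bus over a
# purely reactive line (x = 0.1 pu). Unknowns: angle difference th and
# load-bus voltage V. All numbers are illustrative per-unit values.
V1, x = 1.0, 0.1
P_sched, Q_sched = 0.5, 0.2            # power that must arrive at the load bus

def mismatch(u):
    th, V = u
    P = V1 * V * np.sin(th) / x                  # active power delivered
    Q = (V1 * V * np.cos(th) - V ** 2) / x       # reactive power delivered
    return np.array([P - P_sched, Q - Q_sched])

def newton_raphson(f, u0, tol=1e-8, max_iter=30):
    u = np.asarray(u0, float)
    for _ in range(max_iter):
        F = f(u)
        if np.max(np.abs(F)) < tol:
            break
        J = np.empty((u.size, u.size))           # finite-difference Jacobian
        for j in range(u.size):
            du = u.copy()
            du[j] += 1e-7
            J[:, j] = (f(du) - F) / 1e-7
        u = u - np.linalg.solve(J, F)            # Newton step
    return u

theta, V = newton_raphson(mismatch, [0.0, 1.0])  # flat start
print(round(V, 4), round(np.degrees(theta), 2))
```

A full 30-bus study is the same iteration with a 30-bus admittance matrix and mismatch vector; the converged `V` values are what get checked against the statutory voltage limits.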
Crack Detection for Various Loading Conditions in Beam Using Hilbert-Huang ... (IOSR Journals)
The document discusses crack detection in beams using the Hilbert-Huang transform (HHT). It first provides background on using vibration-based methods to detect structural damage. It then describes modeling a cracked beam using finite element analysis, representing the crack as a rotational spring. Vibration analysis is performed on simply supported, fixed-fixed, free-free, and cantilever beams with cracks. HHT is applied to the transformed response to determine crack location based on changes in spatial variation. Both analytical and experimental results show good agreement with the model and that HHT is effective for analysis.
Performance Evaluation of the Bingo Electronic Voting Protocol (IOSR Journals)
The document analyzes the computational performance of a prototype implementation of the Bingo electronic voting protocol. Bingo is an end-to-end verifiable voting protocol that provides coercion resistance through the use of zero-knowledge proofs and commitments. The analysis evaluates four main operations of the Bingo protocol: initialization of cyclic groups, generation of dummy votes, zero-knowledge proofs of fair vote distribution, and zero-knowledge proofs of receipt correctness. The performance was found to be affected by the size of the cyclic group order, number of candidates, and number of voters, with the cyclic group order having the largest impact on computation time.
This document discusses a study of arbuscular mycorrhizal (AM) fungi associated with Pandanus fascicularis, a plant species found in coastal regions of Konkan, Maharashtra, India. The key findings are:
1) All samples of P. fascicularis roots were colonized by AM fungi, with colonization rates ranging from 39-74%.
2) A total of 1275 AM fungal spores were isolated from soil samples, with spore densities ranging from 39-890 spores per 100g of soil.
3) Thirteen AM fungal morphospecies were identified, with Kuklospora colombiana being the most widely distributed.
DESIGN AND DEVELOPMENT OF BUSINESS RULES MANAGEMENT SYSTEM (BRMS) USING ATLAN... (ijcsit)
The document describes the design and development of a Business Rules Management System (BRMS) using the ATL and Eclipse Sirius frameworks. It proposes a new "Target Ecore meta model" to improve the structure and management of business rules. The system allows business rules to be modeled and transformed from their current format into an object-oriented format using ATL model transformations. This provides improved modularity, scalability and extensibility of the rules compared to the original structure. A case study demonstrates transforming an example business rule from a software package based on the proposed approach.
This document discusses the use of Model-Driven Architecture (MDA) and model transformations in software product lines (SPL). It begins by introducing SPLs and MDA. SPLs aim to increase productivity by leveraging commonalities between related products. MDA uses platform-independent and platform-specific models with transformations between them. The document then explores combining MDA and SPL approaches through the Modden framework and Baseline-Oriented Modeling. Modden develops reusable core assets through domain and application engineering processes with MDA. Baseline-Oriented Modeling produces expert systems as PRISMA architectural models from SPLs using MDA.
Object Oriented Approach for Software Development (Rishabh Soni)
This document provides an overview of object-oriented design methodologies. It discusses key object-oriented concepts like abstraction, encapsulation, and polymorphism. It also describes the three main models used in object-oriented analysis: the object model, dynamic model, and functional model. Finally, it outlines the typical stages of the object-oriented development life cycle, including system conception, analysis, system design, class design, and implementation.
Service Oriented & Model Driven Architectures (Pankaj Saharan)
The document discusses a seminar on combining Service-Oriented Architectures (SOA) and Model Driven Architectures (MDA). It first provides an overview of SOA and MDA individually, including their goals and characteristics. It then analyzes the similarities and differences between the two approaches when combining them. Some benefits are improved productivity and lower costs, while challenges include defining models and transformations between levels of abstraction. Overall, the document concludes that combining SOA with MDA's model-driven approach can provide benefits like business agility if key issues like semantics and metadata modeling are adequately addressed.
Integrating profiling into MDE compilers (ijseajournal)
Scientific computation requires more and more performance in its algorithms. New massively parallel architectures suit these algorithms well, as they are known for offering high performance and power efficiency. Unfortunately, because parallel programming for these architectures requires a complex distribution of tasks and data, developers find it difficult to implement their applications effectively. Although source-to-source approaches intend to provide a low learning curve for parallel programming and take advantage of architecture features to create optimized applications, programming remains difficult for neophytes. This work aims at improving performance by feeding back into the high-level models specific execution data from a profiling tool, enhanced by smart advice computed by an analysis engine. In order to keep the link between execution and model, the process is based on a traceability mechanism. Once the model is automatically annotated, it can be refactored, aiming at better performance of the regenerated code. Hence, this work keeps coherence between model and code without forgetting to harness the power of parallel architectures. To illustrate and clarify the key points of this approach, we provide an experimental example in the GPU context. The example uses a transformation chain from UML-MARTE models to OpenCL code.
Project Plan: For our Project Plan, we are going to develop.docx (wkyra78)
The document outlines a project plan to develop a new payroll system for an organization using the waterfall development methodology. It estimates the project will take 33.3 person-months to complete based on industry standards of allocating 15% of effort to planning, 20% to analysis, 35% to design, and 30% to implementation. It also discusses developing a work plan, staffing the project, and coordinating project activities to manage the system development life cycle.
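The effort allocation is simple arithmetic over the total estimate; a sketch using the percentages quoted above:

```python
# Split a total estimate across SDLC phases using the quoted industry ratios
total_person_months = 33.3
ratios = {"planning": 0.15, "analysis": 0.20, "design": 0.35, "implementation": 0.30}

effort = {phase: total_person_months * share for phase, share in ratios.items()}
for phase, pm in effort.items():
    print(f"{phase:>14}: {pm:5.2f} person-months")
```

The same table, divided by an assumed staffing level, gives the calendar duration per phase, which is how such plans translate effort estimates into a waterfall schedule.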
BPM-X Pattern-based model transformations (v2) (BPM-Xchange)
Model data conversions can be achieved with a pattern-based transformation engine, a component included in the BPM-Xchange® enterprise application integration (EAI) software.
This document discusses implementing the MAP framework within the GERAM framework. MAP defines the roles and deliverables of a software architect, while GERAM is a meta-framework for enterprise architectures. The document argues that by mapping MAP to GERAM, it provides a common ontology for software architects across different enterprise architecture and software development frameworks. It describes how MAP and GERAM can be implemented using three layers (M0-M3), with MAP defining the meta-ontological concepts and processes that can then be customized for specific companies or software processes.
PYFML - A TEXTUAL LANGUAGE FOR FEATURE MODELING (ijseajournal)
The document describes PyFML, a new textual feature modeling language based on Python. PyFML aims to generalize classical feature models by supporting extensions like cardinalities, attributes, and complex cross-tree constraints. It uses the textX meta-language to build the PyFML grammar and maps feature models to a Python object representation that can be analyzed by the PyConstraint solver. The language is designed to provide a human-readable notation for feature models while enabling scalable analysis of large problems.
MASRML - A DOMAIN-SPECIFIC MODELING LANGUAGE FOR MULTI-AGENT SYSTEMS REQUIREM... (ijseajournal)
MASRML – Multi-Agent Systems Requirements Modeling Language – is a UML-based Domain-Specific Modeling Language conceived for the requirements modeling in multi-agent system projects. Along this work the extended metamodel developed to support the language is described and the applicability of this DSML in the requirements identification of a multi-agent system is demonstrated using the new mechanisms produced to model specific functional requirements for this kind of system. This work also includes how the DSML was validated and the impressions collected during the validations.
Modeling and Evaluation of Performance and Reliability of Component-based So... (Editor IJCATR)
Validation of software systems is very useful at the primary stages of their development cycle. Evaluation of functional requirements is supported by clear and appropriate approaches, but there is no similar strategy for the evaluation of non-functional requirements such as performance and reliability. Since establishing the non-functional requirements has a significant effect on the success of software systems, considerable effort is needed for their evaluation. Also, if software performance has been specified based on performance models, it may be evaluated at the primary stages of the software development cycle. Therefore, modeling and evaluation of non-functional requirements at the software architecture level, which is designed at the primary stages of the development cycle and prior to implementation, will be very effective.
We propose an approach for evaluating the performance and reliability of software systems based on formal models (hierarchical timed colored Petri nets) at the software architecture level. In this approach, the software architecture is described by UML use case, activity, and component diagrams; the UML model is then transformed into an executable model based on hierarchical timed colored Petri nets (HTCPN) by a proposed algorithm. Consequently, upon execution of the executable model and analysis of its results, non-functional requirements including performance (such as response time) and reliability may be evaluated at the software architecture level.
The document discusses model-oriented approaches, BPMN 2.0, and Enterprise 2.0. It describes the model-driven architecture approach, its history and key aspects. It outlines the different types of diagrams in BPMN 2.0 including processes, collaborations, and choreographies. It defines the key elements of Enterprise 2.0 including search, links, authoring, tags, and social functions. It also lists example tools that support these approaches.
Planner Application Based on Customer Relationship Management (IRJET Journal)
This document describes a project to build a calendar or planner application for managers in a customer relationship management (CRM) system. The goals are to aggregate task and appointment data from different CRM modules into an organized calendar view to improve productivity and visibility. The planner will have daily, weekly, and monthly views. It will be developed using React and integrate with an existing auto dealership CRM system to help managers and employees more easily track responsibilities and tasks. The methodology discusses using the FullCalendar module to implement the calendar functionality and integrating it with React components to build out the different views and functionality of the planner application.
This document discusses model-driven architecture (MDA), an approach to system specification and interoperability based on the use of formal models. MDA uses platform-independent models that are translated to platform-specific models using formal rules. Core MDA standards like UML, MOF, XMI, and CWM define the infrastructure. The vision is for nearly seamless interoperability based on shared metadata and formal model translations, with a long-term goal of adaptive object models that can dynamically interpret models at runtime.
The document provides an overview of object-oriented technology and software engineering approaches. It describes the structured and object-oriented approaches, the roles of modeling, notation, process and techniques in software development. It also summarizes the Unified Modeling Language (UML), Unified Process, View Alignment techniques, and the Visual Paradigm for UML (VP-UML) CASE tool.
Process perspective is valuable, but far too much time is wasted in detailed process modelling with too little benefit. Presents an approach that delivers high benefits for less effort.
Automatic generation of business process models from user storiesIJECEIAES
In this paper, we propose an automated approach to extract business process models from requirements, which are presented as user stories. In agile software development, the user story is a simple description of the functionality of the software. It is presented from the user's point of view and is written in natural language. Acceptance criteria are a list of specifications on how a new software feature is expected to operate. Our approach analyzes a set of acceptance criteria accompanying the user story, in order, first, to automatically generate the components of the business model, and then to produce the business model as an activity diagram which is a unified modeling language (UML) behavioral diagram. We start with the use of natural language processing (NLP) techniques to extract the elements necessary to define the rules for retrieving artifacts from the business model. These rules are then developed in Prolog language and imported into Python code. The proposed approach was evaluated on a set of use cases using different performance measures. The results indicate that our method is capable of generating correct and accurate process models.
WEB PORTAL INTEGRATION ARCHITECTURE APPROACHESijwscjournal
This document proposes a service-oriented architecture approach for web portal integration. It begins by describing a platform-independent integration architecture based on standards like UML, MOF, and XMI. This allows modeling integration independently of specific technologies. The document then discusses using WSMO as a specific implementation, describing how the platform-independent layers map to WSMO. It also discusses integrating agents and semantic web services using model transformations. Finally, it proposes additional "X-as-a-Service" layers that could be part of the architecture, like structure-as-a-service and process-as-a-service.
WEB PORTAL INTEGRATION ARCHITECTURE APPROACHESijwscjournal
Enterprise Modelling with Web portal integration architecture requires investment of advanced architectural thinking into definition of services before any development of services or service consumers can begin. Service Oriented Architecture (SOA) is gradually replacing monolithic architecture as the premier design principle for new business applications with its inherently systematic nature and capability. Earlier efforts of notable styles of SOA such as CORBA and XATMI have failed to be adopted as main stream projects because of demanding design process requirement with sense-making activities and even have been residing with the modern SOA or Web services middleware. In this paper it is aimed to incorporate sensemaking design activities with the proposed semantic web service based architecture. This paper tries to tackle the above problem by proposing a service-oriented architecture for web data and service integration. A gen-Spec architectural pattern has been suggested and adopted in order to tackle the problem.
Firstly, it proposes a service-oriented platform independent architecture and Secondly, it presents a specific deployment of such architecture for data and service integration on the web using semantic web services implemented with the WSMO (Web Services Modeling Ontology).
This document provides a technical review of secure banking using RSA and AES encryption methodologies. It discusses how RSA and AES are commonly used encryption standards for secure data transmission between ATMs and bank servers. The document first provides background on ATM security measures and risks of attacks. It then reviews related work analyzing encryption techniques. The document proposes using a one-time password in addition to a PIN for ATM authentication. It concludes that implementing encryption standards like RSA and AES can make transactions more secure and build trust in online banking.
This document analyzes the performance of various modulation schemes for achieving energy efficient communication over fading channels in wireless sensor networks. It finds that for long transmission distances, low-order modulations like BPSK are optimal due to their lower SNR requirements. However, as transmission distance decreases, higher-order modulations like 16-QAM and 64-QAM become more optimal since they can transmit more bits per symbol, outweighing their higher SNR needs. Simulations show lifetime extensions up to 550% are possible in short-range networks by using higher-order modulations instead of just BPSK. The optimal modulation depends on transmission distance and balancing the energy used by electronic components versus power amplifiers.
This document provides a review of mobility management techniques in vehicular ad hoc networks (VANETs). It discusses three modes of communication in VANETs: vehicle-to-infrastructure (V2I), vehicle-to-vehicle (V2V), and hybrid vehicle (HV) communication. For each communication mode, different mobility management schemes are required due to their unique characteristics. The document also discusses mobility management challenges in VANETs and outlines some open research issues in improving mobility management for seamless communication in these dynamic networks.
This document provides a review of different techniques for segmenting brain MRI images to detect tumors. It compares the K-means and Fuzzy C-means clustering algorithms. K-means is an exclusive clustering algorithm that groups data points into distinct clusters, while Fuzzy C-means is an overlapping clustering algorithm that allows data points to belong to multiple clusters. The document finds that Fuzzy C-means requires more time for brain tumor detection compared to other methods like hierarchical clustering or K-means. It also reviews related work applying these clustering algorithms to segment brain MRI images.
1) The document simulates and compares the performance of AODV and DSDV routing protocols in a mobile ad hoc network under three conditions: when users are fixed, when users move towards the base station, and when users move away from the base station.
2) The results show that both protocols have higher packet delivery and lower packet loss when users are either fixed or moving towards the base station, since signal strength is better in those scenarios. Performance degrades when users move away from the base station due to weaker signals.
3) AODV generally has better performance than DSDV, with higher throughput and packet delivery rates observed across the different user mobility conditions.
This document describes the design and implementation of 4-bit QPSK and 256-bit QAM modulation techniques using MATLAB. It compares the two techniques based on SNR, BER, and efficiency. The key steps of implementing each technique in MATLAB are outlined, including generating random bits, modulation, adding noise, and measuring BER. Simulation results show scatter plots and eye diagrams of the modulated signals. A table compares the results, showing that 256-bit QAM provides better performance than 4-bit QPSK. The document concludes that QAM modulation is more effective for digital transmission systems.
The document proposes a hybrid technique using Anisotropic Scale Invariant Feature Transform (A-SIFT) and Robust Ensemble Support Vector Machine (RESVM) to accurately identify faces in images. A-SIFT improves upon traditional SIFT by applying anisotropic scaling to extract richer directional keypoints. Keypoints are processed with RESVM and hypothesis testing to increase accuracy above 95% by repeatedly reprocessing images until the threshold is met. The technique was tested on similar and different facial images and achieved better results than SIFT in retrieval time and reduced keypoints.
This document studies the effects of dielectric superstrate thickness on microstrip patch antenna parameters. Three types of probes-fed patch antennas (rectangular, circular, and square) were designed to operate at 2.4 GHz using Arlondiclad 880 substrate. The antennas were tested with and without an Arlondiclad 880 superstrate of varying thicknesses. It was found that adding a superstrate slightly degraded performance by lowering the resonant frequency and increasing return loss and VSWR, while decreasing bandwidth and gain. Specifically, increasing the superstrate thickness or dielectric constant resulted in greater changes to the antenna parameters.
This document describes a wireless environment monitoring system that utilizes soil energy as a sustainable power source for wireless sensors. The system uses a microbial fuel cell to generate electricity from the microbial activity in soil. Two microbial fuel cells were created using different soil types and various additives to produce different current and voltage outputs. An electronic circuit was designed on a printed circuit board with components like a microcontroller and ZigBee transceiver. Sensors for temperature and humidity were connected to the circuit to monitor the environment wirelessly. The system provides a low-cost way to power remote sensors without needing battery replacement and avoids the high costs of wiring a power source.
1) The document proposes a model for a frequency tunable inverted-F antenna that uses ferrite material.
2) The resonant frequency of the antenna can be significantly shifted from 2.41GHz to 3.15GHz, a 31% shift, by increasing the static magnetic field placed on the ferrite material.
3) Altering the permeability of the ferrite allows tuning of the antenna's resonant frequency without changing the physical dimensions, providing flexibility to operate over a wide frequency range.
This document summarizes a research paper that presents a speech enhancement method using stationary wavelet transform. The method first classifies speech into voiced, unvoiced, and silence regions based on short-time energy. It then applies different thresholding techniques to the wavelet coefficients of each region - modified hard thresholding for voiced speech, semi-soft thresholding for unvoiced speech, and setting coefficients to zero for silence. Experimental results using speech from the TIMIT database corrupted with white Gaussian noise at various SNR levels show improved performance over other popular denoising methods.
This document reviews the design of an energy-optimized wireless sensor node that encrypts data for transmission. It discusses how sensing schemes that group nodes into clusters and transmit aggregated data can reduce energy consumption compared to individual node transmissions. The proposed node design calculates the minimum transmission power needed based on received signal strength and uses a periodic sleep/wake cycle to optimize energy when not sensing or transmitting. It aims to encrypt data at both the node and network level to further optimize energy usage for wireless communication.
This document discusses group consumption modes. It analyzes factors that impact group consumption, including external environmental factors like technological developments enabling new forms of online and offline interactions, as well as internal motivational factors at both the group and individual level. The document then proposes that group consumption modes can be divided into four types based on two dimensions: vertical (group relationship intensity) and horizontal (consumption action period). These four types are instrument-oriented, information-oriented, enjoyment-oriented, and relationship-oriented consumption modes. Finally, the document notes that consumption modes are dynamic and can evolve over time.
The document summarizes a study of different microstrip patch antenna configurations with slotted ground planes. Three antenna designs were proposed and their performance evaluated through simulation: a conventional square patch, an elliptical patch, and a star-shaped patch. All antennas were mounted on an FR4 substrate. The effects of adding different slot patterns to the ground plane on resonance frequency, bandwidth, gain and efficiency were analyzed parametrically. Key findings were that reshaping the patch and adding slots increased bandwidth and shifted resonance frequency. The elliptical and star patches in particular performed better than the conventional design. Three antenna configurations were selected for fabrication and measurement based on the simulations: a conventional patch with a slot under the patch, an elliptical patch with slots
1) The document describes a study conducted to improve call drop rates in a GSM network through RF optimization.
2) Drive testing was performed before and after optimization using TEMS software to record network parameters like RxLevel, RxQuality, and events.
3) Analysis found call drops were occurring due to issues like handover failures between sectors, interference from adjacent channels, and overshooting due to antenna tilt.
4) Corrective actions taken included defining neighbors between sectors, adjusting frequencies to reduce interference, and lowering the mechanical tilt of an antenna.
5) Post-optimization drive testing showed improvements in RxLevel, RxQuality, and a reduction in dropped calls.
This document describes the design of an intelligent autonomous wheeled robot that uses RF transmission for communication. The robot has two modes - automatic mode where it can make its own decisions, and user control mode where a user can control it remotely. It is designed using a microcontroller and can perform tasks like object recognition using computer vision and color detection in MATLAB, as well as wall painting using pneumatic systems. The robot's movement is controlled by DC motors and it uses sensors like ultrasonic sensors and gas sensors to navigate autonomously. RF transmission allows communication between the robot and a remote control unit. The overall aim is to develop a low-cost robotic system for industrial applications like material handling.
This document reviews cryptography techniques to secure the Ad-hoc On-Demand Distance Vector (AODV) routing protocol in mobile ad-hoc networks. It discusses various types of attacks on AODV like impersonation, denial of service, eavesdropping, black hole attacks, wormhole attacks, and Sybil attacks. It then proposes using the RC6 cryptography algorithm to secure AODV by encrypting data packets and detecting and removing malicious nodes launching black hole attacks. Simulation results show that after applying RC6, the packet delivery ratio and throughput of AODV increase while delay decreases, improving the security and performance of the network under attack.
This document discusses image deblurring techniques. It begins by introducing image restoration and focusing on image deblurring. It then discusses challenges with image deblurring being an ill-posed problem. It reviews existing approaches to screen image deconvolution including estimating point spread functions and iteratively estimating blur kernels and sharp images. The document also discusses handling spatially variant blur and summarizes the relationship between the proposed method and previous work for different blur types. It proposes using color filters in the aperture to exploit parallax cues for segmentation and blur estimation. Finally, it proposes moving the image sensor circularly during exposure to prevent high frequency attenuation from motion blur.
This document describes modeling an adaptive controller for an aircraft roll control system using PID, fuzzy-PID, and genetic algorithm. It begins by introducing the aircraft roll control system and motivation for developing an adaptive controller to minimize errors from noisy analog sensor signals. It then provides the mathematical model of aircraft roll dynamics and describes modeling the real-time flight control system in MATLAB/Simulink. The document evaluates PID, fuzzy-PID, and PID-GA (genetic algorithm) controllers for aircraft roll control and finds that the PID-GA controller delivers the best performance.
This document provides a review of synthetic aperture radar (SAR) engineering. It begins with an introduction to SAR and its uses in remote sensing and defense. It then discusses designs for SAR systems and antennas. The document reviews recent literature on SAR, including works discussing antenna mask design to optimize SAR performance, polarimetric SAR for mapping terrain changes, and using SAR to monitor cryospheric regions. It concludes that SAR is a useful technique for achieving good image quality through optimized antenna design.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
This presentation provides valuable insights into effective cost-saving techniques on AWS. Learn how to optimize your AWS resources by rightsizing, increasing elasticity, picking the right storage class, and choosing the best pricing model. Additionally, discover essential governance mechanisms to ensure continuous cost efficiency. Whether you are new to AWS or an experienced user, this presentation provides clear and practical tips to help you reduce your cloud costs and get the most out of your budget.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
Skybuffer SAM4U tool for SAP license adoptionTatiana Kojar
Manage and optimize your license adoption and consumption with SAM4U, an SAP free customer software asset management tool.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
A Comprehensive Guide to DeFi Development Services in 2024Intelisync
DeFi represents a paradigm shift in the financial industry. Instead of relying on traditional, centralized institutions like banks, DeFi leverages blockchain technology to create a decentralized network of financial services. This means that financial transactions can occur directly between parties, without intermediaries, using smart contracts on platforms like Ethereum.
In 2024, we are witnessing an explosion of new DeFi projects and protocols, each pushing the boundaries of what’s possible in finance.
In summary, DeFi in 2024 is not just a trend; it’s a revolution that democratizes finance, enhances security and transparency, and fosters continuous innovation. As we proceed through this presentation, we'll explore the various components and services of DeFi in detail, shedding light on how they are transforming the financial landscape.
At Intelisync, we specialize in providing comprehensive DeFi development services tailored to meet the unique needs of our clients. From smart contract development to dApp creation and security audits, we ensure that your DeFi project is built with innovation, security, and scalability in mind. Trust Intelisync to guide you through the intricate landscape of decentralized finance and unlock the full potential of blockchain technology.
Ready to take your DeFi project to the next level? Partner with Intelisync for expert DeFi development services today!
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of a presentation to the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on Communications and signalling systems on Railways, which was held in Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). Attended by around 500 participants and 200 on-line followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfChart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Building Production Ready Search Pipelines with Spark and Milvus
F1803053945
IOSR Journal of Computer Engineering (IOSR-JCE)
e-ISSN: 2278-0661,p-ISSN: 2278-8727, Volume 18, Issue 3, Ver. V (May-Jun. 2016), PP 39-45
www.iosrjournals.org
DOI: 10.9790/0661-1803053945 www.iosrjournals.org 39 | Page
Tool Support for the Service Oriented Modelling
Mr. N. Ramkumar
(Department of Information Technology), Sri Shakthi Institute of Engineering and Technology, Coimbatore,
India.
Abstract: Tool Support for Service-Oriented Modelling is a technical project whose aim is to model the changes
to the existing metamodel, to define the grammar for the external policy and the coordination, and to compile it
in the SRML editor. Major components such as the wires, the business protocol, the business role and the
transition have already been implemented in the SRML model. In the existing metamodel, however, a few
components have not yet been implemented: the parts of the interaction protocol, i.e. the coordination and the
external policy. The main task of this project is to implement these two properties in the existing metamodel.
The other part of the project is to define the grammar for the coordination and the interaction protocol and to
compile it in the SRML editor. The grammars are defined in such a way that the user can edit the model and view
the changes in the editor. Using the defined rules, the metamodel can be generated with the Eclipse tool.
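As an illustration of what such grammar rules could look like, the following is a minimal Xtext sketch; the rule and property names are hypothetical and are not taken from the actual SRML grammar:

```xtext
// Hypothetical Xtext rules sketching concrete syntax for the two
// missing interaction-protocol parts; all names are illustrative only.
Coordination:
    'coordination' name=ID '{'
        events+=ID*
    '}';

ExternalPolicy:
    'policy' name=ID '{'
        ('requires' requirement=STRING)?
    '}';
```

From rules of this kind, Xtext derives both an Ecore metamodel and a text editor, which is what allows the user to edit models and see the changes reflected in the SRML editor.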
Keywords: SRML Editor, Metamodel
I. Introduction
The SENSORIA project (Software Engineering for Service-Oriented Overlay Computers) aims to develop a
novel and comprehensive approach to engineering service-oriented computations. Its main concerns are the
specification, design, and reconfiguration of service-based architectures. In this setting, a system consists of
components and their interconnections, together with the current state [1].
SRML is a modelling language, grounded in mathematical foundations, that addresses both qualitative and quantitative aspects of service-oriented systems. SRML offers primitives for modelling business services and activities, together with a methodological approach that the language supports. SRML addresses Service-Oriented Computing (SOC) as a new paradigm in which interactions are no longer based on fixed or programmed exchanges of products with particular parties (known as clientship in object-oriented programming) but on the provisioning of services by external providers, procured on the fly subject to the negotiation of service-level agreements (SLAs). More precisely, the processes of discovery and selection of the services an application requires are not coded at design time but performed by the middleware according to functional and non-functional requirements.
One of the key concerns in defining SRML was precisely the need to distinguish between these two different modes of composition. For SOC, we decided to follow the basic principles and structures put forward by the Service Component Architecture (SCA). However, SCA addresses low-level design, in the sense that it provides an assembly model and binding mechanisms for service components and clients programmed in specific languages. SRML, instead, provides primitives that address high-level design and support a shift of emphasis from programming to modelling and from component interoperability to business integration.
SRML is technology-agnostic in the sense that it does not commit to any specific language or platform for programming and composing services. For high-level business design, SRML builds on traditional use-case diagrams, which can be extended to support the specificities of service-oriented software engineering.
In this paper, the starting point is the existing metamodel, in which part of the interaction protocol, namely the coordination, and the external policy have not yet been implemented. The main task of the project is to extend the existing metamodel with these two properties. The other tasks of the project are the transformation from SRML models to business process languages using BPEL and the improvement of the graphical representation of SRML using visual graphical editors such as Eclipse.
In order to extend the metamodel, the rules are defined as a grammar (an Xtext grammar) that is used to implement the coordination and the external policy. For the transformation from SRML models to business processes, the Business Process Execution Language (BPEL) will be used.
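As a sketch of how such rules might look, the fragment below shows hypothetical Xtext grammar rules for the coordination and the external policy; the rule names, keywords and feature names are illustrative assumptions, not the project's actual grammar.

```xtext
// Hypothetical Xtext rules sketching how Coordination and ExternalPolicy
// could be added to an SRML grammar. All names here are illustrative.
InteractionProtocol:
    'interaction' 'protocol' name=ID '{'
        interactions+=Interaction*
        coordination=Coordination?
    '}';

Coordination:
    'coordination' '{' behaviour=ID '}';

ExternalPolicy:
    'external' 'policy' name=ID '{'
        slaVariables+=SlaVariable*
        constraints+=Constraint*
    '}';

SlaVariable:
    'sla' name=ID ';';
```

From rules of this shape, Xtext can infer the corresponding Ecore classes (InteractionProtocol, Coordination, ExternalPolicy) with containment references for each `+=` assignment.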
II. Objective
1. Extending the existing metamodel:
This step was completed within the given time frame.
2. Transformation from SRML models to business process languages:
Owing to the limited time, this report provides background knowledge about BPEL and SRML for future reference; the implementation of the SRML-to-BPEL transformation is left for future work.
3. Improving the graphical representation of the SRML metamodel:
Owing to the limited time, this report provides background knowledge about graphical representation and the concepts of GMF for future reference; the implementation of the graphical representation is left for future work.
The main purpose of the project is to implement the interaction protocol and the external policy in the existing metamodel using the Xtext grammar. Within the interaction protocol, only the coordination part is not yet implemented, so the main objective is to implement it. A further objective is to study the existing case-study example (Mortgage). The remaining objectives are the transformation from SRML models to business process languages using BPEL and the improvement of the graphical representation of SRML using visual graphical editors such as Eclipse.
Metamodelling is the construction of a collection of concepts (things, terms, etc.) within a certain domain. A metamodel is yet another abstraction, highlighting properties of the model itself. A model conforms to its metamodel in the way that a computer program conforms to the grammar of its programming language. [2]
Common uses of a metamodel are as a schema for semantic data that needs to be exchanged or stored, as a language that supports a particular method or process, and as a language for expressing additional semantics of existing information. Metadata modelling is a type of metamodelling used in software and systems engineering for the analysis and construction of models applicable and useful to some predefined class of problems.
In software engineering, several types of metamodelling can be distinguished: metadata modelling, meta-process modelling, executable metamodelling and model transformation languages.
Why extend the metamodel?
The status of the SRML metamodel can already be described as pre-final, while the development of the SRML editor is ongoing. Until now, only SRML modules can be represented graphically; the specification of business roles, business protocols and interaction protocols still has to be done in a tree-based view. Generating the extended properties views in GMF is still under development, so the generated editor code may be extended by hand to provide user-friendly editing support for certain features. The metamodel is extended in order to overcome some of these limitations. [8]
In this paper, the previously created metamodel, the Ecore metamodel, is taken as the starting point and extended. The Ecore metamodel is a powerful tool for model-driven architecture. Typically, for a given application, we would define the objects, their attributes and their relationships. We can also define the specific operations that belong to those objects using the EOperation model element. By default, EMF generates the method signatures for those operations, but we then have to go back and implement them, often rewriting similar logic each time. [2]
To add arbitrary implementation behaviour to the model, one approach is to attach text-based annotations of type EAnnotation to the model objects and to interpret them in templates during code generation. However, our aim is not to validate model elements but rather to model the implementation itself; in order to reuse those metamodel elements with any concrete model, we need to extend the metamodel. [2]
There are several methods for extending an existing metamodel; adding a grammar to the existing metamodel gives the best result, so this methodology is followed here.
The recommended further step is to develop a transformation from SRML models to a business process execution language. So far only a transformation from BPEL to SRML exists; in this project the transformation from SRML models to BPEL is to be implemented.
Improving the graphical representation of SRML in visual editors such as Eclipse:
So far the graphical representation has not given good results: without a specific tool, models become disordered and do not give a clear picture. To improve the graphical representation, the Eclipse Visual Editor project is used; its goal is to advance the creation, evolution and promotion of the Eclipse Visual Editor platform and to cultivate both an open-source community and an ecosystem of products, capabilities and services. The most obvious tool that the Visual Editor draws upon is the Graphical Editing Framework (GEF). Behind the scenes, the Visual Editor uses the Eclipse Modelling Framework to map among the model, a Java class and the graphical representation.
III. Literature Survey
Xtext is used to create a small domain-specific language or a general-purpose programming language. With Xtext we can create our own languages, together with an Eclipse-based development environment that provides the editing experience of modern Java IDEs in a short amount of time. In this work, Xtext grammar rules are used in Eclipse to generate the metamodel. [3]
Eclipse is an open-source software development project whose goal is a wide range of exemplary, extensible development tools and tool-specific components for the Eclipse platform. Eclipse is used here to generate the class diagram; it is an efficient tool for this task.
IV. Technologies
SRML:
SRML is a modelling language for service-oriented systems. It operates at the higher level of abstraction of business modelling and provides semantic models that are independent of the languages and platforms in which services and programs are executed. SRML is a prototype modelling language and methodological approach for service-oriented systems that relies on mathematical domains of service composition and reconfiguration. SRML abstracts from the typical mechanisms made available by service-oriented middleware, such as sessions and event or message correlation, as well as from the brokers responsible for the discovery and binding of services. A formal computation and coordination model was developed for SRML, over which qualitative and quantitative analysis techniques were defined using the UMC model checker and the PEPA stochastic analyser. [1]
XTEXT:
Xtext can be used to create our own languages, and it provides good tool support for building a sophisticated Eclipse-based development environment with the editing experience of a Java IDE in a short period of time. Xtext is also called a language development framework and is a professional open-source project. Xtext provides a set of domain-specific languages and a modern API to describe the different aspects of a programming language, and it also gives a full implementation of that language running on the JVM. The grammar language is the cornerstone of Xtext: it is a domain-specific language, carefully designed for the description of textual languages, whose main idea is to describe the concrete syntax and how it is mapped to an in-memory model created during parsing. [3]
Xtext uses the lightweight dependency injection framework Google Guice to wire up the whole language as well as the IDE infrastructure. Xtext comes with default implementations, DSLs and APIs for the aspects that are common spots for customization, and Google Guice gives you the power to exchange every little class. [3]
Xtext is an Eclipse.org project. Besides many other advantages, this means there is no need to worry about IP issues, because the Eclipse Foundation has its own lawyers who take care that no intellectual property is violated. [3]
ECLIPSE MODELLING FRAMEWORK:
EMF is an integral part of the Eclipse platform, as well as a cornerstone of related technologies and frameworks such as the Eclipse Visual Editor, SDO and UML, many of which are integrated into IBM platforms like Rational Application Developer and WebSphere Business Modeler. In recent years, EMF has grown to encompass Java technology features such as enumerated types, annotations and generics. [2]
EMF is a modelling framework and code generation facility for building tools and other applications based on a structured data model. It provides runtime support to produce the Java classes for the model and a set of adapter classes that enable viewing and command-based editing of the model. Models can be specified using Java, XML documents or modelling tools like Rational Rose and then imported into EMF. The key point about EMF is that it provides the foundation for interoperability with other EMF-based tools and applications. [4]
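The getter/setter style of code that EMF generates can be illustrated with a hand-written sketch; real EMF output implements EObject and is created through an EFactory, so the interface, class and attribute names below only illustrate the pattern and are not actual generated code.

```java
// Hand-written sketch of the style of code EMF generates for a model class.
// Real EMF output implements EObject and goes through an EFactory; the names
// here (ExternalPolicy, its name attribute) follow the paper's metamodel,
// but the code itself is illustrative.
interface ExternalPolicy {
    String getName();
    void setName(String name);
}

class ExternalPolicyImpl implements ExternalPolicy {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

class EmfSketch {
    public static void main(String[] args) {
        ExternalPolicy policy = new ExternalPolicyImpl();
        policy.setName("MortgagePolicy");
        System.out.println(policy.getName()); // prints "MortgagePolicy"
    }
}
```

EMF generates such skeletons automatically; the developer's remaining work, as noted above, is filling in any operation bodies that carry real behaviour.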
XPAND:
Xpand is also part of Eclipse and ships with its own documentation. The second workflow component is an instance of org.eclipse.xpand2.Generator, which is the MWE2 facade to the Xpand template language. The Xpand generator needs to know which templates to invoke for which models. Qualified names in Xpand are separated by a double colon, and Xpand has a file statement that refers to outlets.
Xpand is a statically typed template language. It works much like the incremental Java compiler in Eclipse and is used for functional extensions, model transformation, model validation and much more. Its editor features include syntax colouring, error highlighting, navigation and code completion. Xpand was originally developed as part of the openArchitectureWare project before it became a component under Eclipse. [3]
BPEL:
Business Process Execution Language (BPEL) is an XML-based language for the formal specification of business processes and business interaction protocols. It defines a notation for business process behaviour based on web services. Business processes can be divided into two categories, executable business processes and business protocols (abstract processes), and BPEL is used to model the behaviour of both. Its scope includes the sequencing of process activities and the correlation of messages. [5] BPEL is often associated with Business Process Model and Notation (BPMN), which also seeks to streamline business process modelling. Unlike BPEL, BPMN is not executable and so is mostly used for planning and design; BPMN, though, has a visual component that makes it easier to understand, and some tools provide a visual notation for BPEL to further simplify the language. [14] BPEL and BPMN have grown in popularity together over the last few years as each seeks to simplify business process management and encourage collaboration between business people and developers, but translating from one to the other remains a challenge. [14]
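A minimal skeleton of such an executable process might look as follows; the process, partnerLink and operation names are illustrative and not taken from the paper's case study.

```xml
<!-- Minimal WS-BPEL 2.0 process skeleton. The process, partnerLink and
     operation names are illustrative assumptions. -->
<process name="MortgageProcess"
         targetNamespace="http://example.org/mortgage"
         xmlns="http://docs.oasis-open.org/wsbpel/2.0/process/executable">
  <sequence>
    <receive partnerLink="client" operation="requestMortgage"
             createInstance="yes"/>
    <!-- business logic: invoke partner services, assign data, etc. -->
    <reply partnerLink="client" operation="requestMortgage"/>
  </sequence>
</process>
```

The sequencing of the receive and reply activities inside the sequence element is exactly the kind of process behaviour that an SRML-to-BPEL transformation would have to produce.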
V. Methodology
5.1 ECORE METAMODEL:
The Ecore model, i.e. the metamodel of a textual representation, describes the structure of its abstract syntax tree. Ecore models are declared to be either inferred from the grammar or imported. [7] The Ecore metamodel is a powerful tool for designing model-driven architecture and can be used as a starting point for software development. Typically we would define the objects in our applications, their attributes and their relationships, as well as the specific operations that belong to those objects; those operations are added using EOperation elements. By default, EMF generates skeletons, or method signatures, for those operations, but we have to go back and implement them, often recoding similar logic each time. [7] Ecore is part of the modelling framework for Eclipse: according to the Eclipse Foundation, the core EMF framework includes a metamodel (Ecore) for describing models and runtime support for those models. In other words, Ecore defines the structure of the models that developers use to maintain application data. [7]
Creation of the Ecore diagram for the existing metamodel:
The existing model (TravelBooking) is considered, and the SRML Ecore file is run in the Eclipse tool in order to generate the Ecore diagram (metamodel). The Ecore diagram has a tree-like structure, which is used to check the properties defined in the class diagram. Eclipse can also generate the class diagram, but the better way is to generate the Ecore diagram. The Ecore metamodel is generated using the Eclipse Modelling Framework by loading the file to obtain the tree-like structure.
In the existing grammar, the changes are made, i.e. the external policy is added to the grammar, and the Ecore diagram is regenerated. The external policy and its properties, namely name, SlaVariables and constraints, are viewable in the Ecore diagram.
Fig. 1: SRML Model
After implementing the grammar rules, the external policy can be viewed in the above Ecore diagram, together with its properties: name, SlaVariables and constraints. The name is defined as a string.
After the evaluation of the metamodel, the two diagrams below are obtained: one for the interaction protocol and one for the external policy.
Fig. 2: Metamodel
The above metamodel shows the representation of the interaction protocol, which contains the interactions and the coordination; the interactions consist of Role A and Role B. After the set of grammar rules was defined, the metamodel was generated with the Eclipse tool, but the metamodel generated by the tool is very large, so the metamodel shown here was drawn using the UMLet tool.
The figure shows that each instance of type InteractionProtocol contains an instance of type Interaction and an instance of type Coordination. The black diamond represents composition; it is placed on the InteractionProtocol class because it is the interaction protocol that is composed of the interaction and the coordination. From the metamodel it is understood that the interaction and the coordination know about the interaction protocol. Composition is the strong form of containment, while aggregation is the weaker whole-part relationship.
Similarly, an instance of type Interaction contains an instance of type RoleA and an instance of type RoleB. An association is drawn between the Coordination and the BehaviouralSpecification; the arrowhead on the association denotes that the BehaviouralSpecification does not know anything about the Coordination. A dashed arrow is also seen at the Coordination in the diagram; this is a dependency relationship, meaning that the coordination somehow depends on the element it points to.
The cardinalities are as follows: the InteractionProtocol contains one Interaction and one Coordination, and the Interaction contains one RoleA and one RoleB; in other words, these are one-to-one relationships.
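The one-to-one compositions just described can be sketched in plain Java, with ownership expressed by the whole instantiating its parts; the class names follow the figure, while the fields and their initialization are illustrative.

```java
// Sketch of the Fig. 2 structure as plain Java. Composition is modelled by
// the whole creating and owning its parts, so the parts cannot be shared
// or outlive the whole. Class names follow the figure; fields are illustrative.
class RoleA {}
class RoleB {}

class Interaction {
    // 1..1 containment of each role
    final RoleA roleA = new RoleA();
    final RoleB roleB = new RoleB();
}

class Coordination {}

class InteractionProtocol {
    // Composition: each protocol instance owns exactly one Interaction
    // and exactly one Coordination.
    final Interaction interaction = new Interaction();
    final Coordination coordination = new Coordination();

    public static void main(String[] args) {
        InteractionProtocol ip = new InteractionProtocol();
        System.out.println(ip.interaction.roleA != null); // prints "true"
    }
}
```

Because the parts are created in field initializers and never handed out for reuse, each part instance belongs to exactly one protocol, which is the defining property of composition.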
Fig. 3: Extended Metamodel
The above metamodel shows the representation of the external policy. The external policy consists of SlaVariables and constraints; the constraints consist of SlaVariables and an assignment, and the assignment consists of conditions, which are defined in the grammar.
The figure shows that each instance of type ExternalPolicy contains instances of type SlaVariable and Constraint. This relationship is a composition: the black diamond represents the composition, and since the other end of the relationship is a plain line, it is understood that the constraints and the SLA variables know about the external policy. Similarly, a Constraint instance contains instances of type SlaVariable and Assignment, also represented as compositions. The Assignment instance contains a Condition instance, again denoted as a composition, where the condition knows about the assignment since the relationship is drawn as a plain line without an arrowhead.
In the above metamodel, an ExternalPolicy class instance will always have at least one SlaVariable and one Constraint. Because the relationship is a composition, when the external policy is removed or destroyed, its SlaVariable and Constraint instances are automatically removed or destroyed as well. Another important point about composition is that a part can be related to only one instance of the parent class. Likewise, a Constraint instance will always have at least one SlaVariable and one Assignment; this relationship is also a composition, hence the cardinality is mentioned as one, and when the constraint is removed from the model, its SlaVariables and Assignment are automatically removed as well.
The assignment is further associated with the conditions: an Assignment instance always has one Condition, so if the Assignment is removed, its Condition is obviously removed too. The various conditions are specified in the Condition class.
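The containment chain and its "at least one" multiplicities can be sketched in plain Java by enforcing the lower bounds in the constructors; the class names mirror Fig. 3, but the API is an assumption made for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the Fig. 3 containment chain in plain Java. The "at least one"
// multiplicities are enforced by the constructors; all names mirror the
// figure, but the API itself is illustrative.
class SlaVariable {
    final String name;
    SlaVariable(String name) { this.name = name; }
}

class Condition {}

class Assignment {
    final Condition condition = new Condition(); // exactly one Condition
}

class Constraint {
    final List<SlaVariable> slaVariables = new ArrayList<>();
    final Assignment assignment = new Assignment(); // exactly one Assignment
    Constraint(SlaVariable first) { slaVariables.add(first); } // at least one
}

class ExternalPolicy {
    final List<SlaVariable> slaVariables = new ArrayList<>();
    final List<Constraint> constraints = new ArrayList<>();
    ExternalPolicy(SlaVariable v, Constraint c) {
        slaVariables.add(v);  // at least one SlaVariable
        constraints.add(c);   // at least one Constraint
    }

    public static void main(String[] args) {
        SlaVariable v = new SlaVariable("responseTime");
        ExternalPolicy policy = new ExternalPolicy(v, new Constraint(v));
        System.out.println(policy.constraints.size()); // prints "1"
    }
}
```

Since the parts are reachable only through their owner, discarding the ExternalPolicy makes its SlaVariables, Constraints, Assignments and Conditions unreachable as well, which mirrors the composition lifecycle described above.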
Associations represent relationships between instances of types. The interpretation varies with the perspective: conceptually, they represent conceptual relationships between the types involved; in a specification perspective they represent responsibilities for knowing, made explicit by access and update operations; a more implementation-oriented interpretation implies the presence of a pointer. Thus it is essential to know which perspective is used to build a model in order to interpret it correctly. [5]
Associations may be bi-directional, navigable in either direction, or uni-directional, navigable in one direction only. Conceptually, all associations can be thought of as bi-directional; the distinction matters for specification and implementation models. In specification models, bi-directional associations give more flexibility in navigation but incur greater coupling. In implementation models, a bi-directional association implies a coupled set of pointers, which many designers find difficult to deal with. Often those who use bi-directional associations have a notation to indicate a uni-directional one when needed. With a bi-directional association, the word role refers to a single direction; thus a bi-directional association has two roles, while a uni-directional association has only one. [5]
One of the key aspects of an association is its cardinality, sometimes called multiplicity. This corresponds to the notions of mandatory, optional, one-to-many and many-to-many relationships in the entity-relationship approach. Each method uses a particular notation to indicate the cardinality, which is specified for each role in the association. Aggregation relationships are introduced by many methods; they represent part-whole relationships. It is difficult to define the difference between an aggregation and an association, or to say whether the distinction is useful. [5]
In a class diagram, a generalization relationship is one in which one model element is based on another: the child is based on the parent. Generalization relationships are used in class, component, deployment and use-case diagrams. To comply with the UML semantics, the model elements in a generalization relationship must be of the same type; for example, a generalization relationship can be used between actors or between use cases, but not between an actor and a use case. We add a generalization relationship to capture the attributes, operations and relationships in a parent model element and then reuse them in one or more child model elements, because the child model elements in a generalization inherit the attributes, operations and relationships of the parent. A parent model element can have one or more children, and any child model element can have one or more parents, although it is common to have a single parent model element and multiple child model elements. Normally, generalization relationships do not have names. [5]
Realization is a relationship between two model elements in which one model element realizes the behaviour that the other specifies. A realization is indicated by a dashed line with an unfilled arrowhead towards the supplier, and realizations can only be shown on class or component diagrams.
A realization is a relationship between classes, interfaces, components and packages that connects a client element with a supplier element.
Dependency is a weaker form of relationship which indicates that one class depends on another because it uses it at some point in time. A dependency exists if a class appears as a parameter or local variable of a method of another class.
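The distinction can be sketched in Java: a dependency appears only as a parameter or local variable, while an association is a lasting held reference; all class names here are illustrative.

```java
// Dependency vs. association, sketched in Java. Scheduler merely *uses*
// a Clock passed as a parameter (dependency: no lasting reference), while
// AuditLog *holds* a Clock field (association). Names are illustrative.
class Clock {
    long now() { return System.currentTimeMillis(); }
}

class Scheduler {
    // Dependency: Clock appears only as a parameter; no field is kept.
    boolean isDue(Clock clock, long deadline) {
        return clock.now() >= deadline;
    }
}

class AuditLog {
    // Association: AuditLog keeps a reference to a Clock instance.
    private final Clock clock;
    AuditLog(Clock clock) { this.clock = clock; }
    String stamp(String msg) { return clock.now() + " " + msg; }
}

class RelationSketch {
    public static void main(String[] args) {
        System.out.println(new Scheduler().isDue(new Clock(), 0L)); // prints "true"
        System.out.println(new AuditLog(new Clock()).stamp("started"));
    }
}
```

In a class diagram, Scheduler would be linked to Clock with a dashed dependency arrow, whereas AuditLog would be linked with a solid association line.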
VI. Conclusion
To conclude, the implementation of the metamodel satisfies the primary objectives of the project: extending the existing metamodel and defining the grammar for the interaction protocol and the external policy in the existing metamodel. After the grammar rules are defined, the grammar is run in the editor in order to generate the Ecore diagram and the corresponding metamodel. The grammar is defined in such a way that the user can edit and change it effectively using the Eclipse tool. Background information on the BPEL and SRML technologies is provided for future reference.
The remaining work, namely the transformation from SRML models to the Business Process Execution Language (BPEL) and the improvement of the graphical representation of the SRML metamodel, is left for future work.
References
[1] Victoriya Dessler, Service-Oriented Architecture for Smart Environments, IEEE 6th International Conference on Service Oriented Computing and Applications, 2013, 99-104.
[2] W. Dai et al., Service Oriented Architecture in Industrial Automation Systems, IEEE, Vol. 11, 2015, 771-781.
[3] Abdelfattah El-Sharkawi, Ahmed Shouman, Sayed Lasheen, Service Oriented Architecture for Remote Sensing Satellite Telemetry Data Implemented on Cloud Computing, IJITCS, Vol. 5, No. 7, June 2013, 12-26.
[4] Said Nabi, Saif Ur Rehman, Simon Fong, Kamran Aziz, A Model for Implementing Security at Application Level in Service Oriented Architecture, Journal of Emerging Technologies in Web Intelligence, Vol. 6, No. 1, 2014, 157-163.
[5] Feng Wang, Liang Hu, Jin Zhou, and Kuo Zhao, A Data Processing Middleware Based on SOA for the Internet of Things, Journal of Sensors, Volume 2015, Article ID 827045, 8 pages.
Books
[1] Laura Bocchi, Yi Hong, Antónia Lopes, and José Luiz Fiadeiro, From BPEL to SRML: A Formal Transformational Approach, Department of Computer Science, University of Leicester, University Road, Leicester LE1 7RH, UK.
[2] BPEL Tutorial, http://searchsoa.techtarget.com/tutorial/BPEL-tutorial
[3] Richard C. Gronback, Eclipse Modeling Project, U.S.A.: Addison-Wesley, 2009.