This document discusses adopting aspect-oriented programming (AOP) in enterprise-wide computing. It provides a brief history of AOP, from its inception at Xerox PARC in the 1990s to the development of AspectJ in the late 1990s. It then reviews related work studying the benefits and challenges of using AOP, such as improved modularity and separation of concerns but also increased complexity. Many studies found quantitative maintenance benefits from AOP but also challenges in adoption. The document concludes by discussing uses of AOP in enterprises, noting benefits such as modularizing cross-cutting concerns as well as challenges such as difficulties aspectizing concurrency and failure handling.
The document discusses a PDEng project at Océ to analyze using Scenario Based Programming to develop embedded control software. The project involved shifting mindsets to the new paradigm and extending Océ's development environment to support Live Sequence Charts. As a result, part of a printer finisher's functionality was implemented using LSCs. While the paradigm was found powerful, the tools require more development, so incorporating LSCs fully was not viable, but the project helped Océ understand the strengths and limitations of Scenario Based Programming.
This document summarizes a research paper that studied how to integrate agile software development methods like Extreme Programming (XP) into traditional stage-gate project management models. It discusses how agile methods have evolved for smaller projects but must work within larger product development contexts. The paper presents a case study of two large software projects that used XP within stage-gate management. It finds that integrating XP is possible if the interfaces with the agile subproject and management attitudes towards agility are properly managed.
The article proposes a new model for optimizing software effort and cost estimation based on code reusability. The model compares new projects to previously completed, similar projects stored in a code repository. By searching for and retrieving reusable code, functions, and methods from old projects, the model aims to reduce effort and cost estimates for new software development. The model is described as being based on the concept of estimation by analogy and using innovative search and retrieval techniques to achieve code reuse and thus decreased cost and effort estimates.
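Estimation by analogy, the concept the model above builds on, can be sketched as a nearest-neighbour lookup over completed projects. The sketch below is a hypothetical illustration only: the feature choices, figures, and function names are invented here and are not the article's actual model.

```python
# Minimal sketch of estimation by analogy: estimate a new project's effort
# from the average effort of its k most similar past projects.
# All feature vectors and effort values below are hypothetical.
def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def estimate_by_analogy(new_features, history, k=2):
    """Average the effort of the k nearest past projects."""
    nearest = sorted(history, key=lambda p: euclidean(new_features, p["features"]))[:k]
    return sum(p["effort_pm"] for p in nearest) / k

# Hypothetical repository: features are (size in KLOC, team size, reuse ratio).
history = [
    {"features": [12.0, 3, 0.4], "effort_pm": 40},
    {"features": [11.0, 3, 0.5], "effort_pm": 35},
    {"features": [30.0, 8, 0.1], "effort_pm": 120},
]

estimate = estimate_by_analogy([11.5, 3, 0.45], history)  # nearest: first two
```

Retrieving reusable code from similar past projects would then reduce the estimate further, which is the optimization the article targets.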
STRUCTURAL VALIDATION OF SOFTWARE PRODUCT LINE VARIANTS: A GRAPH TRANSFORMATI... (IJSEA)
This document discusses an approach to structurally validating software product line variants using graph transformations. The authors propose using model transformations to automatically validate products according to dependencies defined in the feature diagram. They introduce necessary meta-models and present graph grammars to perform validation using the AToM3 tool. The approach is illustrated through examples.
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISON (ijseajournal)
Performance responsiveness and scalability is a make-or-break quality for software, and nearly everyone runs into performance problems at one time or another. This paper discusses the performance issues faced during the Pre Examination Process Automation System (PEPAS), implemented in Java technology, along with the challenges encountered during the project life cycle and the mitigation actions performed. It compares three Java technologies and shows how improvements in the application's response time were made through statistical analysis. The paper concludes with a result analysis.
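A statistical comparison of response times across technology stacks, in the spirit of the paper above, can be sketched with summary statistics. The stack names and sample timings below are invented for demonstration and do not come from the paper.

```python
# Illustrative sketch: compare mean response times of three hypothetical
# Java technology stacks. The sample values are invented for demonstration.
from statistics import mean, stdev

response_ms = {
    "stack_a": [412, 398, 405, 420, 401],
    "stack_b": [377, 362, 380, 371, 366],
    "stack_c": [455, 449, 462, 458, 451],
}

def summarize(samples):
    """Return (mean, sample standard deviation) for a list of timings."""
    return mean(samples), stdev(samples)

for name, samples in sorted(response_ms.items()):
    m, s = summarize(samples)
    print(f"{name}: mean={m:.1f} ms, stdev={s:.1f} ms")

# Pick the stack with the lowest mean response time.
best = min(response_ms, key=lambda k: mean(response_ms[k]))
```

A real study would follow the summary statistics with a significance test before declaring one stack faster.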
Integrating profiling into MDE compilers (ijseajournal)
Scientific computation demands ever more performance from its algorithms, and new massively parallel architectures suit these algorithms well, being known for high performance and power efficiency. Unfortunately, because parallel programming for these architectures requires a complex distribution of tasks and data, developers find it difficult to implement their applications effectively. Although source-to-source approaches aim to provide a low learning curve for parallel programming and to exploit architecture features to create optimized applications, programming remains difficult for newcomers. This work aims to improve performance by feeding back into the high-level models specific execution data from a profiling tool, enhanced with advice computed by an analysis engine. To keep the link between execution and model, the process is based on a traceability mechanism. Once the model is automatically annotated, it can be refactored to obtain better performance from the regenerated code. Hence, this work keeps model and code coherent while still harnessing the power of parallel architectures. To illustrate and clarify the key points of this approach, we provide an experimental example in a GPU context, using a transformation chain from UML-MARTE models to OpenCL code.
THE UNIFIED APPROACH FOR ORGANIZATIONAL NETWORK VULNERABILITY ASSESSMENT (ijseajournal)
Today's business network infrastructure changes quickly, with new servers, services, connections, and ports added often, at times daily, and with an uncontrolled inflow of laptops, storage media, and wireless networks. With the growing number of vulnerabilities and exploits, coupled with the continual evolution of IT infrastructure, organizations now require more frequent vulnerability assessments. This paper proposes a new approach, the Unified Process for Network Vulnerability Assessment (hereafter called unified NVA), derived from the Unified Software Development Process (Unified Process), a popular iterative and incremental software development process framework.
GENERATING SOFTWARE PRODUCT LINE MODEL BY RESOLVING CODE SMELLS IN THE PRODUC... (ijseajournal)
The document discusses an approach for building a software product line (SPL) model by resolving code smells in the source code of existing products. It proposes using reverse engineering to analyze the source code of product variants to detect and refactor code smells. This improves the quality of the source code, allowing features to then be identified and used to generate an SPL model with common core and customizable assets. The goal is to reduce code smells when constructing an SPL using an existing set of products via the bottom-up strategy.
FRAMEWORKS BETWEEN COMPONENTS AND OBJECTS (acijjournal)
Before the emergence of Component-Based Frameworks, similar issues were addressed by other software development paradigms, including Object-Oriented Programming (OOP), Component-Based Development (CBD), and Object-Oriented Frameworks. In this study, these approaches, especially object-oriented frameworks, are compared to Component-Based Frameworks and their relationships are discussed, along with the impact of different software reuse methods on architectural patterns and on support for application extension and versioning. It is concluded that many of the mechanisms provided by Component-Based Frameworks can be enabled by software elements at a lower level; the main contribution of Component-Based Frameworks is their focus on component development. All of these approaches can be built on each other in a layered manner by adopting suitable design patterns. Open questions remain, however, such as how to develop with one method and migrate an existing application to another approach.
The document describes a new methodology for eliciting software requirements for smart handheld devices. The methodology is focused on users' work processes. It involves observing users' activities, identifying stakeholders, discussing requirements with stakeholders, finding inspiration from other software, and brainstorming user needs and goals. As an example, the methodology is applied to develop an iPad-based software for improving the learning performance of playgroup students.
Model-Driven Development of Web Applications (idescitation)
Over the last few years Model-Driven Development (MDD) has been regarded as
the future of Software Engineering, offering architects the possibility of creating artifacts to
illustrate the design of the software solutions, contributing directly to the implementation of
the product after performing a series of model transformations on them. The model-to-text
transformations are the most important operations from the point of view of the automatic
code generation. The automatic generation or the fast prototyping of applications implies an
acceleration of the development process and a reduction of time and effort, which could
materialize in a noticeable cost reduction. This paper proposes a practical approach for the
model-based development of web applications, offering a solution for the layered and
platform independent modeling of web applications, as well as for the automatic generation
of software solutions realized using the ASP.NET technology.
INTRODUCING REFINED AGILE MODEL (RAM) IN THE CONTEXT OF BANGLADESH'S SOFTWARE... (ijseajournal)
The software companies of Bangladesh use different types of agile models for software development. Although these models are theoretically suitable for small and medium projects, in practice they are not so effective. This paper therefore tries to find out why agile models are not well suited to Bangladesh's software companies and how the problems the companies face in using them can be solved. To reveal the answers, the study is based on survey and interview methods. Its findings show that Bangladesh's software companies face various problems in implementing traditional agile models, such as communication gaps, lack of documentation, unavailability of prototypes, customers' lack of knowledge in the area of IT, and many more. The study shows that these problems can be solved if the requirements engineering process is properly managed and some rules in the traditional agile models are modified. To this end, the study proposes a new model, the Refined Agile Model (RAM), which is claimed to suit Bangladesh better than the traditional agile models. RAM proposes a process flow consisting of a Prototyping Cycle, a Development Iteration Cycle, and an Additional Development Iteration Cycle. It also ensures a requirements engineer at the client end, sufficient documentation, preparation of a prototype, and presentation of frequent demos. After applying these requirements in several real-time projects, it was found that those projects were completed more effectively than all earlier projects. The paper concludes that the Refined Agile Model (RAM) is the best-suited model for the Bangladeshi software environment.
This document summarizes an experiment on using pair programming with students in a Java lab. Pair programming involves two programmers working together at one workstation, with one typing (driver) while the other reviews the work (navigator). The experiment found that students performed better on various metrics like participation, debugging skills, and perseverance when using pair programming compared to solo programming. An algorithm called PPPA (Pair Programming Performance Algorithm) is also presented to assess pair programming efforts based on factors like effort, time, cohesion, coupling, complexity, and bugs. Empirical evidence from questionnaires given to students after the experiment supported the benefits of pair programming identified by the PPPA.
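A weighted scoring function is one plausible way to combine the factors the PPPA summary lists (effort, time, cohesion, coupling, complexity, bugs) into a single pair-assessment number. The weights, scale, and function below are invented here for illustration and are not the published algorithm.

```python
# Hypothetical sketch of a weighted pair-assessment score in the spirit of
# PPPA. Factor names come from the summary; weights and the 0..10 scale
# are invented for illustration.
WEIGHTS = {
    "effort": 0.2, "time": 0.15, "cohesion": 0.2,
    "coupling": 0.15, "complexity": 0.15, "bugs": 0.15,
}

def pair_score(ratings):
    """Combine per-factor ratings (0..10, higher is better) into one score.

    Factors where less is better (coupling, complexity, bugs) are assumed
    to be mapped to higher ratings before this is called.
    """
    return sum(WEIGHTS[f] * ratings[f] for f in WEIGHTS)

pair = {"effort": 8, "time": 7, "cohesion": 9, "coupling": 6,
        "complexity": 7, "bugs": 8}
score = pair_score(pair)  # result is on the same 0..10 scale
```

Since the weights sum to 1.0, the score stays on the same scale as the input ratings, which makes pairs directly comparable.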
- The document introduces a framework for open engineering, which applies open source principles to product design. It describes an open design environment and reviews Georgia Tech's basic product design process as it pertains to complex mechanical or aerospace systems.
- It outlines the key stages of the design process: establishing need, defining the problem, establishing value, generating alternatives, and evaluating alternatives. Tools like quality function deployment, morphological matrices, and Pugh matrices are discussed for evaluating concepts.
- Finally, it proposes an open engineering portal that would enable the open community to interface with the formal design process through shared design documents and integration of changes across the different stages.
Enhancing the Software Effort Prediction Accuracy using Reduced Number of Cos... (IRJET Journal)
This document presents research on modifying the COCOMO II software cost estimation model to improve prediction accuracy. The researchers reduced the number of cost estimation factors (called cost drivers) from 17 to 13 by adjusting the definitions and impact levels to better reflect current industry situations. They estimated effort for software projects using the modified model and found lower percentage errors compared to the original COCOMO II model, demonstrating improved estimation efficiency. The goal of the research was to analyze cost drivers and their impact on effort estimation in COCOMO II and enhance the model for more accurate predictions.
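For context, the unmodified COCOMO II effort equation that the researchers start from is PM = A × Size^E × ∏EM, with E = B + 0.01 × ΣSF. The sketch below uses the published COCOMO II.2000 calibration constants (A = 2.94, B = 0.91); it shows the baseline model only, not the authors' modified cost-driver set.

```python
# Sketch of the standard COCOMO II post-architecture effort equation
# (the baseline the paper modifies, not the modified model itself).
# A and B are the published COCOMO II.2000 calibration constants.
from math import prod

A, B = 2.94, 0.91

def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
    """Effort in person-months for a project of `ksloc` thousand lines.

    scale_factors: the five SF ratings (PREC, FLEX, RESL, TEAM, PMAT).
    effort_multipliers: the cost-driver EM values (1.0 = nominal).
    """
    e = B + 0.01 * sum(scale_factors)
    return A * ksloc ** e * prod(effort_multipliers)

# A 10 KSLOC project with nominal scale factors and all-nominal cost drivers.
nominal_sf = [3.72, 3.04, 4.24, 3.29, 4.68]
nominal = cocomo2_effort(10, nominal_sf, [1.0] * 17)
```

Reducing the cost-driver count, as the paper does, shrinks the effort-multiplier product to 13 terms while keeping this overall form.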
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU... (ijseajournal)
Rework is a known vicious circle in software development, since it plays a central role in generating delays, extra costs, and diverse risks introduced after software delivery, and it eventually has a negative impact on the quality of the software developed. To address the rework issue, this paper examines the notion of rework in software development as it occurs in practice by analysing a development process at an organisation in Mauritius where rework is a major issue. Meticulous strategies to reduce rework are then analysed and discussed. The paper ultimately recommends software configuration management as the best strategy for reducing the rework problem in software development.
This document summarizes the results of an empirical study on the use of software metrics in a large multinational organization at different levels of maturity as defined by the Capability Maturity Model (CMM). The study found that common metrics used in practice include project metrics like time, effort, size, and quality at CMM level 2. Product quality and complexity metrics are more common at level 3. Process-wide metrics are recommended for level 4 according to some literature, but the findings were less clear. The number and types of metrics generally increase with higher maturity levels. Comparing the empirical results to literature provided insights for organizations on selecting appropriate metrics for their maturity level.
This document provides an introduction to the Eclipse Process Framework Composer tool. It discusses how the tool can be used to manage libraries of reusable method content and assemble customized processes for projects. The tool addresses the needs of aligning flexible development processes with business processes and supporting modern agile development practices. It separates reusable method content from its application in processes to provide a knowledge base and process engineering capabilities.
Software that requires maintenance and evolution presumably has value that causes the producers of the software—individuals and organizations—to invest in these activities. Given that there is almost always more that any given software package or product can provide, software producers should be motivated in enabling maintenance and evolution activities and should be interested in the software engineering research efforts that are undertaken to address identified pain points. Yet, despite efforts by providers of research results (software engineering researchers) and interest by recipients (software producing individuals and organizations), a gap remains and too few research results make their way into use. In this keynote talk from ICSME 2021, I focus on research results that take the form of software tools for software producers and explore what this gap is and how the gap might be bridged. This exploration aims to provide some practical tips for how to orient research to create usable and useful software tools.
EMPIRICAL STUDY OF THE EVOLUTION OF AGILE-DEVELOPED SOFTWARE SYSTEM IN JORDAN... (ijbiss)
The document discusses a study of the adoption of agile practices in three telecommunications companies in Jordan. It finds that:
1) The companies used some agile practices like continuous integration and collective code ownership but did not follow full agile methodologies.
2) Adoption levels varied across practices and companies, with measurement by working code and coding standards seeing higher adoption.
3) Experiences with practices were mixed, with benefits seen in visibility of progress but challenges in refactoring large code bases.
CAPE-OPEN was a collaboration between chemical companies and software vendors to develop a single standard for enterprise process simulation software. This posed challenges as competitors had to cooperate. Through teamwork, the participants overcame differences to create an open system with standardized interfaces. This allowed components from different vendors to work together, reducing costs and encouraging innovation. While competitors initially feared it could commoditize their products, the project ended up opening new markets for all involved through its component-based approach.
International Journal of Business and Management Invention (IJBMI) (inventionjournals)
The International Journal of Business and Management Invention (IJBMI) is an international journal intended for professionals and researchers in all fields of business and management. IJBMI publishes research articles and reviews across the whole field of business and management, covering new teaching methods, assessment, validation, and the impact of new technologies, and it will continue to provide information on the latest trends and developments in this ever-expanding subject. Papers are selected through double peer review to ensure originality, relevance, and readability. The articles published in the journal can be accessed online.
Programmer Productivity Enhancement Through Controlled Natural Language Input (ijseajournal)
We have created CABERNET, a Controlled Natural Language (CNL) based approach to program creation. CABERNET allows programmers to use a simple outline-based syntax, which increases programmer efficiency and syntax flexibility. CNLs have been used successfully for writing requirements documents; we propose taking this approach well beyond that, to fully functional programs. Through the use of heuristics and inference to analyze and determine the programmer's intent, we are able to create fully functional mobile applications. The goal is for programs to be aligned with the way humans think rather than the way computers process information. Through the use of templates, a CABERNET application can be processed to run on multiple runtime environments, and because processing a CABERNET program file results in a native application, program performance is maintained.
The (Un)Expected Impact of Tools in Software Evolution (Gail Murphy)
The document discusses how software architecture and tool architecture impact a continual flow of value in software development. It argues that software architecture needs to evolve gracefully over time to enable value delivery, and that tools should support human and tool interactions to facilitate appropriate architectural decisions. However, tool architecture is often ignored despite likely interactions with software architecture. More research is needed to understand these relationships and how tool architecture can support software architecture evolution.
This document discusses challenges faced during software component selection for component-based software development. It identifies key challenges as performance, time, component size, fault tolerance, reliability, functionality, compatibility, and available component subsets. Specifically, it states that performance, coupling, cohesion and interfaces impact each other, that using more commercial off-the-shelf components reduces time and cost, and that selecting components based on higher-level programming languages and fault tolerance can help address several of these challenges.
This document provides a summary of Part 2 of an introduction to the Eclipse Process Framework Composer tool. It details the concepts that define method content and processes in EPF, including roles, tasks, work products, and guidance. It explains how to organize method content using catalogs and define different types of processes in EPF Composer. The document is authored by Peter Haumer, a solution architect for IBM, and provides examples and explanations of key EPF Composer concepts.
Exploring the Efficiency of the Program using OOAD Metrics (IRJET Journal)
This document proposes a methodology to analyze the efficiency of object-oriented programs using OOAD (Object Oriented Analysis and Design) metrics. The methodology involves compiling a program successively until it is error-free, recording the error rate at each compilation. These results are then compared to determine how many compilations were needed for the program to be error-free, indicating its efficiency. The methodology is experimentally validated on a sample Java program, with results showing the error rate decreasing with each compilation until the program is error-free after the 8th compilation, demonstrating good efficiency.
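The measurement loop the methodology describes (compile, record the error count, fix, repeat until clean) can be sketched as follows. A real setup would shell out to the actual compiler (e.g. `javac`); here a stub returns a shrinking, hypothetical error count so the loop is self-contained and runnable.

```python
# Sketch of the compile-until-error-free measurement loop. The compiler
# call is stubbed: errors drop by one per attempt, a purely hypothetical
# pattern chosen so the loop terminates.
def compile_and_count_errors(attempt, initial_errors=7):
    """Stand-in for invoking the compiler; errors fall as fixes are applied."""
    return max(initial_errors - attempt, 0)

error_history = []  # one entry per compilation
attempt = 0
while True:
    errors = compile_and_count_errors(attempt)
    error_history.append(errors)
    if errors == 0:
        break
    attempt += 1

# Fewer compilations to reach zero errors => higher efficiency in this scheme.
compilations_needed = len(error_history)
```

With this stub the program becomes error-free on the 8th compilation, matching the shape of the result reported in the abstract (though not its actual data).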
COMPARATIVE STUDY OF SOFTWARE ESTIMATION TECHNIQUES (ijseajournal)
Many information technology firms, among other organizations, have been working on how to estimate resources such as funding during software development. Software development life cycles involve many activities and skills to avoid risks, so the best software estimation technique should be employed. In this research, therefore, a comparative study was conducted that considers the accuracy, usage, and suitability of existing methods; it is intended to serve project managers and project consultants throughout the software project development process. Both algorithmic and non-algorithmic techniques, such as linear regression, are covered: model-based, composite, and regression techniques underlie COCOMO, COCOMO II, SLIM, and linear multiple regression respectively, while expertise-based and rule-based methods are applied on the non-algorithmic side. However, these techniques need some advancement to reduce the errors experienced during the software development process. This paper therefore proposes a model for software estimation that can be helpful to information technology firms, researchers, and other organizations that use information technology in processes such as budgeting and decision-making.
SBGC provides IEEE software projects for students in various domains including Java, J2ME, J2EE, .NET and MATLAB. It offers two categories of projects - projects with new ideas/papers and selecting from their project list. They ensure projects are implemented satisfactorily and students understand all aspects. SBGC provides latest 2012-2013 projects for various engineering and technology students as well as MBA students. It offers project support including abstracts, reports, presentations and certificates.
This document provides an overview and requirements for the Stat project, an open source machine learning framework for text analysis. It describes the background, motivation, scope, and stakeholders of the project. Key requirements for the framework include being simplified, reusable, and providing built-in capabilities to naturally support text representation and processing tasks.
FRAMEWORKS BETWEEN COMPONENTS AND OBJECTSacijjournal
Before the emergence of Component-Based Frameworks, similar issues have been addressed by other
software development paradigms including e.g. Object-Oriented Programming (OOP), ComponentBased Development (CBD), and Object-Oriented Framework. In this study, these approaches especially
object-oriented Frameworks are compared to Component-Based Frameworks and their relationship are
discussed. Different software reuse methods impacts on architectural patterns and support for
application extensions and versioning. It is concluded that many of the mechanisms provided by
Component-Based Framework can be enabled by software elements at the lower level. The main
contribution of Component-Based Framework is the focus on Component development. All of them can be
built on each other in layered manner by adopting suitable design patterns. Still some things such as
which method to develop and upgrade existing application to other approach.
The document describes a new methodology for eliciting software requirements for smart handheld devices. The methodology is focused on users' work processes. It involves observing users' activities, identifying stakeholders, discussing requirements with stakeholders, finding inspiration from other software, and brainstorming user needs and goals. As an example, the methodology is applied to develop an iPad-based software for improving the learning performance of playgroup students.
Model-Driven Development of Web Applicationsidescitation
Over the last few years Model-Driven Development (MDD) has been regarded as
the future of Software Engineering, offering architects the possibility of creating artifacts to
illustrate the design of the software solutions, contributing directly to the implementation of
the product after performing a series of model transformations on them. The model-to-text
transformations are the most important operations from the point of view of the automatic
code generation. The automatic generation or the fast prototyping of applications implies an
acceleration of the development process and a reduction of time and effort, which could
materialize in a noticeable cost reduction. This paper proposes a practical approach for the
model-based development of web applications, offering a solution for the layered and
platform independent modeling of web applications, as well as for the automatic generation
of software solutions realized using the ASP.NET technology.
INTRODUCING REFINED AGILE MODEL (RAM) IN THE CONTEXT OF BANGLADESH'S SOFTWARE... – ijseajournal
The software companies of Bangladesh use different types of agile models for software development. Although these models are theoretically suitable for small and medium projects, in practice they are not so effective. This paper therefore tries to find out why agile models are not suitable for Bangladesh's software companies and how the problems the companies face in using them can be solved. To reveal the answers, the study is based on survey and interview methods. Its findings show that Bangladesh's software companies face various problems in implementing traditional agile models, such as communication gaps, lack of documentation, unavailability of prototypes, customers' lack of knowledge in the area of IT, and many more. The study shows that these problems can be solved if the requirement engineering process is properly managed and some rules in the traditional agile models are modified. To this end, the study proposes a new model, the Refined Agile Model (RAM), which is claimed to suit Bangladesh better than the traditional agile models. The model proposes a process flow consisting of a Prototyping Cycle, a Development Iteration Cycle and an Additional Development Iteration Cycle. It also ensures a requirement engineer at the client end, sufficient documentation, preparation of a prototype and frequent demos. After applying these requirements in several real projects, those projects were found to be completed more effectively than all earlier projects. The paper concludes that the Refined Agile Model (RAM) is the best model for the Bangladeshi software environment.
This document summarizes an experiment on using pair programming with students in a Java lab. Pair programming involves two programmers working together at one workstation, with one typing (driver) while the other reviews the work (navigator). The experiment found that students performed better on various metrics like participation, debugging skills, and perseverance when using pair programming compared to solo programming. An algorithm called PPPA (Pair Programming Performance Algorithm) is also presented to assess pair programming efforts based on factors like effort, time, cohesion, coupling, complexity, and bugs. Empirical evidence from questionnaires given to students after the experiment supported the benefits of pair programming identified by the PPPA.
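The abstract lists PPPA's input factors (effort, time, cohesion, coupling, complexity, bugs) but not its formula. The weighted score below is purely a hypothetical way such factors might be combined into one assessment, not the published algorithm:

```python
# Hypothetical combination of the PPPA factors into a single score.
# Weights and the scoring scheme are invented for illustration only;
# the paper's actual algorithm is not reproduced here.

WEIGHTS = {
    "effort": 0.20, "time": 0.15, "cohesion": 0.20,
    "coupling": 0.15, "complexity": 0.15, "bugs": 0.15,
}

def pair_score(metrics: dict) -> float:
    """metrics: factor -> normalized value in [0, 1], higher is better."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

# Made-up measurements for one student pair:
pair = {"effort": 0.8, "time": 0.7, "cohesion": 0.9,
        "coupling": 0.6, "complexity": 0.7, "bugs": 0.85}
score = pair_score(pair)
```

Because the weights sum to 1 and each metric is normalized, the score stays in [0, 1], which makes pairs directly comparable.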
- The document introduces a framework for open engineering, which applies open source principles to product design. It describes an open design environment and reviews Georgia Tech's basic product design process as it pertains to complex mechanical or aerospace systems.
- It outlines the key stages of the design process: establishing need, defining the problem, establishing value, generating alternatives, and evaluating alternatives. Tools like quality function deployment, morphological matrices, and Pugh matrices are discussed for evaluating concepts.
- Finally, it proposes an open engineering portal that would enable the open community to interface with the formal design process through shared design documents and integration of changes across the different stages.
Enhancing the Software Effort Prediction Accuracy using Reduced Number of Cos... – IRJET Journal
This document presents research on modifying the COCOMO II software cost estimation model to improve prediction accuracy. The researchers reduced the number of cost estimation factors (called cost drivers) from 17 to 13 by adjusting the definitions and impact levels to better reflect current industry situations. They estimated effort for software projects using the modified model and found lower percentage errors compared to the original COCOMO II model, demonstrating improved estimation efficiency. The goal of the research was to analyze cost drivers and their impact on effort estimation in COCOMO II and enhance the model for more accurate predictions.
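The model being modified is the standard COCOMO II post-architecture effort equation, PM = A * Size^E * prod(EM_i) with E = B + 0.01 * sum(SF_j). A sketch with the published calibration constants (A = 2.94, B = 0.91) and nominal scale-factor values; the 10 KSLOC project and the all-nominal effort multipliers are illustrative, and the paper's adjusted 13-driver set is not reproduced here:

```python
import math

# COCOMO II post-architecture effort: PM = A * Size^E * product(EM_i),
# where E = B + 0.01 * sum(scale factors) and the EM_i are the cost
# drivers the paper reduces from 17 to 13.
A, B = 2.94, 0.91

def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
    """Return estimated effort in person-months."""
    e = B + 0.01 * sum(scale_factors)
    em = math.prod(effort_multipliers)
    return A * ksloc ** e * em

# Illustrative project: 10 KSLOC, nominal scale factors
# (PREC, FLEX, RESL, TEAM, PMAT) and all drivers at nominal (1.0).
effort = cocomo2_effort(10, [3.72, 3.04, 4.24, 3.29, 4.68], [1.0] * 13)
```

Shrinking or re-rating the driver set changes only the product term, which is why recalibrated drivers can lower the percentage error without touching the size exponent.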
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU... – ijseajournal
Rework is a known vicious circle in software development, since it plays a central role in generating delays, extra costs and the diverse risks introduced after software delivery, and it ultimately has a negative impact on the quality of the software developed. To address the rework issue, this paper examines the notion of rework in software development as it occurs in practice, by analysing the development process at an organisation in Mauritius where rework is a major issue. Strategies to reduce rework are then analysed and discussed in detail. The paper ultimately recommends software configuration management as the best strategy for reducing the rework problem in software development.
This document summarizes the results of an empirical study on the use of software metrics in a large multinational organization at different levels of maturity as defined by the Capability Maturity Model (CMM). The study found that common metrics used in practice include project metrics like time, effort, size, and quality at CMM level 2. Product quality and complexity metrics are more common at level 3. Process-wide metrics are recommended for level 4 according to some literature, but the findings were less clear. The number and types of metrics generally increase with higher maturity levels. Comparing the empirical results to literature provided insights for organizations on selecting appropriate metrics for their maturity level.
This document provides an introduction to the Eclipse Process Framework Composer tool. It discusses how the tool can be used to manage libraries of reusable method content and assemble customized processes for projects. The tool addresses the needs of aligning flexible development processes with business processes and supporting modern agile development practices. It separates reusable method content from its application in processes to provide a knowledge base and process engineering capabilities.
Software that requires maintenance and evolution presumably has value that causes the producers of the software—individuals and organizations—to invest in these activities. Given that there is almost always more that any given software package or product can provide, software producers should be motivated in enabling maintenance and evolution activities and should be interested in the software engineering research efforts that are undertaken to address identified pain points. Yet, despite efforts by providers of research results (software engineering researchers) and interest by recipients (software producing individuals and organizations), a gap remains and too few research results make their way into use. In this keynote talk from ICSME 2021, I focus on research results that take the form of software tools for software producers and explore what this gap is and how the gap might be bridged. This exploration aims to provide some practical tips for how to orient research to create usable and useful software tools.
EMPIRICAL STUDY OF THE EVOLUTION OF AGILE-DEVELOPED SOFTWARE SYSTEM IN JORDAN... – ijbiss
The document discusses a study of the adoption of agile practices in three telecommunications companies in Jordan. It finds that:
1) The companies used some agile practices like continuous integration and collective code ownership but did not follow full agile methodologies.
2) Adoption levels varied across practices and companies, with measurement by working code and coding standards seeing higher adoption.
3) Experiences with practices were mixed, with benefits seen in visibility of progress but challenges in refactoring large code bases.
CAPE-OPEN was a collaboration between chemical companies and software vendors to develop a single standard for enterprise process simulation software. This posed challenges as competitors had to cooperate. Through teamwork, the participants overcame differences to create an open system with standardized interfaces. This allowed components from different vendors to work together, reducing costs and encouraging innovation. While competitors initially feared it could commoditize their products, the project ended up opening new markets for all involved through its component-based approach.
International Journal of Business and Management Invention (IJBMI) – inventionjournals
International Journal of Business and Management Invention (IJBMI) is an international journal intended for professionals and researchers in all fields of business and management. IJBMI publishes research articles and reviews across the whole field of business and management, covering new teaching methods, assessment, validation and the impact of new technologies, and it will continue to provide information on the latest trends and developments in this ever-expanding subject. Papers are selected through double peer review to ensure originality, relevance and readability. The articles published in the journal can be accessed online.
Programmer Productivity Enhancement Through Controlled Natural Language Input – ijseajournal
We have created CABERNET, a Controlled Natural Language (CNL) based approach to program creation. CABERNET allows programmers to use a simple outline-based syntax, which increases programmer efficiency and syntax flexibility. CNLs have been used successfully for writing requirements documents; we propose taking this approach well beyond that, to fully functional programs. Through the use of heuristics and inference to analyze and determine the programmer's intent, we are able to create fully functional mobile applications. The goal is for programs to be aligned with the way humans think rather than the way computers process information. Through the use of templates, a CABERNET application can be processed to run in multiple runtime environments. Because processing a CABERNET program file results in a native application, program performance is maintained.
The (Un)Expected Impact of Tools in Software Evolution – Gail Murphy
The document discusses how software architecture and tool architecture impact a continual flow of value in software development. It argues that software architecture needs to evolve gracefully over time to enable value delivery, and that tools should support human and tool interactions to facilitate appropriate architectural decisions. However, tool architecture is often ignored despite likely interactions with software architecture. More research is needed to understand these relationships and how tool architecture can support software architecture evolution.
This document discusses challenges faced during software component selection for component-based software development. It identifies key challenges as performance, time, component size, fault tolerance, reliability, functionality, compatibility, and available component subsets. Specifically, it states that performance, coupling, cohesion and interfaces impact each other, that using more commercial off-the-shelf components reduces time and cost, and that selecting components based on higher-level programming languages and fault tolerance can help address several of these challenges.
This document provides a summary of Part 2 of an introduction to the Eclipse Process Framework Composer tool. It details the concepts that define method content and processes in EPF, including roles, tasks, work products, and guidance. It explains how to organize method content using catalogs and define different types of processes in EPF Composer. The document is authored by Peter Haumer, a solution architect for IBM, and provides examples and explanations of key EPF Composer concepts.
Exploring the Efficiency of the Program using OOAD Metrics – IRJET Journal
This document proposes a methodology to analyze the efficiency of object-oriented programs using OOAD (Object Oriented Analysis and Design) metrics. The methodology involves compiling a program successively until it is error-free, recording the error rate at each compilation. These results are then compared to determine how many compilations were needed for the program to be error-free, indicating its efficiency. The methodology is experimentally validated on a sample Java program, with results showing the error rate decreasing with each compilation until the program is error-free after the 8th compilation, demonstrating good efficiency.
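The efficiency measure described above can be sketched as a simple loop over per-compilation error counts; the error counts below are invented for illustration:

```python
# Sketch of the paper's efficiency measure: record the number of errors
# at each successive compilation and report how many compilations were
# needed before the program first became error-free.

def compilations_to_clean(error_counts):
    """Return the 1-based compilation at which errors first reach zero,
    or None if the program never became error-free."""
    for i, errors in enumerate(error_counts, start=1):
        if errors == 0:
            return i
    return None

# Invented error counts for compilations 1..8 of a sample program,
# matching the abstract's scenario of a clean build on the 8th attempt.
runs = [14, 11, 9, 6, 4, 3, 1, 0]
assert compilations_to_clean(runs) == 8
```

A smaller return value indicates a more efficient program under this methodology, since fewer compile-fix cycles were needed.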
COMPARATIVE STUDY OF SOFTWARE ESTIMATION TECHNIQUES – ijseajournal
Many information technology firms, among other organizations, have been working on how to estimate resources such as funding during software development. Software development life cycles involve many activities and skills needed to avoid risks, so the best software estimation technique should be employed. Therefore, this research conducted a comparative study that considers the accuracy, usage and suitability of existing methods, intended to be useful to project managers and consultants throughout the software project development process. Both algorithmic and non-algorithmic techniques are considered: model-based, composite and regression techniques underlie COCOMO, COCOMO II, SLIM and linear multiple regression respectively, while expertise-based and rule-based approaches are applied among the non-algorithmic methods. However, these techniques need further refinement to reduce the errors experienced during software development. This paper therefore proposes, in relation to software estimation techniques, a model that can help information technology firms, researchers and other organizations that use information technology in processes such as budgeting and decision-making.
SBGC provides IEEE software projects for students in various domains including Java, J2ME, J2EE, .NET and MATLAB. It offers two categories of projects - projects with new ideas/papers and selecting from their project list. They ensure projects are implemented satisfactorily and students understand all aspects. SBGC provides latest 2012-2013 projects for various engineering and technology students as well as MBA students. It offers project support including abstracts, reports, presentations and certificates.
This document provides an overview and requirements for the Stat project, an open source machine learning framework for text analysis. It describes the background, motivation, scope, and stakeholders of the project. Key requirements for the framework include being simplified, reusable, and providing built-in capabilities to naturally support text representation and processing tasks.
An Empirical Study of the Improved SPLD Framework using Expert Opinion Technique – IJEACS
Due to the growing need for high-performance, low-cost software applications and increasing competitiveness, the industry is under pressure to deliver products with low development cost, reduced delivery time and improved quality. To address these demands, researchers have proposed several development methodologies and frameworks. One of the latest is the software product line (SPL), which uses concepts such as reusability and variability to deliver successful, high-quality products with shorter time-to-market and minimal development and maintenance cost. This research paper validates our proposed framework, the Improved Software Product Line (ISPL), using the expert opinion technique. An extensive survey based on a set of questionnaires on various aspects and sub-processes of the ISPLD framework was carried out. Analysis of the empirical data concludes that ISPL shows significant improvements over several aspects of contemporary SPL frameworks.
Aspect-Oriented Architecture (AOA) is an effective agile development method that modularizes cross-cutting functionality into separate parts of the software architecture. AOA relies on tools such as AspectJ and on its mechanisms of join points, advice, and pointcuts to address cross-cutting concerns. AOA was applied to the Capella online learning system case study to streamline workflow processes and provide a positive customer experience. While AOA enables effective review of programming mechanisms and a better understanding of cross-cutting concerns, potential disadvantages include modifications forced by inconsistent tools, performance issues with some applications, and limited reuse of aspects.
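In AspectJ, a pointcut selects join points (such as method executions) and advice runs extra code there. A Python decorator is a rough analogue that can illustrate the idea of keeping a cross-cutting concern (logging here) out of the business methods; the course-enrollment example is invented, and this is not AspectJ's actual syntax:

```python
import functools

# Cross-cutting logging kept separate from business logic: the
# decorator plays the role of "advice" applied at call join points.
LOG = []

def logged(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        LOG.append(f"before {func.__name__}")   # "before" advice
        result = func(*args, **kwargs)
        LOG.append(f"after {func.__name__}")    # "after" advice
        return result
    return wrapper

@logged
def enroll(student, course):
    # Pure business logic; no logging code appears here.
    return f"{student} enrolled in {course}"

enroll("ada", "CS101")
```

The business method stays free of logging code, which is exactly the separation of concerns AOA aims for; in AspectJ the same weaving happens without even the decorator annotation, via pointcut expressions.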
An Adjacent Analysis of the Parallel Programming Model Perspective: A Survey – IRJET Journal
This document provides an overview and analysis of parallel programming models. It begins with an abstract discussing the growing demand for parallel computing and challenges with existing parallel programming frameworks. It then reviews several relevant studies on parallel programming models and architectures. The document goes on to describe several key parallel programming models in more detail, including the Parallel Random Access Machine (PRAM) model, Unrestricted Message Passing (UMP) model, and Bulk Synchronous Parallel (BSP) model. It discusses aspects of each model like architecture, communication methods, and associated cost models. The overall goal is to compare benefits and limitations of different parallel programming models.
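Of the cost models the survey compares, BSP's is the simplest to state: each superstep is charged its maximum local work w, plus h * g for communication (h = largest number of messages sent or received by any processor, g = per-message gap), plus the barrier latency l. A sketch with illustrative, unmeasured parameter values:

```python
# BSP cost model: total cost = sum over supersteps of (w + h*g + l),
# where w is max local work, h is max messages per processor,
# g is the communication gap and l is the barrier latency.

def bsp_cost(supersteps, g, l):
    """supersteps: list of (w, h) pairs; returns total predicted cost."""
    return sum(w + h * g + l for w, h in supersteps)

# Two supersteps on a machine with g = 4, l = 100 (arbitrary units):
cost = bsp_cost([(500, 20), (300, 5)], g=4, l=100)
assert cost == (500 + 20 * 4 + 100) + (300 + 5 * 4 + 100)
```

Because g and l are machine parameters measured once per platform, the same program cost expression can be re-evaluated for different hardware, which is a key selling point of the BSP model over PRAM's free-communication assumption.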
Ludmila Orlova HOW USE OF AGILE METHODOLOGY IN SOFTWARE DEVELO.docx – smile790243
Ludmila Orlova
HOW USE OF AGILE METHODOLOGY IN SOFTWARE DEVELOPMENT INFLUENCE AGILITY OF THE BUSINESS
Agile methodology is a widely used approach to software development. This article explores research data about the use of these tools, their influence on the quality of the end product and on development performance, and the overall agility of businesses and companies.
KEYWORDS:
Agile, software development, agile business
CONTENT
1 INTRODUCTION
2 AGILE SOFTWARE DEVELOPMENT
3 SCALING AGILE
4 AGILE BUSINESS
5 CONCLUSION
REFERENCES
1 INTRODUCTION
The fast pace of scientific progress in solid-state electronics led to incredible advances in computing devices, which in turn demanded software to control and manage the power of computer calculations and their usage.
Software engineering emerged in the middle of the 20th century and by its end had become a separate discipline, activity and profession for millions. There are about 18.2 million software developers worldwide, a number that is due to rise to 26.4 million by 2019, a 45% increase, says Evans Data Corp. in its latest Global Developer Population and Demographic Study (P. Thibodeau, 2013). Along with the growing number of software developers (software development firms, projects and people involved), the need for effective management of the software development process increased. This demanded new approaches and methodologies from business researchers and managers. In the last several decades a huge amount of research, in both the IT field and business management, has been dedicated to this area.
The popularity of agile software development methods began about a decade ago, and at present these methods are employed by many large, medium-sized and small companies. The still-growing attention to agile methods from software development specialists confirms that these methods filled a gap in management techniques for software development, a field that emerged and evolved extremely fast along with the speedy advancement of hardware in IT. A great deal of research has been done in areas such as changes in software development performance when using agile methods, and scaling agile for large companies and teams. One modern trend is the attempt to apply agile methodology to project management, marketing, sales and other activities. The goal of this article is to explore the influence of applying agile methods in software development on the agility of the whole company and business. The presented work is based on secondary data taken from multiple sources and is performed as an exploratory study and a review of existing research in the area.
2 AGILE SOFTWARE DEVELOPMENT
The definition of the adjective agile in English is: able to move quickly and easily, or able to think and understand quickly (Oxford Dictionary, 2015). Its most frequent contemporary use is captured by the following sense: relating to or denoting a method of project management, used especially for software development, that is characterized by the division of tasks into ...
AN ITERATIVE HYBRID AGILE METHODOLOGY FOR DEVELOPING ARCHIVING SYSTEMS – ijseajournal
With the massive growth of organizations' files, the need for archiving systems has become a must. A lot of time is consumed in collecting requirements from the organization to build an archiving system, and sometimes the system does not meet the organization's needs. This paper proposes a domain-based requirement engineering system that efficiently and effectively develops different archiving systems, based on a newly suggested technique that merges the two most widely used agile methodologies: extreme programming (XP) and SCRUM. The technique is tested on a real case study. The results show that the time and effort consumed during analysis and design of the archiving systems decreased significantly. The proposed methodology also reduces the system errors that may happen at the early stages of development.
Is Multicore Hardware For General-Purpose Parallel Processing Broken? : Notes – Subhajit Sahu
Highlighted notes on the article, taken while studying Concurrent Data Structures (CSE):
Is Multicore Hardware For General-Purpose Parallel Processing Broken?
By Uzi Vishkin
Communications of the ACM, April 2014, Vol. 57 No. 4, Pages 35-39
10.1145/2580945
IRJET – Analysis of Software Cost Estimation Techniques – IRJET Journal
This document analyzes and compares different software cost estimation techniques using machine learning algorithms. It uses the COCOMO and function point estimation models on NASA project datasets to test the performance of the ZeroR and M5Rules classifiers. The M5Rules classifier produced more accurate results with lower mean absolute errors and root mean squared errors compared to COCOMO, function points, and the ZeroR classifier. Therefore, the study suggests using M5Rules techniques to build models for more precise software effort estimation.
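The comparison metrics used in the study, mean absolute error and root mean squared error between actual and predicted effort, are standard and easy to sketch; the effort values below are invented to show the computation only:

```python
import math

# MAE and RMSE between actual and predicted effort values, the two
# error measures the study uses to compare estimation techniques.

def mae(actual, predicted):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean squared error; penalizes large misses more than MAE."""
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    )

# Invented person-month figures for three projects:
actual = [120.0, 60.0, 300.0]
predicted = [110.0, 75.0, 280.0]
print(mae(actual, predicted), rmse(actual, predicted))
```

A lower MAE and RMSE on held-out projects is what led the study to favor M5Rules over COCOMO, function points, and ZeroR.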
GENERATING SOFTWARE PRODUCT LINE MODEL BY RESOLVING CODE SMELLS IN THE PRODUC... – ijseajournal
Software Product Lines (SPLs) refer to software engineering methods, tools and techniques for creating a collection of similar software systems from a shared set of software assets using a common means of production. The concept is recognized as a successful approach to reuse in software development. Its purpose is to reduce production costs by reusing existing features and managing the variability between the different products with respect to particular constraints. Software product line engineering is the production process in product lines: the development of a family of systems by reusing core assets, exploiting the commonalities between software products while preserving the ability to vary the functionalities and features between them. The strategy adopted for building an SPL can be top-down or bottom-up. Depending on the selected strategy, an inappropriate implementation may appear in the SPL model or the derived products during this process, and the code can contain code smells or code anomalies. Code smells are problems in source code that can affect the quality of the products derived from an SPL; the same problem can be present in many derived products due to reuse, or in the product line obtained when the bottom-up strategy is selected. A possible solution is refactoring, which can improve the internal structure of source code without altering external behavior. This paper proposes an approach for building an SPL from source code using the bottom-up strategy. Its purpose is to reduce code smells in the obtained SPL by refactoring source code, using reverse engineering to obtain the feature model of the SPL.
A novel data type architecture support for programming languages – ijpla
From the programmer's point of view, datatypes at the programming language level have a simple description, but inside the hardware a huge amount of machine code is responsible for implementing type features. Datatype architecture design is a novel approach to matching programming-language features with hardware design. In this paper a novel Data type-Based Code Reducer (TYPELINE) architecture is proposed and implemented according to the significant data types (SDT) of programming languages. TYPELINE uses TEUs for processing various SDT operations. This architecture design reduces the number of machine instructions, increases execution speed, and also improves the level of parallelism, because the architecture supports executing some Abstract Data Type operations in parallel. It also ensures that data type features and entire application-level specifications are maintained via the proposed type conversion unit. The framework includes compiler-level identification of execution modes and a memory management unit that decreases object reads/writes in heap memory through ISA support. This energy-efficient architecture is fully compatible with object-oriented programming languages, and in combination mode it can process complex C++ data structures with parallel TYPELINE architecture support.
Productivity Factors in Software Development for PC Platform – IJERA Editor
Identifying the most relevant factors influencing project performance is essential for implementing business strategies by selecting and adjusting proper improvement activities. The two major classification algorithms, CRT and ANN, recommended by the Auto Classifier tool in SPSS Modeler, were used to determine the most important variables (attributes) of software development in the PC environment. While their accuracy in classifying productive versus non-productive cases is relatively close (72% vs 69%), their rankings of important variables differ: CRT ranks Programming Language as the most important variable and Function Points as the least important, whereas ANN ranks Function Points as the most important, followed by Team Size and Programming Language.
Innovation Agile Methodology towards DevOps – IRJET Journal
This document discusses the relationship between DevOps and agile methodologies for software development. It notes that while DevOps and agile are related concepts that both aim to improve efficiency, they differ in their specific focuses. Agile focuses on iterative development processes within software teams, while DevOps also encompasses collaboration between development and IT operations teams. The document argues that combining agile and DevOps approaches can provide benefits like continuous delivery and improved communication across teams. It also addresses some common challenges faced in transitioning to agile and DevOps cultures.
Extreme Programming (XP) is an agile software development methodology that advocates short development cycles, frequent code integration and testing, pair programming, and close customer collaboration. It aims to improve productivity and responsiveness to changing requirements. Key practices include test-driven development, where automated unit tests are written before code; pair programming; frequent communication between programmers and customers; and continuous integration of code changes. XP originated from Kent Beck's work on the Chrysler payroll project in the 1990s and emphasizes adapting practices like testing and code reviews "to the extreme."
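Test-driven development as XP prescribes it means the unit test exists before the production code and fails until that code satisfies it. A minimal sketch using Python's unittest; the tax-calculation example is invented for illustration:

```python
import unittest

def add_tax(amount, rate=0.08):
    """Production code, written only after the tests below existed."""
    return round(amount * (1 + rate), 2)

class AddTaxTest(unittest.TestCase):
    # In TDD these tests are written first and initially fail
    # ("red"), then add_tax is implemented to make them pass ("green").
    def test_applies_default_rate(self):
        self.assertEqual(add_tax(100.0), 108.0)

    def test_applies_custom_rate(self):
        self.assertEqual(add_tax(50.0, rate=0.1), 55.0)

unittest.main(argv=["tdd-sketch"], exit=False)
```

Running the suite on every change is what makes the continuous integration practice mentioned above safe: any regression is caught by a test that predates the code it guards.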
PROGRAMMING REQUESTS/RESPONSES WITH GREATFREE IN THE CLOUD ENVIRONMENT – ijdpsjournal
Programming requests with GreatFree is an efficient technique for implementing distributed polling in the cloud computing environment. GreatFree is a distributed programming environment through which diverse distributed systems can be established through programming rather than configuring or scripting. GreatFree emphasizes the importance of programming, since it offers developers the opportunity to leverage their distributed-systems knowledge and programming skills; programming is also the unique way to construct creative, adaptive and flexible systems that accommodate various distributed computing environments. With the support of GreatFree's code-level Distributed Infrastructure Patterns, Distributed Operation Patterns and APIs, the difficult procedure is accomplished in a programmable, rapid and highly patterned manner, i.e., programming behaviors are simplified to the repeatable operation of Copy-Paste-Replace. Since distributed polling is one of the fundamental techniques for constructing distributed systems, GreatFree provides developers with relevant APIs and patterns to program requests/responses in this novel programming environment.
Similar to International Journal of Engineering and Science Invention (IJESI) (20)
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
OpenID AuthZEN Interop Read Out - AuthorizationDavid Brossard
During Identiverse 2024 and EIC 2024, members of the OpenID AuthZEN WG got together and demoed their authorization endpoints conforming to the AuthZEN API
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Generating privacy-protected synthetic data using Secludy and MilvusZilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU und die Lizenzen nach dem CCB- und CCX-Modell sind für viele in der HCL-Community seit letztem Jahr ein heißes Thema. Als Notes- oder Domino-Kunde haben Sie vielleicht mit unerwartet hohen Benutzerzahlen und Lizenzgebühren zu kämpfen. Sie fragen sich vielleicht, wie diese neue Art der Lizenzierung funktioniert und welchen Nutzen sie Ihnen bringt. Vor allem wollen Sie sicherlich Ihr Budget einhalten und Kosten sparen, wo immer möglich. Das verstehen wir und wir möchten Ihnen dabei helfen!
Wir erklären Ihnen, wie Sie häufige Konfigurationsprobleme lösen können, die dazu führen können, dass mehr Benutzer gezählt werden als nötig, und wie Sie überflüssige oder ungenutzte Konten identifizieren und entfernen können, um Geld zu sparen. Es gibt auch einige Ansätze, die zu unnötigen Ausgaben führen können, z. B. wenn ein Personendokument anstelle eines Mail-Ins für geteilte Mailboxen verwendet wird. Wir zeigen Ihnen solche Fälle und deren Lösungen. Und natürlich erklären wir Ihnen das neue Lizenzmodell.
Nehmen Sie an diesem Webinar teil, bei dem HCL-Ambassador Marc Thomas und Gastredner Franz Walder Ihnen diese neue Welt näherbringen. Es vermittelt Ihnen die Tools und das Know-how, um den Überblick zu bewahren. Sie werden in der Lage sein, Ihre Kosten durch eine optimierte Domino-Konfiguration zu reduzieren und auch in Zukunft gering zu halten.
Diese Themen werden behandelt
- Reduzierung der Lizenzkosten durch Auffinden und Beheben von Fehlkonfigurationen und überflüssigen Konten
- Wie funktionieren CCB- und CCX-Lizenzen wirklich?
- Verstehen des DLAU-Tools und wie man es am besten nutzt
- Tipps für häufige Problembereiche, wie z. B. Team-Postfächer, Funktions-/Testbenutzer usw.
- Praxisbeispiele und Best Practices zum sofortigen Umsetzen
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Salesforce Integration for Bonterra Impact Management (fka Social Solutions A...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on integration of Salesforce with Bonterra Impact Management.
Interested in deploying an integration with Salesforce for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Building Production Ready Search Pipelines with Spark and Milvus
International Journal of Engineering and Science Invention (IJESI)
International Journal of Engineering Science Invention
ISSN (Online): 2319 – 6734, ISSN (Print): 2319 – 6726
www.ijesi.org Volume 2 Issue 7 ǁ July. 2013 ǁ PP.29-35
www.ijesi.org 29 | Page
Adopting Aspect Oriented Programming in Enterprise Wide
Computing
Dr. Debashis Jena¹, Dr. S.C. Das², Ambika Prasad Das³
¹ Department of Computer Science and Engineering, IIIT Bhubaneswar, India
² Department of IT Management, KIIT School of Management, Bhubaneswar, India
³ Department of IT Management, KIIT School of Management, Bhubaneswar, India
ABSTRACT: Aspect Oriented Programming (AOP) has received the attention of both the research community and industry experts since its inception at Xerox Palo Alto Research Center (PARC) in 1996. It supplements the Object Oriented Programming paradigm by separating cross-cutting concerns from business logic. MIT Technology Review in 2001 identified AOP as one of ten emerging technologies that would have a significant impact on the economy. However, even more than a decade after its inception as a programming paradigm, adoption of AOP in the IT industry in general, and the IT services industry in particular, is still in its infancy. This paper discusses some of the pain areas in the adoption of AOP in the IT industry.
KEYWORDS: adoption, Aspect Oriented Programming (AOP), cross cutting concerns, Object Oriented
Programming (OOP).
I. INTRODUCTION
In today's complex business scenario, managing IT projects presents several challenges. IT managers face risks arising from dependence on a single or preferred vendor, conflicting standards and designs, maintainability issues, and high costs due to the use of proprietary standards and tools. Therefore, more and more non-IT companies follow a collaborative model when outsourcing IT products and services that cater to their organization's needs. While collaborative software development reduces single-vendor dependency, it also adds complexity due to conflicting standards, rework caused by defects in collaborating partners' modules, and integration challenges.
The Object Oriented Programming (OOP) model revolutionized software development by allowing real-life objects to be modeled in programming languages, thereby reducing complexity and increasing modularity. However, as the size and complexity of software applications grew, modeling the behavior of software components and their modularization became increasingly important, and the separation between application programmers and system programmers became more and more distinct. Orthogonal concerns like transactions, security, logging and exception handling needed to be implemented in a distributed, heterogeneous environment, thereby separating enterprise-wide system concerns from programming logic. These concerns also needed to be increasingly robust, which is difficult to achieve within the limited timelines of a project.
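The separation of such an orthogonal concern from business logic can be sketched in plain Java. AspectJ is not used here; a JDK dynamic proxy plays the role of woven "before"/"after" advice, and all names (TransferService, withLogging) are illustrative assumptions rather than anything from this paper:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

// A business interface: the service itself knows nothing about logging.
interface TransferService {
    String transfer(String from, String to, int amount);
}

class SimpleTransferService implements TransferService {
    public String transfer(String from, String to, int amount) {
        return "moved " + amount + " from " + from + " to " + to;
    }
}

public class LoggingAspectDemo {
    // Log entries are collected here so the example is self-checking.
    static final List<String> LOG = new ArrayList<>();

    // Wrap any object so every interface call is logged before and after,
    // emulating AOP "before" and "after" advice without AspectJ weaving.
    @SuppressWarnings("unchecked")
    static <T> T withLogging(T target, Class<T> iface) {
        InvocationHandler handler = (proxy, method, args) -> {
            LOG.add("before " + method.getName());
            Object result = method.invoke(target, args);
            LOG.add("after " + method.getName());
            return result;
        };
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[]{iface}, handler);
    }

    public static void main(String[] args) {
        TransferService svc =
                withLogging(new SimpleTransferService(), TransferService.class);
        System.out.println(svc.transfer("A", "B", 100));
        System.out.println(LOG);
    }
}
```

The business class contains no logging code at all; the logging concern lives entirely in the handler and can be attached to any interface, which is the modularity benefit that AOP generalizes at the language level.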
In collaborative software development, many outsourcing partners get involved based on their domain competency, skill set, team size and other pressing business requirements. IT organizations have a tough time managing these diverse teams and ensuring project completion within time and budget. While there can be many contributing factors to project overruns, integration is one of the major issues in such projects.
Aspect Oriented Programming (AOP) helps in addressing cross-cutting concerns, which in turn is expected to reduce cost and complexity. The objective of this research is to identify pain areas in large-scale software development using enterprise-wide computing standards and to critically appraise scenarios where AOP can add value in terms of reducing cost, integration challenges and maintenance overheads.
II. HISTORY AND EVOLUTION OF AOP RESEARCH
Gregor Kiczales and colleagues at Xerox Palo Alto Research Center (PARC) developed the explicit
concept of AOP sometime between November 1995 and May 1996. AOP was based on a strong foundation of
prior work, but the existing terminology was not sufficient to describe what was being done. AOP revolutionized the way programming was done and soon attracted the interest of researchers around the world [2].
The RG Case Study
One of the early projects at PARC which contributed to the development of AOP as a paradigm was RG
(Reverse Graphics). RG is an image processing system that allows sophisticated image processing operations to
be defined by composing primitive image processing filters. An OOP implementation of RG was easy to write but had performance bottlenecks. These performance issues could not be addressed effectively in a
paradigm tied to object boundaries; rather, a mechanism was required in which complexities cutting across several modules could be handled as separate concerns. The paper on the RG case study [3] discusses how this limitation of OOP was overcome using aspect-oriented programming techniques. The RG case study demonstrated that AOP can reduce code complexity without sacrificing important performance requirements.
Annotated MatLab (AML)
The second project that contributed to the development of the AOP paradigm during this period was Annotated MatLab (AML) [5]. The problem addressed here was the optimization of certain MatLab programs, again focusing on memory usage and operation fusion. There were many discussions within the group at PARC about whether AML was AOP or not. The final language annotations did not match the other AOP systems at PARC, so AML did not gain much importance in AOP research. However, it served to identify features that should not be part of AOP.
Evaluation Time Control Meta Language (ECTML)
This work was carried out by John Lamping. He conceptualized a small system called Evaluation Time Control Meta Language (ECTML), the idea of which was to provide a set of directives that programmers could use to instruct the language processor when to evaluate certain parts of the code. This work was analogous to work on reflection, where the processor decides whether certain parts of the program are evaluated at compile time or at runtime. The gist of this work was that language processors are not always capable of determining the best time to evaluate these parts of the code, and explicit specification from the programmer would help the processors in this respect. This contributed another dimension of separation of concerns, enriching the AOP knowledge base.
DJ
DJ was different in its approach compared to RG, AML and ECTML. DJ was a Java package for adaptive programming. Adaptive programming enables the programmer to practice concern-shy programming. DJ made it easier to follow the Law of Demeter, loosening the coupling between structure and behavior concerns and adapting to changes in the object model [4].
AspectJ
AspectJ [6] was another milestone project in aspect-oriented research at PARC and was first made public in 1998. It is a simple and practical aspect-oriented extension to the Java language. An effort was made to make AspectJ a general-purpose AOP implementation, whereas its predecessors, DJ and RG, were concern-specific and domain-specific respectively. AspectJ was designed to be a compatible extension to Java in order to facilitate adoption by the existing Java programming community.
III. RELATED WORK
Since its inception, AOP researchers have been interested in the benefits that AOP brings in terms of software quality parameters. However, there is a significant research gap when it comes to the benefits and challenges of managing enterprise-wide computing and enterprise architecture by adopting AOP in place of plain OOP methodologies in a multi-vendor environment. There also seems to be little ongoing research on the costs and effects of AOP within the enterprise.
Kiczales et al. [1], in the proceedings of ECOOP 1997, identified the programming problems with OOP, where OOP alone fails to capture all the important design decisions that a program must implement. In fact, there are some programming problems that fit neither the OOP nor the procedural approach it replaces. The paper presents an example-driven account of the relevance of AOP in software architecture. The first example is the image processing work done in RG and an AOP-based reimplementation of the same. They found that the AOP-based reimplementation of the system met the original design goals, and the application code was easy to interpret and maintain in addition to being highly efficient. They acknowledged that it is difficult to measure the benefits of using AOP without a large experimental study involving multiple developers. They measured the effectiveness of AOP by considering the size of the tangled code (code without AOP), the component program size and the sum of the aspect program sizes. In this experiment they found extremely large gains as far as code size and complexity were concerned. However, there were no indications of the integration effort and complexity arising from using a full-fledged AOP implementation, or of the costs involved.
In the second example, they implemented a distributed digital library that stores documents in many forms and provides a wide range of operations on those documents. There they studied various aspects such as communication, where a higher-level aspect language was found to be more appropriate than traditional reflective access for controlling communication. One of the major goals that remained an open issue in this paper was the quantitative assessment of the utility of AOP. How much does AOP help in maintenance? Which applications, domains or industries is it most applicable to? These were still open questions.
Roger T. Alexander and James M. Bieman [8], in their paper 'Challenges of Aspect-oriented Technology', discussed some of the pain areas in implementing AOP. The paper focuses on certain aspects
like understandability, fault tolerance and cognitive burden, and stresses that there are grey areas when it comes to the effective implementation of AOP in practical situations. For example, in the case of fault tolerance, diagnosing the root cause involves examining not only the primary abstraction but also the woven aspects. This increases the overhead of using AOP effectively with realizable benefits. However, the complexity arising out of the weaving process was still not measurable with accuracy. Similarly, maintainability issues could not be addressed, because changes to the primary abstraction that forms the basis of a woven composition have the potential to require changes to the woven aspects.
Avadhesh Kumar, Rajesh Kumar and P.S. Grover studied the maintainability of aspect-oriented systems [10] and found that the average change impact in AO systems is less than that of OO systems, which suggests that maintainability is better in AO systems. However, they concluded that if, during reengineering, concerns that are not cross-cutting in nature are mined into aspects, the system becomes more complex and in turn less maintainable.
Zhao [9] did some work in this area of change impact analysis based on a program slicing technique. However, this was not applied to realistic systems.
Joon-Sang Lee and Doo-Hwan Bae in 2002 proposed an Aspect Oriented Development Framework (AODF) [11] in which functional behaviors are encapsulated in each component and connector, and particular non-functional requirements are tuned flexibly during software composition. AODF enables intra-component behavior within the component and inter-component behavior within the connector using a CB style. To illustrate how well this framework works, they used OpenJava, which provides reflective support at compile time. This work was not complete, but it shed some light on standards for writing feature components and on an architectural style for implementing intra-component, inter-component and non-functional behavior in component-based software development.
Roberta Coelho et al. in 2010 [12] identified some drawbacks of aspect-oriented programs when it comes to exception handling. In this work they presented a verification approach, based on a static analysis tool called SAFE, to check the reliability of exception handling code in AspectJ programs. They stressed the increased complexity arising from the additional code introduced by aspect weavers and its resolution by application programmers. Secondly, most AOP implementations work on the basis of inversion of control, where the aspect controls which classes it affects rather than the class itself deciding so. This means the advised code cannot be safe from exceptions flowing out of the aspects, which are challenging to debug. The AO paradigm also mandates that application and aspect code be developed in parallel. This leads to obliviousness on the part of the application programmer about the aspect code, and exceptions flowing out of aspects into the base code are not handled with ease by the application developer.
Jianjun Zhao studied the opportunities and challenges of AOP software maintenance [13]. He reiterated that AOP research has primarily focused on problem analysis, language design and implementation. Even though issues related to software maintenance are known, they have received little attention. AOP programs consist of base code and aspect code, the latter being woven into the base code to address cross-cutting concerns. Weaving is what makes AOP different from OO or procedural programs. Therefore, in order to maintain AO software effectively, new analysis and testing techniques are strongly needed.
Gail C. Murphy et al. studied the lessons learned from assessing AOP [14]. They found that when evaluating a software engineering methodology, the researcher must be aware of three factors that must be traded off against each other: validity, realism and cost. To study AOP, they applied two basic methods: a case study method and an experimental method. Since the technique under study was in its infancy, the case studies and experiments were largely exploratory, yielding qualitative insights into AOP and directions for further investigation. Overall, they found the case study approach more useful in evaluating how AOP helps ease some of the tasks of software developers.
Cristina Videira Lopes and Martin Lippert, in their paper 'A Study on Exception Detection and Handling Using Aspect-Oriented Programming' [15], studied the exception handling capabilities of AOP. Their work mainly focused on assessing AOP's capability to ease the code tangling involved in handling exceptions. They found that AspectJ helps reduce code related to exception detection and handling to a substantial extent; in one scenario they were able to reduce the exception-handling code by a factor of four. They also found that, in contrast to plain Java code, AspectJ provided better configurability of different exceptional behaviors and more tolerance to changes in specification. Their study also identified some drawbacks of AspectJ which can be addressed in future work.
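The kind of consolidation Lopes and Lippert measured can be illustrated in spirit without AspectJ: one interceptor applies a single catch-and-report policy to every method of a target, instead of scattering try/catch blocks through each caller. The DocumentStore interface and the fallback policy below are invented for this sketch, not taken from their study:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

// Illustrative business interface; its methods may throw at runtime.
interface DocumentStore {
    String fetch(String id);
}

class FlakyDocumentStore implements DocumentStore {
    public String fetch(String id) {
        if (id.isEmpty()) throw new IllegalArgumentException("empty id");
        return "doc:" + id;
    }
}

public class ExceptionAspectDemo {
    // Failures are recorded once, centrally, instead of in every caller.
    static final List<String> FAILURES = new ArrayList<>();

    // One handler plays the role of an exception-handling aspect: every
    // method of the target gets the same catch-report-fallback policy.
    @SuppressWarnings("unchecked")
    static <T> T withExceptionPolicy(T target, Class<T> iface, Object fallback) {
        InvocationHandler h = (proxy, method, args) -> {
            try {
                return method.invoke(target, args);
            } catch (InvocationTargetException e) {
                // The target threw: record it and substitute the fallback.
                FAILURES.add(method.getName() + ": " + e.getCause().getMessage());
                return fallback;
            }
        };
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[]{iface}, h);
    }

    public static void main(String[] args) {
        DocumentStore store = withExceptionPolicy(
                new FlakyDocumentStore(), DocumentStore.class, "missing");
        System.out.println(store.fetch("42"));
        System.out.println(store.fetch(""));
        System.out.println(FAILURES);
    }
}
```

Changing the handling policy means editing one handler rather than many call sites, which is the code-size reduction the study quantifies; the flip side, as Coelho et al. note above, is that exceptions now flow from code the application programmer did not write.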
Freddy Munoz et al. in 2009 conducted an empirical study on the usage of AOP [16]. In 2001, MIT had announced AOP as a key technology that would soon have a profound impact on the economy and on how we live and work [17]. However, AOP is not as widely adopted today as was predicted. This can be attributed to the lack of mature tools for analysis, maintenance and testing. In this paper they analyzed the usage of AOP in 38 open source aspect-oriented projects ranging from small (<5,000 LOC) to large (>20,000 LOC). Their objective was to find out to what extent developers use AOP, its invasiveness facilities and the coverage of the
4. Adopting Aspect Oriented Programming…
www.ijesi.org 32 | Page
AspectJ Point Cut Descriptor (PCD) language. They observed that developers use few advices to modularize cross-cutting concerns and that advices are scarcely cross-cutting. They also observed that developers write very few advices which break object-oriented encapsulation, and that the small number of invasive advices advise only a small number of specific join points.
They listed some possible reasons, which are interesting with respect to the research problem:
- Developers lack comprehensive knowledge of AOP and of the use of advice in modularizing concerns.
- Developers fail to recognize units which seem modular but cross-cut other modules.
- The AspectJ implementation is not flexible enough to allow developers to modularize the totality of cross-cutting concerns.
- The invasive capabilities of AspectJ, which should be used for modularizing cross-cutting concerns, are not used because they can introduce side effects.
These conclusions give an indication of the challenges in adopting AOP within an enterprise. Uirá Kulesza et al. studied the effects of AOP with respect to the maintenance of software [18]. The study involves a systematic comparison between object- and aspect-oriented versions of the same application in order to assess to what extent each solution provides maintainable software decompositions. The analysis is based on fundamental attributes of modularity such as coupling, cohesion, conciseness and separation of concerns. They found that AOP exhibits superior stability and reusability through the changes, since it results in fewer lines of code, improved separation of concerns, looser coupling and lower intra-component complexity.
Jörg Kienzle et al., in their paper on concurrency and failure in AOP [19], pointed out two important facts regarding the behavior of AOP. Firstly, AO languages, like any other macro language, can be beneficial for code factorization; however, code factorization using AOP should be done by experienced programmers. Secondly, concurrency and failures are concerns that are hard to aspectize, unlike concerns such as logging and exception handling; in their opinion, these are part of the phenomenon that objects should simulate. They used AspectJ and the OPTIMA transactional framework to conduct their research and observed the following as far as transactions using AOP are concerned:
- Aspectizing transactions by automatically applying transactions to previously non-transactional code is doomed to fail, because of transaction serialization and the incompatibility of the methods provided by shared objects.
- Separation of transactional interfaces from the rest of the program is possible using AOP. However, this separation may be redundant where the transactional aspect is actually part of the object it applies to, which can result in confusing code.
- AOP provides a mechanism whereby the application programmer can specify transactional attributes, as in the case of EJB. However, these are prone to error: a programmer can break the ACID properties by specifying a wrong attribute.
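The second observation above can be made concrete with a minimal around-advice sketch in plain Java (again via a dynamic proxy rather than AspectJ). The Ledger class and its snapshot/restore "transaction" are toy stand-ins invented here, not the OPTIMA framework:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Proxy;

// Toy in-memory state, so that rollback is observable.
class Ledger {
    int balance = 100;
}

interface LedgerOps {
    void withdraw(int amount);
}

class PlainLedgerOps implements LedgerOps {
    final Ledger ledger;
    PlainLedgerOps(Ledger l) { ledger = l; }
    public void withdraw(int amount) {
        ledger.balance -= amount;          // state change happens first...
        if (ledger.balance < 0)            // ...then the business rule may fail
            throw new IllegalStateException("overdraft");
    }
}

public class TxAspectDemo {
    // Around-advice emulation: snapshot state before the call ("begin"),
    // keep the new state on success ("commit"), restore it on failure
    // ("rollback"). The transactional concern stays out of PlainLedgerOps.
    static LedgerOps transactional(LedgerOps target, Ledger ledger) {
        InvocationHandler h = (proxy, method, args) -> {
            int snapshot = ledger.balance;          // begin
            try {
                return method.invoke(target, args); // proceed; commit
            } catch (InvocationTargetException e) {
                ledger.balance = snapshot;          // rollback
                throw e.getCause();
            }
        };
        return (LedgerOps) Proxy.newProxyInstance(
                LedgerOps.class.getClassLoader(),
                new Class<?>[]{LedgerOps.class}, h);
    }

    public static void main(String[] args) {
        Ledger ledger = new Ledger();
        LedgerOps ops = transactional(new PlainLedgerOps(ledger), ledger);
        ops.withdraw(30);
        try { ops.withdraw(500); } catch (IllegalStateException e) { /* rolled back */ }
        System.out.println(ledger.balance);
    }
}
```

Note how this already exhibits the redundancy Kienzle et al. describe: the advice must know about the Ledger's internal state to roll it back, so the "separated" transactional concern is really part of the object it applies to.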
IV. USE OF AOP IN ENTERPRISE
According to the MIT Technology Review of January/February 2001, ten emerging areas of technology would soon have a profound impact on the economy and on how we live and work [17]. One of these top ten emerging technologies was AOP. However, even today the use of AOP is not as widespread as was expected. This is an open area of research that needs to be investigated.
What is an enterprise?
The Open Group Architecture Framework (TOGAF) defines "enterprise" as any collection of organizations that has a common set of goals. For example, an enterprise could be a government agency, a whole corporation, a division of a corporation, a single department, or a chain of geographically distant organizations linked together by common ownership [20]. The common set of goals within the enterprise requires the enterprise architecture to consolidate a fragmented legacy of processes across the enterprise into an integrated environment, and this is where AOP fits in. For example, it might be required to record failed financial transactions across all divisions into a single integrated repository for reporting and analysis. Such requirements demand an enterprise architecture flexible enough to handle these concerns in a unified, seamless way, independent of the functional requirements.
Enterprise Architecture
As per The Open Group, the term "enterprise" in the context of "enterprise architecture" can be used to denote both an entire enterprise, encompassing all of its information and technology services, processes and infrastructure, and a specific domain within the enterprise. In both cases, the architecture crosses multiple systems and multiple functional groups within the enterprise [20]. Since the architecture must cross-cut multiple
5. Adopting Aspect Oriented Programming…
www.ijesi.org 33 | Page
systems and functional groups, separating the concerns from business functionality is of immense importance to
the foundation of any enterprise architecture. AOP addresses this need by providing a way to address orthogonal
concerns in a unified approach.
Using AOP in Enterprise Architecture
Separation of concerns is the biggest advantage AOP brings to the table, apart from reducing code
tangling and coupling and helping in architecture enforcement. Paulo Merson, in his work 'Using Aspect-Oriented
Programming to Enforce Architecture' [21], has stressed the importance of architecture enforcement using AOP.
He demonstrated how coding policies, best practices and even naming conventions can be enforced using AOP.
Architects may be confident in the architectural artifacts they produce, but may not have
enough checks and means to ensure these are adhered to in their entirety. While architectural constraints can be
enforced using AOP, the cost in terms of system processing and development time can be significant. However,
some basic architectural principles can certainly be imposed using AOP. AOP also adds to the quality of the
architecture by ensuring properties such as modularity and maintainability.
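AspectJ can enforce such policies at compile time (for example via declare error). As a rough plain-Java analogue, assumed here purely for illustration and not taken from Merson's work, a registration-time check can flag classes that violate a naming convention:

```java
import java.util.List;
import java.util.stream.Collectors;

// Marker interface and two implementations; all names are invented here.
interface Repository {}

class CustomerRepository implements Repository {}   // follows the convention
class CustomerStore implements Repository {}        // violates it

public class NamingPolicyDemo {
    // Returns the names of Repository implementations that break the
    // (assumed) "*Repository" naming convention.
    static List<String> violations(List<Class<?>> classes) {
        return classes.stream()
                .filter(c -> Repository.class.isAssignableFrom(c) && !c.isInterface())
                .filter(c -> !c.getSimpleName().endsWith("Repository"))
                .map(Class::getSimpleName)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(
                violations(List.of(CustomerRepository.class, CustomerStore.class)));
        // prints [CustomerStore]
    }
}
```

A compile-time aspect catches such violations earlier and with no runtime cost, which is precisely the trade-off against system processing time mentioned above.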
Industry Adoption of AOP
There are many successful case studies of AOP adoption in the enterprise. Motorola
developed WEAVR, an aspect-oriented modeling engine, in light of the particular needs of the telecom systems
engineering industry [25]. The Networks and Enterprise business unit, one of Motorola's core business units,
has successfully adopted AOP, and development teams within the unit use the WEAVR framework in
production. It is interesting to note that, at present, adoption is limited to simple aspects such as tracing and
timeout concerns, and WEAVR is still in its infancy.
HP successfully adopted AOP in C++ frameworks used in the development of VLSI CAD applications
[27]. The results show reduced code size and improved modularity from adopting AOP.
Siemens adopted AOP by developing an Eclipse-based plug-in that provides access to an aspect
repository [26]. The Siemens team demonstrated the benefits of AOP by taking an ordinary Java application and
aspectizing it for better architectural qualities.
SAP is the world's largest enterprise software vendor, so SAP's research on AOP is highly
relevant to the adoption of AOP in large enterprise systems. Christoph Pohl et al. analyzed how AOP
could help SAP tackle hard development problems and which roadblocks could prevent its adoption [24].
They examined the Enhancement Framework, developed using ABAP, to assess the adoption issues related to
AOP. They observed that AOP has been used in a couple of industrial projects at SAP; however, its adoption
fell short of expectations in large-scale enterprise applications. The reasons are often a combination of
social, technical, psychological and commercial considerations that are worth investigating. They
identified the following factors that inhibit AOP adoption in enterprise computing.
Social and Psychological Factors – A survey in internal SAP newsgroups indicated a lack of awareness of AOP;
consequently, AOP is not considered as an option in many enterprise software projects.
Optimal adoption of AOP requires experienced professionals, which makes the situation worse. Since ABAP
does not mandate object-oriented programming, applying AOP to an ABAP codebase requires senior personnel,
which the organization cannot always justify.
Developers also often feel a loss of control when AOP is introduced, because aspect code might affect their base
code, leaving them responsible for errors they cannot influence.
Technical Factors – Understanding runtime flow involving various aspects requires powerful tool support for
tracing, verifying and debugging. While some implementations are available for Java, tool
support in ABAP is not very strong. Secondly, the risks arising from aspect-based coupling and code
interactions grow when third-party software vendors are involved. Strict governance models are
required to avoid side effects.
Economic Factors – It is genuinely difficult to calculate the difference in ROI between projects with and without
AOP. The ROI calculation for AOP platform extensions such as the Enhancement Framework is even more
difficult.
However, these are case studies only from the IT products sector. The penetration of AOP in the services sector
is still scarce, and this is an open research question. Some of the factors contributing to the low penetration of
AOP within the enterprise are listed below.
Pain areas in Adopting AOP within the enterprise
There can be many contributing factors that prevent optimal adoption of AOP within the enterprise. Some
of the reasons are listed below.
Awareness: Though AOP was identified by MIT in 2001 as one of the technologies that would impact the
economy as a whole over the next decade, it never gained the proliferation and importance among
researchers and industry leaders that predecessors such as OOP enjoyed. AOP emerged from an organization
that specializes in image processing, so for quite some time it was perhaps perceived as being
specific to the image-processing domain. Many programmers still seem to be oblivious to AOP as a
programming paradigm.
Lack of a common AOP framework: The major industrial applications of AOP were Microsoft Transaction
Server and Enterprise JavaBeans, which were specific to a certain category of business applications, namely
middleware components. There was no universal AOP framework that would cater to all
requirements. Though there were a couple of open-source initiatives such as AspectJ and Spring AOP, their
adoption within the enterprise was not widespread owing to lack of support and to organizational policies (for
example, some enterprises restrict the use of open-source tools because of the complications that arise around
support).
Steep learning curve: Though AOP scored well on separation of concerns, adoption remained low
owing to the complexity of learning it. Weaving aspects into business functionality required
experienced professionals who understood not only the nuts and bolts of AOP but also how to use them
optimally. Troubleshooting aspect-oriented code also required deeper insight into the framework being
used, and scarcely documented frameworks were difficult for the wider programming community to adopt.
Lack of adequate literature: Though AOP has been received with enthusiasm by the research
community, there is a lack of adequate literature. The frameworks implementing AOP had their own
documentation, whereas standards and literature concerning AOP as a programming
paradigm were less developed. Its predecessors such as OOP were more mature in terms of available
literature.
Difficulty in quantifying the benefits of AO implementation: Whenever outsourcing companies take up
a new technology or paradigm, they need to justify the quantifiable benefits to the business. Although
AOP as a concept has existed in the programming arena for quite some time, there appears to be a lack of
standard tools and frameworks that can quantify the benefits of optimal AOP adoption in a project.
Legal issues: Aspects can be a security risk because they have control over all join points in the
system. When program modules dealing with sensitive data such as payroll or financial transactions are
penetrated by aspect code, this poses a security problem and may violate regulatory compliance norms such as
SOX and Basel II. In this context, AOP needs to move from simple join points to more sophisticated
encapsulation and security models that make sensitive join points unavailable to pointcuts [24].
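To make the risk concrete, the sketch below uses a plain-Java dynamic proxy as a stand-in for an aspect; the PayrollService interface and the data are invented for illustration. It shows how advice attached to a sensitive join point can copy arguments and return values out of the module without the module's knowledge:

```java
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sensitive service; all names and values are invented.
interface PayrollService {
    int salaryOf(String employee);
}

class PayrollServiceImpl implements PayrollService {
    public int salaryOf(String employee) {
        return 90000;   // stand-in for confidential data
    }
}

public class LeakyAspectDemo {
    // A careless or malicious aspect sees every argument and return
    // value at the join points it advises -- here it copies them out.
    static final List<String> captured = new ArrayList<>();

    static PayrollService withLeakyAdvice(PayrollService target) {
        return (PayrollService) Proxy.newProxyInstance(
                PayrollService.class.getClassLoader(),
                new Class<?>[] { PayrollService.class },
                (proxy, method, args) -> {
                    Object result = method.invoke(target, args);
                    // Sensitive data leaves the payroll module unnoticed.
                    captured.add(args[0] + "=" + result);
                    return result;
                });
    }

    public static void main(String[] args) {
        withLeakyAdvice(new PayrollServiceImpl()).salaryOf("alice");
        System.out.println(captured);   // prints [alice=90000]
    }
}
```

The caller and the payroll module both behave normally, which is why such leakage is hard to detect and why restricting which join points are advisable matters for compliance.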
Multi-vendor integration: In multi-vendor projects, integration of aspect code becomes a
challenge. In the absence of clear-cut specifications for AOP tools, frameworks and methodologies,
integrating AOP modules can be difficult. Moreover, one vendor's aspect code cutting through another
vendor's module can often cause confusion and yield undesirable results.
V. CONCLUSION
Aspect-Oriented Programming has been a topic of research for more than a decade now. According to the MIT
Technology Review of January/February 2001, AOP was one of the ten emerging areas of technology that
would soon have a profound impact on the economy and on how we live and work [17]. However, after almost a
decade, the adoption of AOP in the services industry has been very cautious and scarce, and this is a research
area which has received little attention from the research community. Adoption of AOP has been observed to a
moderate degree in the IT products sector; however, penetration into the services sector is still scarce or
unknown. Much work has been done on quantifying the quality metrics that AOP improves, but
there is almost no research into the outsourcer's perspective on adopting AOP within the enterprise.
Microsoft Transaction Server and Enterprise JavaBeans, the first major industrial applications of
AOP, are still not used to the extent expected for AOP penetration to be optimal and meaningful.
Complex enterprise applications are still developed without using Java EE features such as container-managed
transactions and security. Outsourcers seem to be concerned with the end product, not the means.
There are a few success stories of AOP adoption by enterprises such as Motorola [25], Siemens [26], HP
[27] and SAP [24]. However, large-scale adoption of AOP in enterprise-wide computing is either scarce or
unknown. This needs to be investigated and corrective measures suggested so that AOP adoption can become
optimal.
REFERENCES
[1]. Gregor Kiczales, John Lamping, Anurag Mendhekar, Chris Maeda, Cristina Videira Lopes, Jean-Marc Loingtier, John Irwin - Aspect-Oriented Programming - European Conference on Object-Oriented Programming (ECOOP), Finland, Springer-Verlag LNCS 1241, June 1997, 2-3
[2]. Cristina Videira Lopes - Aspect Oriented Programming: An Historical Perspective - Institute of Software Research, University of California, Irvine, December 2002, 9-10
[3]. Gregor Kiczales, John Lamping, Anurag Mendhekar - RG: A Case Study for Aspect-Oriented Programming - February 1997, 12-13
[4]. Karl Lieberherr, David H. Lorenz - Coupling Aspect-Oriented and Adaptive Programming - April 2003, 1-2
[5]. John Irwin, Jean-Marc Loingtier, John R. Gilbert, Gregor Kiczales, John Lamping, Anurag Mendhekar, Tatiana Shpeisman - Aspect-Oriented Programming of Sparse Matrix Code - December 1997, 2-7
[6]. Gregor Kiczales, Eric Hilsdale, Jim Hugunin, Mik Kersten, Jeffrey Palm, William G. Griswold - An Overview of AspectJ - June 2001, 1-9
[7]. Xin Ma, Lian-he Yang - AOP Research based on Enterprise Application - World Academy of Science and Technology, 2008
[8]. Roger T. Alexander, James M. Bieman - Challenges of Aspect Oriented Technology - Workshop on Software Quality, Florida, 2002, 1-3
[9]. Jianjun Zhao - Change Impact Analysis for Aspect Oriented Software Evolution - Department of Computer Science and Technology, Fukuoka Institute of Technology, 2002, 3-5
[10]. Avadhesh Kumar, Rajesh Kumar, P.S. Grover - Maintainability of Aspect Oriented Software Systems - 2006, 6-7
[11]. Joon-Sang Lee, Doo-Hwan Bae - An aspect-oriented framework for developing component-based software with the collaboration-based architectural style - August 2002, 7-9
[12]. Roberta Coelho, Arndt von Staa, Uirá Kulesza, Awais Rashid, Carlos Lucena - Unveiling and taming liabilities of aspects in the presence of exceptions: A static analysis based approach - June 2010, 1-4
[13]. Jianjun Zhao - Maintenance Support for Aspect-Oriented Programs: Opportunities and Challenges - 2008, 2, 6
[14]. Gail C. Murphy, Robert J. Walker, Elisa L.A. Baniassad - Evaluating Emerging Software Development Technologies: Lessons Learned from Assessing Aspect-Oriented Programming - August 1999, 1-4
[15]. Cristina Videira Lopes, Martin Lippert - A Study on Exception Detection and Handling Using Aspect-Oriented Programming - 2000, 10-11
[16]. Freddy Munoz, Benoit Baudry, Romain Delamare, Yves Le Traon - Inquiring the Usage of Aspect-Oriented Programming: An Empirical Study - 2009, 2-4
[17]. http://www.ccs.neu.edu/research/demeter/aop/publicity/mit-tech-review.html - visited on 10-Dec-2012
[18]. Uirá Kulesza, Cláudio Sant'Anna, Alessandro Garcia, Roberta Coelho, Arndt von Staa, Carlos Lucena - Quantifying the Effects of Aspect-Oriented Programming: A Maintenance Study - 2008
[19]. J. Kienzle, R. Guerraoui - AOP: Does it Make Sense? The Case of Concurrency and Failures - Proc. ECOOP'02, 2002, 2-5
[20]. The Open Group Architecture Framework (TOGAF) - Version 9.0
[21]. Paulo Merson - Using Aspect-Oriented Programming to Enforce Architecture - Software Engineering Institute, 2007
[22]. Ramnivas Laddad - http://www.ibm.com/developerworks/java/library/j-aopwork15 - Feb 2006, visited on 01-Jan-2013
[23]. Kim Mens, Tom Tourwe - Evolution Issues in Aspect-Oriented Programming
[24]. Christoph Pohl, Anis Charfi, Wasif Gilani, Steffen Göbel, Birgit Grammel, Henrik Lochmann, Andreas Rummler, Axel Spriestersbach - Adopting Aspect-Oriented Software Development in Business Application Engineering - AOSD Industry Track 2008, Brussels, Belgium
[25]. T. Cottenier, A. van den Berg, T. Elrad - The Motorola WEAVR: Model Weaving in a Large Industrial Context - AOSD, 2007, 9-10
[26]. D. Wiese, R. Meunier, U. Hohenstein - How to Convince Industry of AOP - AOSD, 2007, 7-8
[27]. M. Mortensen, S. Ghosh - Using Aspects with Object-Oriented Frameworks - AOSD, 2006, 1-4