Abstract. Software engineering is a comparatively new and constantly changing field. The challenge of meeting strict project schedules while delivering high-quality software requires that software engineering be automated to a large extent and that human intervention be reduced to an optimum level. To achieve this goal, researchers have explored the potential of machine learning approaches, which are adaptable and capable of learning. In this paper, we examine how genetic algorithms (GAs) can be used to build tools for software development and maintenance tasks.
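As a minimal illustration of the kind of tool the paper has in mind, the sketch below uses a GA to select a subset of regression tests that maximizes statement coverage within a runtime budget. The coverage sets, costs, and GA settings are all illustrative assumptions, not drawn from any real project or from the paper itself.

```python
import random

# Hypothetical data: each test is (covered statements, runtime cost).
TESTS = [({"a", "b"}, 2), ({"b", "c"}, 1), ({"c", "d"}, 3), ({"d", "e"}, 2)]
BUDGET = 5  # total runtime we can afford

def fitness(bits):
    """Number of statements covered by the selected tests; 0 if over budget."""
    chosen = [t for t, b in zip(TESTS, bits) if b]
    if sum(cost for _, cost in chosen) > BUDGET:
        return 0
    return len(set().union(*(cov for cov, _ in chosen))) if chosen else 0

def evolve(pop_size=20, gens=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in TESTS] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        def pick():  # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, len(TESTS))   # single-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:               # bit-flip mutation
                i = rng.randrange(len(TESTS))
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
        best = max([best] + pop, key=fitness)    # keep best-so-far
    return best

best = evolve()
```

The same chromosome/fitness/operator skeleton carries over to other software engineering tasks (scheduling, cost-model calibration); only the encoding and the fitness function change.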
COMPARATIVE STUDY OF SOFTWARE ESTIMATION TECHNIQUES (ijseajournal)
Many information technology firms, among other organizations, work on estimating resources such as funding and staffing during software development. A software development life cycle involves many activities and skills needed to avoid risk, so the best-suited software estimation technique should be employed. This research therefore presents a comparative study that considers the accuracy, usage, and suitability of existing methods, intended to support project managers and consultants throughout the software project development process. Both algorithmic and non-algorithmic techniques are examined: model-based, composite, and regression techniques underlie COCOMO, COCOMO II, SLIM, and multiple linear regression respectively, while expertise-based and rule-based approaches form the non-algorithmic methods. These techniques still need refinement to reduce the errors experienced during software development. This paper therefore proposes a software estimation model that can help information technology firms, researchers, and other organizations that rely on information technology in budgeting and decision-making.
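For reference, the basic COCOMO form that the surveyed algorithmic techniques build on fits in a few lines. The coefficients are Boehm's published 1981 values for the three project modes; the 10 KLOC sample project is illustrative.

```python
# Basic COCOMO (Boehm, 1981): effort = a * KLOC^b person-months,
# duration = c * effort^d months, with (a, b, c, d) per project mode.
MODES = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    a, b, c, d = MODES[mode]
    effort = a * kloc ** b        # person-months
    duration = c * effort ** d    # calendar months
    return effort, duration

effort, months = basic_cocomo(10, "organic")  # ~27 PM, ~8.7 months
```

COCOMO II and SLIM refine this shape with cost drivers and scale factors, but the size-to-effort power law is the common core.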
ENSEMBLE REGRESSION MODELS FOR SOFTWARE DEVELOPMENT EFFORT ESTIMATION: A COMP... (ijseajournal)
As demand for computer software continually increases, software scope and complexity grow higher than ever, and the software industry is in real need of accurate estimates for projects under development. Software development effort estimation is one of the main processes in software project management, and both overestimation and underestimation can cause the industry losses. This study determines which technique has better effort prediction accuracy and proposes combined techniques that could provide better estimates. Eight different ensemble models for estimating effort were compared with one another based on predictive accuracy under the Mean Absolute Residual (MAR) criterion and statistical tests. The results indicate that the proposed ensemble models deliver high efficiency in contrast to their counterparts and produce the best responses for software project effort estimation. The proposed ensemble models should therefore help project managers deliver quality software.
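A sketch of the combination step such a study evaluates: average the outputs of several base effort models and score the result with the Mean Absolute Residual. Both base models and all project figures below are hypothetical stand-ins, not the paper's models or data.

```python
def mar(actual, predicted):
    """Mean Absolute Residual: average of |actual - predicted|."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def ensemble_predict(models, x):
    """Simple averaging combiner over base model predictions."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

size_kloc = [5, 12, 30]            # project sizes (illustrative)
actual_pm = [14, 33, 85]           # actual efforts in person-months (illustrative)

linear_rule = lambda k: 2.8 * k    # hypothetical base model 1: size-based rule
flat_base = lambda k: 40.0         # hypothetical base model 2: flat baseline

ens = [ensemble_predict([linear_rule, flat_base], k) for k in size_kloc]
```

Real ensembles weight or stack stronger base regressors, but the evaluation loop, predict then compare by MAR, is the same.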
A DECISION SUPPORT SYSTEM FOR ESTIMATING COST OF SOFTWARE PROJECTS USING A HY... (ijfcstjournal)
One of the major challenges in software engineering today is software cost estimation: estimating the cost of all activities, including software development, design, supervision, and maintenance. Accurate cost estimation of software projects lets internal and external processes, staff work, effort, and overheads be coordinated with one another. In managing software projects, estimation must be taken into account so as to reduce costs, schedule overruns, and the risks that lead to project failure. In this paper, a decision support system combining a multi-layer artificial neural network with a decision tree is proposed to estimate the cost of software projects. In the model embedded in the proposed system, factor normalization, which is vital in evaluating effort and cost estimates, is carried out using a C4.5 decision tree; training and testing of the factors are then performed by the multi-layer artificial neural network, which assigns them near-optimal values. Experimental results and evaluations on the NASA60 dataset show that the proposed system yields a lower total average relative error than the COCOMO model.
EARLY STAGE SOFTWARE DEVELOPMENT EFFORT ESTIMATIONS – MAMDANI FIS VS NEURAL N... (cscpconf)
Accurately estimating software size, cost, effort, and schedule is probably the biggest challenge facing software developers today. It has major implications for the management of software development, because both overestimates and underestimates directly damage software companies. Many models have been proposed over the years by various researchers for carrying out effort estimation, and several studies of early-stage effort estimation underline its importance. New paradigms offer alternatives for estimating software development effort, in particular Computational Intelligence (CI), which exploits mechanisms of interaction between humans and processes' domain knowledge with the intention of building intelligent systems (IS). Among IS techniques, artificial neural networks and fuzzy logic are the two most popular soft computing approaches for software development effort estimation. In this paper, neural network models and a Mamdani FIS model are used to predict early-stage effort on a student dataset. The Mamdani FIS was found to predict early-stage effort more efficiently than the neural network models.
In the present paper, the applicability and capability of AI techniques for effort estimation prediction have been investigated. Neuro-fuzzy models prove very robust: they compute quickly and handle distorted data well, and given the non-linearity in the data they are an efficient quantitative tool for predicting effort. A one-hidden-layer network, named OHLANFIS, has been developed in the MATLAB simulation environment. The initial parameters of the OHLANFIS are identified using the subtractive clustering method, and the parameters of the Gaussian membership functions are determined optimally using a hybrid learning algorithm. The analysis shows that the effort estimation prediction model developed with the OHLANFIS technique outperforms the standard ANFIS model.
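The Gaussian membership function at the heart of ANFIS-style models is simple to state. In the paper its center and width are tuned by the hybrid learning algorithm; the values below are arbitrary illustrations.

```python
import math

def gauss_mf(x, c, sigma):
    """Gaussian membership: exp(-(x - c)^2 / (2 * sigma^2)), in (0, 1]."""
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

# Membership of a project-size value in a fuzzy set centered at 5 KLOC:
m = gauss_mf(4.0, c=5.0, sigma=2.0)   # close to the center -> high membership
```

Subtractive clustering seeds the centers `c` from the data; the hybrid learning step then adjusts `c` and `sigma` to minimize prediction error.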
SCHEDULING AND INSPECTION PLANNING IN SOFTWARE DEVELOPMENT PROJECTS USING MUL... (ijseajournal)
This paper presents a Multi-objective Hyper-heuristic Evolutionary Algorithm (MHypEA) for the solution of scheduling and inspection planning in software development projects. Scheduling and inspection planning is a vital problem in software engineering whose objective is to assign people to the various activities of the software development process, such as coding, inspection, testing, and rework, in such a way that the quality of the software product is maximized while the project makespan and cost are minimized. The problem becomes challenging when the project is large. The MHypEA is an effective metaheuristic search technique for suggesting scheduling and inspection plans. It incorporates twelve low-level heuristics based on different selection, crossover, and mutation operations of evolutionary algorithms, and the mechanism for selecting a low-level heuristic is based on reinforcement learning with adaptive weights. The efficacy of the algorithm has been studied on randomly generated test problems.
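The adaptive-weight selection idea can be sketched as follows: each low-level heuristic carries a weight, a heuristic that improves the incumbent solution is reinforced, and the rest decay. The reward (+1.0), decay factor (0.9), and floor (0.1) here are illustrative assumptions, not the paper's constants.

```python
import random

class HeuristicSelector:
    """Reinforcement-learning selection over low-level heuristics."""

    def __init__(self, n_heuristics, seed=0):
        self.weights = [1.0] * n_heuristics
        self.rng = random.Random(seed)

    def select(self):
        """Roulette-wheel choice proportional to current weights."""
        r = self.rng.uniform(0, sum(self.weights))
        acc = 0.0
        for i, w in enumerate(self.weights):
            acc += w
            if r <= acc:
                return i
        return len(self.weights) - 1

    def update(self, i, improved):
        if improved:
            self.weights[i] += 1.0                              # reinforce
        else:
            self.weights[i] = max(0.1, self.weights[i] * 0.9)   # decay, floored
```

In the outer loop, `select()` picks which crossover/mutation/selection combination to apply next, and `update()` feeds back whether the move improved the Pareto front.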
A Model To Compare The Degree Of Refactoring Opportunities Of Three Projects ... (acijjournal)
Refactoring is applied to software artifacts to improve their internal structure while preserving their external behavior. Refactoring is an uncertain process, and it is difficult to assign it units of measurement; the amount of refactoring that can be applied to source code depends on the skills of the developer. In this research, we treat refactoring as a quantity measured on an ordinal scale and propose a model for determining the degree of refactoring opportunity in given source code. The model is applied to three projects collected from a company. UML diagrams are drawn for each project, and the source-code metrics useful for judging code quality are calculated for each UML diagram. Based on the nominal metric values, each relevant UML diagram is placed on an ordinal scale. The machine learning tool Weka is then used to analyze the dataset produced by the three projects, imported as an ARFF file.
Review on Algorithmic and Non Algorithmic Software Cost Estimation Techniques (ijtsrd)
Effective software cost estimation is among the most challenging and important activities in software development. Developers want a simple and accurate method of effort estimation, yet estimating cost before work starts is a prediction, and predictions are not always accurate. Software effort estimation is a critical task in software engineering, and a suitable estimation technique is crucial to controlling quality and efficiency. This paper reviews the available software effort estimation methods, focusing on algorithmic and non-algorithmic models; the existing methods are illustrated and their aspects discussed. No single technique is best for all situations, so a careful comparison of the results of several approaches is most likely to produce realistic estimates. The paper provides a detailed overview of existing software cost estimation models and techniques, presents the strengths and weaknesses of the various methods, and examines some of the relevant causes of inaccurate estimation. Pa Pa Win, War War Myint, Hlaing Phyu Phyu Mon, and Seint Wint Thu, "Review on Algorithmic and Non-Algorithmic Software Cost Estimation Techniques," International Journal of Trend in Scientific Research and Development (IJTSRD), ISSN 2456-6470, Volume 3, Issue 5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd26511.pdf; paper page: https://www.ijtsrd.com/engineering/-/26511/review-on-algorithmic-and-non-algorithmic-software-cost-estimation-techniques/pa-pa-win
A Novel Optimization towards Higher Reliability in Predictive Modelling towar... (IJECEIAES)
Although the area of software engineering has made remarkable progress in the last decade, less attention has been paid to code reusability. Code reusability is a subset of software reusability, one of the signature topics in software engineering. Our review of existing work finds no standard research approach toward code reusability introduced in the last decade. Hence, this paper introduces a predictive framework for optimizing the performance of code reusability. For this purpose, we model a near-real-time case study, applying a neural network and the damped least-squares algorithm to perform optimization, with the sole target of computing and ensuring the highest possible reliability. The study outcome of our model exhibits higher reliability and better computational response time.
TOWARDS PREDICTING SOFTWARE DEFECTS WITH CLUSTERING TECHNIQUES (ijaia)
The purpose of software defect prediction is to improve the quality of a software project by building a predictive model that decides whether a software module is fault-prone. Much research applying machine learning techniques to this topic has been performed in recent years. Our aim was to evaluate the performance of clustering techniques combined with feature selection schemes for software defect prediction. We analysed the National Aeronautics and Space Administration (NASA) benchmark datasets using three clustering algorithms: (1) Farthest First, (2) X-Means, and (3) self-organizing maps (SOM). To evaluate different feature selection algorithms, this article presents a comparative analysis of software defect prediction based on the Bat algorithm, Cuckoo Search, the Grey Wolf Optimizer (GWO), and particle swarm optimization (PSO). The results obtained with the proposed clustering models enabled us to build an efficient predictive model with a satisfactory detection rate and an acceptable number of features.
Software Defect Trend Forecasting In Open Source Projects using A Univariate ... (CSCJournals)
Our objective in this research is to give project managers, business owners, and developers an effective way to forecast the trend in software defects within a project in real time. With a mechanism for forecasting defects, these stakeholders can provide the necessary resources at the right time to remove defects before they accumulate and ultimately lead to software failure. We show not only general trends in several open-source projects but also trends in daily, monthly, and yearly activity. Our research shows that this forecasting method can be used up to six months out with an MSE of only 0.019. In this paper, we present our techniques and methodologies for developing the inputs to the proposed model and the results of testing on seven open-source projects, and we discuss the prediction models, their performance, and the implementation using the FBProphet framework and the ARIMA model.
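The MSE criterion quoted above is easy to reproduce. As a minimal sketch, a naive 3-point moving-average forecast over a made-up weekly defect series is scored the same way; the real study uses FBProphet and ARIMA rather than this naive model.

```python
def mse(actual, predicted):
    """Mean squared error between observed and forecast values."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

defects = [4, 6, 5, 7, 8, 6, 9]   # hypothetical weekly defect counts
window = 3

# Forecast each week as the mean of the previous `window` weeks.
preds = [sum(defects[i - window:i]) / window
         for i in range(window, len(defects))]

score = mse(defects[window:], preds)
```

Any forecaster slotted in place of the moving average, ARIMA, FBProphet, or otherwise, is evaluated by the same actual-vs-predicted comparison.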
Function Point Software Cost Estimates using Neuro-Fuzzy technique (ijceronline)
Estimation accuracy is among the greatest challenges for software developers. Because a neuro-fuzzy system can approximate non-linear functions with high precision, it is used here as a soft computing approach that builds a model by learning the underlying relationship from its training data. The approach presented in this paper is independent of the nature and type of estimation. Function points are used as the algorithmic model, and an attempt is made to validate the soundness of the neuro-fuzzy technique using ISBSG and NASA project data.
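The function point model used as the algorithmic baseline can be sketched with the standard IFPUG average-complexity weights; the transaction/file counts and total degree of influence (TDI) below are illustrative, not taken from the ISBSG or NASA data.

```python
# IFPUG average-complexity weights for the five function types:
# external inputs/outputs/inquiries, internal/external logical files.
AVG_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def function_points(counts, tdi):
    """Adjusted function points from type counts and TDI (0..70)."""
    ufp = sum(AVG_WEIGHTS[k] * v for k, v in counts.items())  # unadjusted FP
    vaf = 0.65 + 0.01 * tdi       # value adjustment factor
    return ufp * vaf

fp = function_points({"EI": 10, "EO": 8, "EQ": 6, "ILF": 4, "EIF": 2}, tdi=35)
```

In the neuro-fuzzy setting, counts like these form the input features and the network learns the mapping from function points to effort instead of using a fixed productivity constant.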
Insights on Research Techniques towards Cost Estimation in Software Design (IJECEIAES)
Software cost estimation is one of the most challenging tasks in project management, needed to ensure smooth development operation and target achievement. Various standard tools and techniques for cost estimation are practiced in industry today, yet the overall effectiveness of these techniques has never been investigated. This paper begins by presenting taxonomies of conventional cost estimation techniques and then investigates research trends in the problems they most frequently address. The paper also reviews the existing techniques in a well-structured manner, highlighting the problems addressed, the techniques used, their advantages, and the limitations reported in the literature. Finally, we outline the open research issues explored, as an added contribution of this manuscript.
Automatically Estimating Software Effort and Cost using Computing Intelligenc... (cscpconf)
In the IT industry, precisely estimating the effort, development cost, and schedule of each software project matters greatly to a software company, so precise estimation of manpower is becoming ever more important. In the past, IT companies estimated work effort through human experts using statistical methods, but the outcomes rarely satisfied management. Recently, whether computational intelligence techniques can do better in this field has become an interesting question. This research applies such techniques, using the Pearson product-moment correlation coefficient and one-way ANOVA to select key factors and the K-Means clustering algorithm to cluster projects, in order to estimate software project effort. The experimental results show that estimating software project effort with computational intelligence techniques yields more precise and more effective estimates than traditional human experts did.
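The key-factor screening step can be illustrated directly: compute the Pearson product-moment correlation between a candidate factor and recorded effort, and keep factors with strong correlation. The data values below are illustrative.

```python
import math

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient, in [-1, 1]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

team_size = [3, 5, 8, 12]          # candidate factor (illustrative)
effort_pm = [10, 18, 30, 44]       # recorded effort in person-months

r = pearson(team_size, effort_pm)  # near 1.0: a strong key factor
```

Factors surviving this filter (and the ANOVA test) become the features on which K-Means groups similar projects, so a new project is estimated from its cluster's history.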
APPLYING REQUIREMENT BASED COMPLEXITY FOR THE ESTIMATION OF SOFTWARE DEVELOPM... (cscpconf)
Computing software complexity in the requirement analysis phase of the software development life cycle (SDLC) would be an enormous benefit for estimating the development and testing effort of yet-to-be-developed software. A relationship between source code and the difficulty of developing it is also examined in order to estimate the complexity of the proposed software for cost estimation, manpower build-up, and code and developer evaluation. This paper therefore presents a systematic, integrated approach for estimating software development and testing effort on the basis of the improved requirement based complexity (IRBC) of the proposed software, obtained from its software requirement specification (SRS). The IRBC measure serves as the basis for estimating these development activities, enabling developers and practitioners to predict critical information about the intricacies of development. For validation, the proposed measures are compared with various established and prevalent practices proposed in the past. The results validate the claim that the approaches discussed here for estimating software development and testing effort in the early phases of the SDLC are robust, comprehensive, early-alarming, and compare well with other measures proposed in the past.
The analytic hierarchy process (AHP) has been applied in many fields, especially to complex engineering problems and applications. The AHP can structure decision problems and produce mathematically determined judgments built on knowledge and experience, which suggests it should prove useful in agile software development, where complex decisions occur routinely. In this paper, the AHP is used to rank refactoring techniques based on internal code quality attributes. XP encourages applying refactoring where the code smells bad, but refactoring can consume considerable time and effort; to maximize its benefits at lower cost, the AHP is applied to rank the techniques. Ranking the refactoring techniques was found to help the XP team focus on the techniques that most improve the code and the XP development process in general.
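The AHP ranking step can be sketched as follows: three hypothetical refactoring techniques are ranked from a pairwise comparison matrix using the geometric-mean approximation of the principal eigenvector. The judgments below are illustrative on Saaty's 1-9 scale, not the paper's actual comparisons.

```python
import math

def ahp_priorities(matrix):
    """Priority weights via the geometric mean of each row, normalized."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Entry [i][j]: how strongly technique i is preferred over technique j.
# Rows (hypothetical): Extract Method, Rename Variable, Move Method.
judgments = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

weights = ahp_priorities(judgments)   # descending: first technique ranks top
```

A full AHP analysis would also check the consistency ratio of the judgment matrix before trusting the ranking; that step is omitted here for brevity.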
A NEW HYBRID FOR SOFTWARE COST ESTIMATION USING PARTICLE SWARM OPTIMIZATION A... (ieijjournal)
Software Cost Estimation (SCE) is considered one of the most important areas of software engineering, with a well-deserved influence on the processes of cost and effort. The two factors of cost and effort determine the success or failure of software projects: a project completed within a given time and with the planned manpower is a success and yields good profit for project managers. Most SCE techniques use algorithmic models such as the COCOMO family. The COCOMO model cannot estimate close approximations to the actual cost because it is essentially linear, so models are needed that can estimate effort factors fairly and accurately as the number of lines of code (LOC) varies. Metaheuristic algorithms, with their capacity for local and global search, are good candidates for SCE. In this paper, we use a hybrid of Particle Swarm Optimization (PSO) and Differential Evolution (DE) for SCE. Test results on the NASA60 software dataset show that the Mean Magnitude of Relative Error (MMRE) of the hybrid model, compared with the COCOMO model, is reduced to about 9.55%.
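The MMRE criterion used to compare the hybrid against COCOMO is the mean of |actual - estimated| / actual over all projects. The effort figures below are illustrative, not the NASA60 values.

```python
def mmre(actual, estimated):
    """Mean Magnitude of Relative Error over a set of projects."""
    return sum(abs(a - e) / a for a, e in zip(actual, estimated)) / len(actual)

actual_pm = [20.0, 50.0, 80.0]      # actual efforts (illustrative)
estimated_pm = [18.0, 55.0, 76.0]   # model estimates (illustrative)

score = mmre(actual_pm, estimated_pm)   # mean of 0.10, 0.10, 0.05
```

In the PSO/DE setting, MMRE over the calibration set is the fitness function the swarm minimizes when tuning the cost-model coefficients.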
Pareto-Optimal Search-Based Software Engineering (POSBSE): A Literature Survey (Abdel Salam Sayyad)
Paper presented at the 2nd International Workshop on Realizing Artificial Intelligence Synergies in Software Engineering (RAISE’13), San Francisco, USA. May 2013.
AN APPROACH FOR SOFTWARE EFFORT ESTIMATION USING FUZZY NUMBERS AND GENETIC AL... (csandit)
One of the most critical tasks during the software development life cycle is estimating the effort and time involved in developing the software product. Estimation may be performed in many ways, such as expert judgment, algorithmic effort estimation, machine learning, and analogy-based estimation. Analogy-based software effort estimation identifies one or more historical projects similar to the project being developed and then reuses their estimates. Here, analogy-based estimation is integrated with fuzzy numbers to improve the performance of software project effort estimation during the early stages of the development life cycle; fuzzy logic is introduced into the proposed model because of the uncertainty associated with attribute measurement and data availability. However, hardly any historical project is exactly the same as the project being estimated: in most cases, even the most similar project lies at some similarity distance from it, so the reused effort needs to be adjusted. To adjust it, we build an adjustment mechanism whose algorithm derives the optimal adjustment of the reused effort using a genetic algorithm. The proposed model, combining fuzzy logic for early-stage effort estimation with a genetic-algorithm-based adjustment mechanism, may come close to the correct effort estimate.
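The analogy step can be sketched concretely: find the historical project closest to the new one in feature space and reuse its effort. The size-ratio scaling below is a simple stand-in for the paper's GA-derived adjustment, and all project data are illustrative.

```python
import math

# Historical projects: (features [KLOC, team size], effort in person-months).
history = [
    ([10.0, 4.0], 28.0),
    ([25.0, 6.0], 70.0),
    ([40.0, 9.0], 120.0),
]

def estimate(new_features):
    """Reuse the nearest neighbour's effort, adjusted by the size ratio."""
    feats, effort = min(history,
                        key=lambda p: math.dist(p[0], new_features))
    scale = new_features[0] / feats[0]   # size-ratio adjustment (assumption)
    return effort * scale

e = estimate([12.0, 4.0])   # nearest analogue is the 10-KLOC project
```

In the paper, the fixed `scale` rule is replaced by an adjustment function whose parameters a GA tunes to minimize estimation error over the historical set, and the crisp features are replaced by fuzzy numbers.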
A DECISION SUPPORT SYSTEM FOR ESTIMATING COST OF SOFTWARE PROJECTS USING A HY...ijfcstjournal
One of the major challenges in software engineering today is software cost estimation. It refers to estimating the cost of all activities, including software development, design, supervision, maintenance, and so on. Accurate cost estimation of software projects allows the internal and external processes, staff work, efforts, and overheads to be coordinated with one another. In managing software projects, estimation must be taken into account so as to reduce costs, timing, and possible risks and to avoid project failure. In this paper, a decision-support system using a combination of a multi-layer artificial neural network and a decision tree is proposed to estimate the cost of software projects. In the model included in the proposed system, the normalization of factors, which is vital in estimating efforts and costs, is carried out using a C4.5 decision tree. Moreover, training and testing of the factors are performed by a multi-layer artificial neural network, and the most optimal values are allocated to them. The experimental results and evaluations on the NASA60 dataset show that the proposed system yields a lower total average relative error than the COCOMO model.
A MODEL TO COMPARE THE DEGREE OF REFACTORING OPPORTUNITIES OF THREE PROJECTS ...acijjournal
Refactoring is applied to software artifacts so as to improve their internal structure while preserving their external behavior. Refactoring is an uncertain process, and it is difficult to assign units of measurement to it. The amount of refactoring that can be applied to source code depends on the skills of the developer. In this research, we have perceived refactoring as a quantified object on an ordinal scale of measurement. We have proposed a model for determining the degree of refactoring opportunities in given source code. The model is applied to three projects collected from a company. UML diagrams are drawn for each project. The values of the source-code metrics that are useful in determining the quality of code are calculated for each UML diagram of the projects. Based on the nominal values of the metrics, each relevant UML diagram is represented on an ordinal scale. A machine learning tool, Weka, is used to analyze the dataset, imported in the form of an ARFF file, produced from the three projects.
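The ordinal-scale treatment of refactoring opportunities described above can be sketched as simple thresholding of a code metric. The metric (weighted methods per class), the thresholds, and the level names below are illustrative assumptions, not the paper's actual model:

```python
# Ordinal levels of refactoring opportunity, lowest to highest.
LEVELS = ["low", "medium", "high"]

def ordinal_level(wmc, thresholds=(10, 25)):
    """Map a weighted-methods-per-class (WMC) value onto the ordinal scale."""
    for level, upper in zip(LEVELS, thresholds):
        if wmc <= upper:
            return level
    return LEVELS[-1]

def rate_project(metrics):
    """Ordinal rating per class, plus the dominant level for the whole project."""
    ratings = {name: ordinal_level(v) for name, v in metrics.items()}
    values = list(ratings.values())
    dominant = max(set(values), key=values.count)
    return ratings, dominant

# Hypothetical WMC values for four classes of one project.
ratings, dominant = rate_project({"Order": 7, "Invoice": 18, "Report": 40, "Audit": 30})
print(ratings, dominant)
```

A project-level ordinal rating of this kind is what would then be exported as an ARFF dataset for analysis in Weka.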
Review on Algorithmic and Non Algorithmic Software Cost Estimation Techniquesijtsrd
Effective software cost estimation is among the most challenging and important activities in software development. Developers want a simple and accurate method of effort estimation. Estimating cost before work begins is a prediction, and predictions are not always accurate. Software effort estimation is a very critical task in software engineering, and a suitable estimation technique is crucial for controlling quality and efficiency. This paper reviews various available software effort estimation methods, focusing mainly on algorithmic and non-algorithmic models. These existing methods for software cost estimation are illustrated and their aspects discussed. No single technique is best for all situations, so a careful comparison of the results of several approaches is most likely to produce realistic estimates. The paper provides a detailed overview of existing software cost estimation models and techniques, presents the strengths and weaknesses of various cost estimation methods, and highlights some of the relevant causes of inaccurate estimation. Pa Pa Win | War War Myint | Hlaing Phyu Phyu Mon | Seint Wint Thu, "Review on Algorithmic and Non-Algorithmic Software Cost Estimation Techniques", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3, Issue-5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd26511.pdf Paper URL: https://www.ijtsrd.com/engineering/-/26511/review-on-algorithmic-and-non-algorithmic-software-cost-estimation-techniques/pa-pa-win
A Novel Optimization towards Higher Reliability in Predictive Modelling towar...IJECEIAES
Although the area of software engineering has made remarkable progress in the last decade, less attention has been paid to the concept of code reusability. Code reusability is a subset of software reusability, which is one of the signature topics in software engineering. We review existing systems and find that no standard research approach toward code reusability has been introduced in the last decade. Hence, this paper introduces a predictive framework for optimizing the performance of code reusability. For this purpose, we introduce a case study of a near-real-time challenge and involve it in our modelling. We apply a neural network and the damped least-squares algorithm to perform optimization, with the sole target of computing and ensuring the highest possible reliability. The study outcome of our model exhibits higher reliability and better computational response time.
TOWARDS PREDICTING SOFTWARE DEFECTS WITH CLUSTERING TECHNIQUESijaia
The purpose of software defect prediction is to improve the quality of a software project by building a predictive model that decides whether a software module is or is not fault prone. In recent years, much research into using machine learning techniques for this topic has been performed. Our aim was to evaluate the performance of clustering techniques with feature selection schemes to address the problem of software defect prediction. We analysed the National Aeronautics and Space Administration (NASA) dataset benchmarks using three clustering algorithms: (1) Farthest First, (2) X-Means, and (3) self-organizing map (SOM). In order to evaluate different feature selection algorithms, this article presents a comparative analysis of software defect prediction based on Bat, Cuckoo, Grey Wolf Optimizer (GWO), and particle swarm optimizer (PSO). The results obtained with the proposed clustering models enabled us to build an efficient predictive model with a satisfactory detection rate and an acceptable number of features.
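Of the three clustering algorithms named, Farthest First is the simplest to sketch: centers are chosen greedily as the points farthest from the centers picked so far, then each module is labeled by its nearest center. The module metric vectors below are invented for illustration:

```python
import math

def dist(a, b):
    """Euclidean distance between two module metric vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def farthest_first_centers(points, k):
    """Pick k centers: start from the first point, then repeatedly take the
    point farthest from its nearest already-chosen center."""
    centers = [points[0]]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(dist(p, c) for c in centers)))
    return centers

def assign(points, centers):
    """Label each point (e.g. a module's metric vector) by its nearest center."""
    return [min(range(len(centers)), key=lambda i: dist(p, centers[i])) for p in points]

# Hypothetical module metric vectors: (lines of code, cyclomatic complexity).
modules = [(120, 4), (130, 5), (900, 30), (950, 28), (125, 6)]
centers = farthest_first_centers(modules, 2)
print(assign(modules, centers))  # one cluster per "size regime" of module
```

In a defect-prediction setting the resulting clusters would then be inspected for their fault-proneness rates; the feature-selection step (Bat, Cuckoo, GWO, PSO) would decide which metric dimensions feed `dist`.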
Software Defect Trend Forecasting In Open Source Projects using A Univariate ...CSCJournals
Our objective in this research is to provide a framework that gives project managers, business owners, and developers an effective way to forecast the trend in software defects within a software project in real time. With a mechanism for forecasting defects, these stakeholders can provide the necessary resources at the right time to remove defects before they accumulate to the point of software failure. In our research we show not only general trends in several open-source projects but also trends in daily, monthly, and yearly activity. Our research shows that this forecasting method can be used up to six months out with an MSE of only 0.019. In this paper, we present our technique and methodologies for developing the inputs for the proposed model and the results of testing on seven open-source projects. Further, we discuss the prediction models, their performance, and the implementation using the FBProphet framework and the ARIMA model.
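A minimal sketch of the kind of trend forecasting and MSE evaluation described above, using a naive moving-average baseline in place of FBProphet or ARIMA; the defect counts are hypothetical:

```python
def mse(actual, predicted):
    """Mean squared error between observed and forecast defect counts."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def moving_average_forecast(series, window=3):
    """Naive baseline: forecast each point as the mean of the previous `window` values."""
    return [sum(series[i - window:i]) / window for i in range(window, len(series))]

# Hypothetical monthly defect counts for an open-source project.
defects = [5, 7, 6, 8, 9, 11, 10, 12]
preds = moving_average_forecast(defects)
print(round(mse(defects[3:], preds), 3))
```

A real evaluation would compare such a baseline's MSE against the ARIMA/Prophet models on the same held-out horizon; the baseline gives the number a model must beat.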
Function Point Software Cost Estimates using Neuro-Fuzzy techniqueijceronline
Software estimation accuracy is among the greatest challenges for software developers. Because a neuro-fuzzy system can approximate non-linear functions with high precision, it is used here as a soft computing approach to generate a model by formulating relationships based on its training. The approach presented in this paper is independent of the nature and type of estimation. In this paper, Function Points are used as the algorithmic model, and an attempt is made to validate the soundness of the neuro-fuzzy technique using ISBSG and NASA project data.
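For readers unfamiliar with the Function Point model used here as the algorithmic baseline, a minimal sketch of the standard IFPUG-style calculation follows; the component counts and characteristic ratings are invented:

```python
# Standard IFPUG average-complexity weights (a common simplification).
WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def unadjusted_fp(counts):
    """Unadjusted function points from component counts at average complexity."""
    return sum(WEIGHTS[k] * n for k, n in counts.items())

def adjusted_fp(counts, gsc_ratings):
    """Apply the value adjustment factor: VAF = 0.65 + 0.01 * sum of the
    fourteen general system characteristics, each rated 0-5."""
    vaf = 0.65 + 0.01 * sum(gsc_ratings)
    return unadjusted_fp(counts) * vaf

# Hypothetical system: inputs, outputs, inquiries, internal and external files.
counts = {"EI": 10, "EO": 7, "EQ": 5, "ILF": 4, "EIF": 2}
gsc = [3] * 14  # all fourteen characteristics rated as average influence
print(adjusted_fp(counts, gsc))
```

The neuro-fuzzy layer in the paper learns the mapping from such function-point inputs to effort, rather than using fixed productivity tables.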
Insights on Research Techniques towards Cost Estimation in Software Design IJECEIAES
Software cost estimation is one of the most challenging tasks in project management, needed to ensure smooth development operation and target achievement. Various standard tools and techniques for cost estimation have evolved and are practiced in industry at present. However, the overall effectiveness of such techniques had never been investigated to date. This paper begins its contribution by presenting taxonomies of conventional cost-estimation techniques and then investigates the research trends in the problems they most frequently address. The paper also reviews the existing techniques in a well-structured manner in order to highlight the problems addressed, the techniques used, the advantages associated with them, and the limitations explored in the literature. Finally, we briefly present the identified open research issues as an added contribution of this manuscript.
Automatically Estimating Software Effort and Cost using Computing Intelligenc...cscpconf
In the IT industry, precisely estimating the effort, development cost, and schedule of each software project counts for much to a software company, so precise estimation of manpower is becoming ever more important. In the past, IT companies estimated the work effort of manpower through human experts using statistical methods; however, the outcomes often failed to satisfy management. Recently, whether computing intelligence techniques can do better in this field has become an interesting topic. This research uses computing intelligence techniques such as the Pearson product-moment correlation coefficient and one-way ANOVA to select key factors, and the K-Means clustering algorithm to cluster projects, in order to estimate software project effort. The experimental results show that using computing intelligence techniques to estimate software project effort yields more precise and more effective estimates than traditional expert judgment did.
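The key-factor selection step can be illustrated with the Pearson product-moment correlation coefficient the abstract names: factors that correlate strongly with effort are kept, weak ones dropped. The project records below are hypothetical:

```python
import math

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient between two factor vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical project records: candidate factors vs. measured effort.
team_size    = [3, 5, 8, 12, 15]
office_floor = [2, 9, 1, 7, 4]        # an obviously irrelevant factor
effort       = [20, 34, 55, 81, 102]  # person-months

print(round(pearson(team_size, effort), 3))     # strong candidate, keep
print(round(pearson(office_floor, effort), 3))  # weak candidate, drop
```

Factors that survive this filter (plus the ANOVA check) would then feed the K-Means project clustering described in the abstract.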
APPLYING REQUIREMENT BASED COMPLEXITY FOR THE ESTIMATION OF SOFTWARE DEVELOPM...cscpconf
Computing software complexity in the requirement analysis phase of the software development life cycle (SDLC) would be an enormous benefit for estimating the required development and testing effort for yet-to-be-developed software. A relationship between source code and the difficulty of developing it is also examined in order to estimate the complexity of the proposed software for cost estimation, manpower build-up, and code and developer evaluation. Therefore, this paper presents a systematic and integrated approach for the estimation of software development and testing effort on the basis of the improved requirement based complexity (IRBC) of the proposed software. The IRBC measure serves as the basis for estimating these software development activities, enabling developers and practitioners to predict critical information about software development intricacies obtained from the software requirement specification (SRS) of the proposed software. For validation purposes, the proposed measures are categorically compared with various established and prevalent practices proposed in the past. Finally, the results obtained validate the claim that the approach discussed in this paper for estimating software development and testing effort in the early phases of the SDLC is robust, comprehensive, gives early warning, and compares well with other measures proposed in the past.
The analytic hierarchy process (AHP) has been applied in many fields, especially to complex engineering problems and applications. The AHP is capable of structuring decision problems and finding mathematically determined judgments built on knowledge and experience. This suggests that AHP should prove useful in agile software development, where complex decisions occur routinely. In this paper, the AHP is used to rank refactoring techniques based on internal code quality attributes. XP encourages applying refactoring where the code smells bad; however, refactoring may consume considerable time and effort. So, to maximize the benefits of refactoring in less time and with less effort, AHP has been applied for this purpose. It was found that ranking the refactoring techniques helped the XP team to focus on the techniques that improve the code and the XP development process in general.
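A minimal sketch of how AHP might rank refactoring techniques from a pairwise comparison matrix, using the common geometric-mean approximation of the priority vector; the judgments below are invented:

```python
import math

# Hypothetical pairwise comparison of three refactoring techniques on a
# Saaty 1-9 scale: matrix[i][j] = how strongly technique i is preferred over j.
matrix = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]

def ahp_priorities(m):
    """Approximate the AHP priority vector by the normalized geometric mean of rows."""
    gms = [math.prod(row) ** (1 / len(row)) for row in m]
    total = sum(gms)
    return [g / total for g in gms]

weights = ahp_priorities(matrix)
print([round(w, 3) for w in weights])  # highest weight = technique to apply first
```

A full AHP application would also compute the consistency ratio of the judgment matrix and aggregate rankings across several code quality attributes; this sketch covers only the priority-vector step.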
A NEW HYBRID FOR SOFTWARE COST ESTIMATION USING PARTICLE SWARM OPTIMIZATION A...ieijjournal
Software Cost Estimation (SCE) is considered one of the most important areas in software engineering, with a well-deserved influence on the processes of cost and effort. The two factors of cost and effort in software projects determine the success or failure of a project: a project completed within the planned time and manpower is a successful one and will be profitable for project managers. Most SCE techniques have used algorithmic models such as the COCOMO models. The COCOMO model is not capable of estimating close approximations to the actual cost because it is linear in form; models should therefore be adapted so that, together with the number of Lines of Code (LOC), effort factors can be estimated fairly and accurately. Metaheuristic algorithms can be a good fit for SCE due to their ability to perform local and global search. In this paper, we have used a hybrid of Particle Swarm Optimization (PSO) and Differential Evolution (DE) for SCE. Test results on the NASA60 software dataset show that the Mean Magnitude of Relative Error (MMRE) of the hybrid model is reduced by about 9.55% in comparison with the COCOMO model.
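The COCOMO baseline and the MMRE criterion used in comparisons like this one can be sketched as follows, with Basic COCOMO organic-mode coefficients and invented project data:

```python
def cocomo_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort (person-months) for an organic-mode project."""
    return a * kloc ** b

def mmre(actuals, estimates):
    """Mean Magnitude of Relative Error, the criterion such papers report."""
    return sum(abs(a - e) / a for a, e in zip(actuals, estimates)) / len(actuals)

# Hypothetical (KLOC, actual effort) pairs in the style of the NASA datasets.
projects = [(10, 26), (25, 70), (50, 130)]
ests = [cocomo_effort(k) for k, _ in projects]
print(round(mmre([a for _, a in projects], ests), 3))
```

A metaheuristic hybrid such as PSO+DE would search over the coefficients `a` and `b` (and any cost drivers) to minimize exactly this MMRE value on the training projects.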
Pareto-Optimal Search-Based Software Engineering (POSBSE): A Literature SurveyAbdel Salam Sayyad
Paper presented at the 2nd International Workshop on Realizing Artificial Intelligence Synergies in Software Engineering (RAISE’13), San Francisco, USA. May 2013.
A Defect Prediction Model for Software Product based on ANFISIJSRD
Artificial intelligence techniques are becoming involved, day by day, in all classification and prediction processes, such as environmental monitoring, stock exchange analysis, biomedical diagnosis, and software engineering. However, the challenge of selecting training criteria for the design of artificial intelligence prediction models has yet to be simplified. This work focuses on developing a defect prediction mechanism using the KC1 software metric data. We have taken a subtractive clustering approach for generating a fuzzy inference system (FIS). The FIS rules are generated at different radii of influence of the input attribute vectors, and the developed rules are further refined by the ANFIS technique to predict the number of defects in a software project using a fuzzy logic system.
Generation of Search Based Test Data on Acceptability Testing Principleiosrjce
A new model for software cost estimation ijfcstjournal
Accurate and realistic estimation has always been considered a great challenge in the software industry. Software Cost Estimation (SCE) is the standard practice used to manage software projects. The estimate determined in the initial stages of a project underpins the planning of its other activities. In fact, estimation is confronted with a number of uncertainties and barriers, and assessing previous projects is essential to solving this problem. Several models have been developed for the analysis of software projects. The classical reference method is the COCOMO model, but other methods such as Function Points (FP) and Lines of Code (LOC) are also applied; meanwhile, experts' opinions matter in this regard. In recent years, the growth and combination of highly accurate meta-heuristic algorithms have brought about great achievements in software engineering. Meta-heuristic algorithms, which can analyze data across multiple dimensions and identify the optimum solution among them, are analytical tools for data analysis. In this paper, we have used the Harmony Search (HS) algorithm for SCE. The proposed model has been assessed on a collection of 60 standard projects from the NASA60 dataset. The experimental results show that the HS algorithm is a good way to determine the weights of the similarity measure factors of software effort and to reduce the MRE error.
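A minimal one-variable Harmony Search, of the kind the abstract applies to SCE, can be sketched as calibrating a COCOMO-style coefficient against historical data. The calibration data, the effort form, and all HS parameters below are illustrative assumptions:

```python
import random

random.seed(7)

# Hypothetical calibration data: (KLOC, actual effort in person-months).
DATA = [(10, 30), (25, 78), (50, 160)]

def mre(a):
    """Mean relative error of effort = a * KLOC**1.05 against the data."""
    return sum(abs(a * k ** 1.05 - e) / e for k, e in DATA) / len(DATA)

def harmony_search(iters=300, hms=8, hmcr=0.9, par=0.3, bw=0.1, lo=0.5, hi=5.0):
    """Minimal one-variable Harmony Search minimizing mre()."""
    memory = [random.uniform(lo, hi) for _ in range(hms)]  # harmony memory
    for _ in range(iters):
        if random.random() < hmcr:            # consider a value from memory
            x = random.choice(memory)
            if random.random() < par:         # pitch adjustment
                x += random.uniform(-bw, bw)
        else:                                 # random improvisation
            x = random.uniform(lo, hi)
        x = min(max(x, lo), hi)
        worst = max(range(hms), key=lambda i: mre(memory[i]))
        if mre(x) < mre(memory[worst]):       # replace the worst harmony
            memory[worst] = x
    return min(memory, key=mre)

best = harmony_search()
print(round(best, 2), round(mre(best), 3))
```

The paper's model searches many similarity-measure weights at once rather than a single coefficient, but the memory-consideration / pitch-adjustment / replacement loop is the same.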
Productivity Factors in Software Development for PC PlatformIJERA Editor
Identifying the most relevant factors influencing project performance is essential for implementing business strategies by selecting and adjusting proper improvement activities. The two major classification algorithms CRT and ANN, recommended by the Auto Classifier tool in SPSS Modeler, were used to determine the most important variables (attributes) of software development in the PC environment. While their classification accuracies for productive versus non-productive cases are relatively close (72% vs. 69%), their rankings of the important variables differ. CRT ranks the programming language as the most important variable and Function Points as the least important. On the other hand, ANN ranks Function Points as the most important, followed by team size and programming language.
Load Distribution Composite Design Pattern for Genetic Algorithm-Based Autono...ijsc
Current autonomic computing systems are ad hoc solutions designed and implemented from scratch. When designing software, in most cases two or more patterns must be composed to solve a bigger problem. A composite design pattern shows a synergy that makes the composition more than just the sum of its parts, which leads to ready-made software architectures. As far as we know, there are no studies on the composition of design patterns for the autonomic computing domain. In this paper we propose a pattern-oriented software architecture for self-optimization in an autonomic computing system, using design pattern composition and multi-objective evolutionary algorithms, that software designers and/or programmers can exploit to drive their work. The main objective of the system is to reduce the load on the server by distributing the population to clients. We used the Case Based Reasoning, Database Access, and Master-Slave design patterns. We evaluate the effectiveness of our architecture with and without design pattern composition. The use of composite design patterns in the architecture and quantitative measurements are presented. A simple UML class diagram is used to describe the architecture.
LOAD DISTRIBUTION COMPOSITE DESIGN PATTERN FOR GENETIC ALGORITHM-BASED AUTONO...ijsc
Current autonomic computing systems are ad hoc solutions that are designed and implemented from the
scratch. When designing software, in most cases two or more patterns are to be composed to solve a bigger
problem. A composite design patterns shows a synergy that makes the composition more than just the sum
of its parts which leads to ready-made software architectures. As far as we know, there are no studies on
composition of design patterns for autonomic computing domain. In this paper we propose pattern-oriented
software architecture for self-optimization in autonomic computing system using design patterns
composition and multi objective evolutionary algorithms that software designers and/or programmers can
exploit to drive their work. Main objective of the system is to reduce the load in the server by distributing
the population to clients. We used Case Based Reasoning, Database Access, and Master Slave design
patterns. We evaluate the effectiveness of our architecture with and without design patterns compositions.
The use of composite design patterns in the architecture and quantitative measurements are presented. A
simple UML class diagram is used to describe the architecture.
A NEW HYBRID FOR SOFTWARE COST ESTIMATION USING PARTICLE SWARM OPTIMIZATION A...ieijjournal1
Software Cost Estimation (SCE) is considered one of the most important areas of software engineering, with a well-deserved influence on the processes of cost and effort management. The two factors of cost and effort determine the success or failure of software projects: a project that is completed within the planned time and manpower is a successful one and yields good profit for project managers. Most SCE techniques use algorithmic models such as the COCOMO model. The COCOMO model is not capable of producing close approximations to the actual cost because it is linear in form, so models are needed that can estimate effort fairly and accurately from the number of Lines of Code (LOC). Metaheuristic algorithms are good candidates for SCE because of their ability to perform both local and global search. In this paper, we use a hybrid of Particle Swarm Optimization (PSO) and Differential Evolution (DE) for SCE. Test results on the NASA60 software dataset show that the Mean Magnitude of Relative Error (MMRE) of the hybrid model, in comparison with the COCOMO model, is reduced to about 9.55%.
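The two quantities this abstract relies on can be made concrete. The basic COCOMO effort equation is Effort = a · KLOC^b (the coefficients below are the standard organic-mode constants, not values from the paper), and MMRE is the mean of |actual − estimated| / actual over the evaluated projects. The sample data are invented for illustration only:

```python
def cocomo_basic_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort in person-months (organic-mode constants)."""
    return a * kloc ** b

def mmre(actual, estimated):
    """Mean Magnitude of Relative Error over paired effort values."""
    return sum(abs(a - e) / a for a, e in zip(actual, estimated)) / len(actual)

# Hypothetical project data: actual effort (person-months) and size (KLOC).
actual = [50.0, 120.0, 30.0]
kloc = [20.0, 45.0, 12.0]
estimated = [cocomo_basic_effort(k) for k in kloc]
print(round(mmre(actual, estimated), 3))
```

A metaheuristic such as the PSO/DE hybrid would tune parameters like `a` and `b` (or richer effort multipliers) so that MMRE over a historical dataset is minimized.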
Software Quality Engineering is a broad area concerned with various approaches to improving software quality. A quality model proves successful when it satisfies the requirements of both developers and consumers. This research focuses on establishing semantics among the existing software quality engineering techniques and thereby designing a framework for rating software quality.
This paper presents a set of methods that use a genetic algorithm for automatic test-data generation in software testing. Over the years, researchers have proposed many methods for generating test data, each with different drawbacks. In this paper, we present various Genetic Algorithm (GA) based test methods with different parameters to automate structure-oriented test data generation based on the internal program structure. The discovered factors are used in evaluating the fitness function of the genetic algorithm for selecting the best possible test method. These methods take test populations as input and then evaluate the test cases for the program. This integration helps improve the overall performance of the genetic algorithm in search-space exploration and exploitation, with a better convergence rate.
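A minimal sketch of GA-based test-data generation follows. The function under test, the target branch, and the GA parameters are all invented for illustration; the fitness is a simple branch distance (0 when the target branch is taken), which is one common choice in structural test-data generation:

```python
import random

def under_test(x):
    """Hypothetical function under test; the hard-to-reach branch is x == 3200."""
    return "target" if x == 3200 else "other"

def fitness(x, goal=3200):
    """Branch distance: 0 when the target branch would be taken."""
    return abs(x - goal)

def ga_search(pop_size=20, generations=200, lo=0, hi=10000, seed=1):
    """Evolve integer inputs toward covering the target branch."""
    rng = random.Random(seed)
    pop = [rng.randint(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        if fitness(pop[0]) == 0:            # target branch covered
            return pop[0]
        parents = pop[: pop_size // 2]      # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) // 2            # arithmetic crossover
            if rng.random() < 0.3:          # mutation: small random step
                child = min(hi, max(lo, child + rng.randint(-50, 50)))
            children.append(child)
        pop = parents + children
    pop.sort(key=fitness)
    return pop[0]

best = ga_search()
print(best, fitness(best))
```

The fitness function guides the search toward inputs that exercise the desired branch, which is the role the abstract assigns to the discovered factors in selecting test methods.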
International Refereed Journal of Engineering and Science (IRJES)
ISSN (Online) 2319-183X, (Print) 2319-1821
Volume 6, Issue 2 (February 2017), PP. 63-69
www.irjes.com 63 | Page

Application of Genetic Algorithm in Software Engineering: A Review

Reena1, Pradeep Kumar Bhatia1
1Department of Computer Science & Engineering, Guru Jambheshwar University of Science & Technology, Hisar (Haryana)
Abstract. Software engineering is a comparatively new and constantly changing field. The challenge of meeting strict project schedules with high-quality software requires that software engineering be automated to a large extent and that human intervention be reduced to an optimum level. To achieve this goal, researchers have explored the potential of machine learning approaches, which are adaptable and capable of learning. In this paper, we look at how genetic algorithms (GAs) can be used to build tools for software development and maintenance tasks.
Keywords: Genetic Algorithm, Software Testing, Component Repository.
I. INTRODUCTION
Modern software is becoming more expensive to build and maintain. Sound software development management and software quality goals are necessary, but not sufficient for the needs of today's marketplace: shorter cycle times, achieved with fewer resources, are also in demand [2]. The challenge of developing software systems in such a fast-moving scenario gives rise to a number of demanding situations. The first is that identifying software components is a crucial task in software development. The second is minimizing the number of test cases developed for testing purposes. To answer these challenges, a number of approaches can be used; one such approach is the evolutionary algorithm [1]. Using evolutionary algorithms, software can be developed, modified and maintained at the specification level, and high-quality software can be produced automatically in a shorter period [3]. This evolutionary approach will enable software engineering to become a discipline that captures and automates currently undocumented domain and design knowledge [4].
To realize its full potential, tools and methodologies are needed for the various tasks inherent to the evolutionary approach. In this paper, we look at how genetic algorithms can be used to build tools for software development and maintenance tasks. Genetic algorithms are robust, and are commonly used to generate high-quality solutions to optimization and search problems by relying on bio-inspired operators such as mutation, crossover and selection [1]. We survey the existing work on the application of GAs in software engineering and provide directions for future research in this area.
II. GENETIC ALGORITHM (GA) METHODOLOGY
Genetic algorithms (Goldberg, 1989) became popular through the work of John Holland [5] in the early 1970s, particularly his book Adaptation in Natural and Artificial Systems (1975). Genetic Algorithms (GAs) are adaptive heuristic search techniques based on the evolutionary ideas of natural and genetic selection [6]. A GA represents an intelligent exploitation of a random search within a defined search space to solve a problem. GAs rest on the principles of evolution via natural selection, employing a population of individuals that undergo selection in the presence of variation-inducing operators such as mutation and recombination. GAs are best used when the search space is large, complex or poorly understood, and when domain knowledge is scarce or expert knowledge is difficult to encode. GAs are also useful when there is a need to narrow the search space, or when traditional search methods fail [5, 6].
The algorithm for a GA is as follows [6]:
Initialize(population)
Evaluate(population)
While (stopping condition not satisfied) do
{
    Selection(population)
    Crossover(population)
    Mutate(population)
    Evaluate(population)
}
The algorithm repeats until the population has evolved to form a solution to the problem, or until a maximum number of iterations has taken place (suggesting that a solution is not going to be found with the resources available). Figure 1 depicts the steps involved in a genetic algorithm.
Figure 1. Various Steps of a Genetic Algorithm
1. Generate a random population of n chromosomes.
2. Evaluate the fitness value of each chromosome.
3. Create a new population by applying genetic operators such as selection, crossover and mutation.
4. Replace the old generation with the new population.
5. If the stopping condition is satisfied, stop and return the solution.
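The loop above can be sketched in a few lines of Python. This is a minimal illustration, not a tool from the surveyed literature: as a stand-in fitness function it maximizes the number of 1-bits in a chromosome (the classic "OneMax" toy problem), and the population size, rates and chromosome length are arbitrary illustrative choices.

```python
import random

CHROM_LEN, POP_SIZE, GENERATIONS = 20, 30, 100
CROSSOVER_RATE, MUTATION_RATE = 0.8, 0.02

def fitness(chrom):
    return sum(chrom)                      # step 2: evaluate each chromosome

def select(pop, k=3):
    return max(random.sample(pop, k), key=fitness)   # tournament selection

def crossover(a, b):
    if random.random() < CROSSOVER_RATE:             # single-point crossover
        cut = random.randint(1, CHROM_LEN - 1)
        return a[:cut] + b[cut:]
    return a[:]

def mutate(chrom):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in chrom]

def run_ga():
    # step 1: random initial population of n chromosomes
    pop = [[random.randint(0, 1) for _ in range(CHROM_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        best = max(pop, key=fitness)
        if fitness(best) == CHROM_LEN:     # step 5: stopping condition satisfied
            return best
        # steps 3-4: build the new generation and replace the old one
        pop = [mutate(crossover(select(pop), select(pop)))
               for _ in range(POP_SIZE)]
    return max(pop, key=fitness)
```

Swapping in a problem-specific encoding and fitness function turns this skeleton into any of the software engineering applications discussed in the following sections.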
III. SOFTWARE DEVELOPMENT LIFE CYCLE (SDLC) AND APPLICATIONS OF GAS IN
SOFTWARE ENGINEERING
A variety of life cycle models has been proposed, based on the tasks involved in developing software [8]. Figure 2 shows the SDLC/CBSD phases and the applications of GAs in software engineering.
4 Applications of GAs in Software Engineering
Several areas in software development have already witnessed the use of GAs. In this section, we take a look at some reported results of applying GAs in the field of software engineering. The list is by no means complete; it serves only as an indication that people have realized the potential of GAs and have begun to reap the benefits of applying them in software development.
4.1 Software Project Effort Estimation
Software cost estimation is one of the most challenging issues in software project development. To produce accurate estimates, many models have been developed, but no model has proven efficient under the uncertainty of project development. Most of these models are based on size measures such as Lines of Code (LOC) and Function Points (FP), and size estimation accuracy directly affects cost estimation accuracy. The COCOMO model is the most prominent model for software cost estimation. Today's effort estimation models draw on soft computing techniques, such as genetic algorithms, fuzzy logic and neural networks, to find accurate predictions of software development effort and time. Genetic algorithms can provide a significant enhancement in accuracy and have the potential to be a valuable additional tool for software effort estimation in large projects. Genetic algorithms have been used for difficult numerical optimization problems, and also to solve system identification, signal processing and path searching problems [26].
Brajesh et al. proposed a model to estimate software effort for projects sponsored by NASA using a binary genetic algorithm. A modified version of the COCOMO model was provided to consider the effect of methodology on effort estimation. The performance of the developed model was tested on NASA software project data, and the developed models were able to provide good estimation capabilities [27].
Vishali et al. tested their proposed genetic algorithm and compared the results with those obtained using the current COCOMO model coefficients. The experiments show that in most cases the results obtained using the coefficients optimized by the proposed algorithm are close to those obtained using the current coefficients. Comparing the organic and semi-detached COCOMO model modes, the coefficients optimized by the GA and ACO in the organic mode produce better results than the current COCOMO model coefficients [28].
Astha et al. tested their proposed genetic algorithm on the TURKISH and INDUSTRY datasets and compared the results with those obtained using the current COCOMO II PA model coefficients. The proposed model provides better estimation capabilities. Comparing the results, it can be stated that, given appropriate statistical data describing the software development projects, GA-based coefficients produce better results than the current COCOMO II PA model coefficients. The results also show that in most cases the effort obtained using the optimized coefficients is close to that obtained using the current coefficients, and in most cases it lies below the real effort values [29]. Isa et al. proposed a hybrid model based on GA and ACO for optimizing the weights of the effective factors in the NASA software project dataset. The results show that the proposed model is more efficient than the COCOMO model in software project cost estimation, with a lower Magnitude of Relative Error (MRE) than COCOMO [30].
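The GA-tuned COCOMO coefficient fitting surveyed above can be sketched as follows. This is an illustrative reconstruction, not code from the cited papers: the (KLOC, person-month) pairs below are invented, whereas the surveyed work fits real NASA, TURKISH and INDUSTRY project data, and the GA settings are arbitrary.

```python
import random

# Fit COCOMO-style coefficients a and b in: effort = a * KLOC ** b
PROJECTS = [(2.0, 5.0), (10.0, 26.0), (46.0, 96.0), (90.0, 210.0)]

def mean_mre(a, b):
    # Mean Magnitude of Relative Error, the accuracy criterion cited above.
    return sum(abs(act - a * kloc ** b) / act
               for kloc, act in PROJECTS) / len(PROJECTS)

def tune(pop_size=40, generations=200):
    pop = [(random.uniform(0.5, 5.0), random.uniform(0.8, 1.5))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: mean_mre(*p))       # rank by estimation error
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            (a1, b1), (a2, b2) = random.sample(parents, 2)
            a = (a1 + a2) / 2 + random.gauss(0, 0.05)   # arithmetic crossover
            b = (b1 + b2) / 2 + random.gauss(0, 0.02)   # plus Gaussian mutation
            children.append((abs(a), abs(b)))
        pop = parents + children
    return min(pop, key=lambda p: mean_mre(*p))
```

Real-valued chromosomes with arithmetic crossover are one common encoding choice here; binary-encoded coefficients, as in the NASA study above, work equally well.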
4.2 Software Metrics (Design and Coding)
Software metrics are numeric values related to software development. Metrics have traditionally been constructed through the definition of an equation, but this technique is limited by the requirement that the interrelationships among all the parameters be fully understood. The aim of this line of research is to find alternative methods for generating software metrics; deriving a metric using a GA has several advantages [12].
R Vankudoth et al. work on the selection of system software components, an important design-stage decision with a significant impact on various system quality attributes. To determine system software components based on architectural style selection, the software functionalities have to be distributed among the components. The authors present a genetic-algorithm-based method, built on use cases, to identify software components and their responsibilities. To select a proper configuration, the proposed method is first run on a number of software systems using different GA settings, the results are verified by experts, and the best setting is recommended. Sensitivity analysis is used to evaluate the effect of features on the accuracy of the GA; finally, the appropriate number of software components is determined using metrics of the components' interior cohesion and the coupling among them [31].
CBSD is used to reduce software development time by bringing systems to market as early as possible. The CBSD process consists of four major processes: component qualification, component adaptation, component composition and component update [10]. To realize the benefits that CBSD brings, it is imperative that the right software components are selected for a project, because selecting inappropriate components may increase the time and cost of software development, which CBSD aims to reduce [11, 12]. Component selection is a major challenge for CBS developers, owing to the multiplicity of similar components on the market with varying capabilities. Although several approaches and criteria have been proposed for component selection, there is no well-defined procedure for selecting optimized components. K Vijayalakshmi et al. have given an automated approach based on genetic algorithms that selects software components considering both functional and non-functional requirements to find the best combination of components [9], [10], [11], [12].
Seyed Mohammed et al. propose a novel GA-based algorithm, called SCI-GA (Software Component Identification using Genetic Algorithm), to identify components from analysis models. SCI-GA uses software cohesion, coupling and complexity measurements to define its fitness function. For performance evaluation, SCI-GA was evaluated on three real-world cases. The results show that SCI-GA can identify correct suboptimal software components, and performs far better than alternative heuristics such as k-means and FCA-based methods [1].
Kwong et al. formulated an optimization model for software component selection in CBSS development. The model has two objectives: maximizing the functional performance of the CBSS, and maximizing the cohesion while minimizing the coupling of the software modules. A genetic algorithm is used to solve the optimization model and determine the optimal selection of software components for CBSS development. The methodology is illustrated with an example of developing a financial system for small- and medium-size enterprises [10].
Saxena et al. attempt to throw light on one of the major issues of component-based software engineering, namely component selection. A genetic-algorithm-based approach is used for component selection to minimize the gap between the selected components [11].
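The component-selection formulations above can be illustrated with a small sketch. The encoding is the usual one for this problem (one bit per candidate component), but the feature catalogue, costs and penalty weight below are all invented for the example; real formulations also score cohesion, coupling and non-functional requirements.

```python
import random

FEATURES = {"auth", "logging", "billing", "reporting", "search"}
CATALOGUE = [                      # (features covered, integration cost)
    ({"auth", "logging"}, 4), ({"billing"}, 3), ({"billing", "reporting"}, 5),
    ({"search"}, 2), ({"logging", "search"}, 4), ({"auth"}, 2),
    ({"reporting", "search"}, 4),
]

def fitness(selection):
    covered, cost = set(), 0
    for chosen, (feats, c) in zip(selection, CATALOGUE):
        if chosen:
            covered |= feats
            cost += c
    # Heavy penalty for every uncovered feature, then minimize total cost.
    return -cost - 100 * len(FEATURES - covered)

def select_components(pop_size=30, generations=100, mutation_rate=0.05):
    n = len(CATALOGUE)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]               # keep the better half
        pop = elite + [
            [bit ^ 1 if random.random() < mutation_rate else bit
             for bit in random.choice(elite)]      # mutated copy of an elite
            for _ in range(pop_size - len(elite))
        ]
    return max(pop, key=fitness)
```

The penalty term turns the hard coverage constraint into a soft one, which is a standard way to keep infeasible chromosomes in the population while steering the search toward feasible, low-cost selections.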
4.3 Software Testing Activities
Software testing is the process of executing a program with the intention of finding bugs. It consumes major resources, in terms of effort and time, in a software product's lifecycle. Test case and test data generation is the key problem in software testing, and automating it improves efficiency and effectiveness while lowering the high cost of testing. Generating test data using random, symbolic or dynamic approaches is not enough to produce an optimal amount of test data. Other problems, such as the failure to recognize the occurrence of infinite loops and the inefficiency of generating test data for complex programs, make these techniques unsuitable. This is why test data needs to be generated using search-based techniques. In addition, there is a need to generate test cases that concentrate on error-prone areas of code [13], [14], [15], [16].
The application of genetic algorithms to software testing is a new area of research that brings about the cross-fertilization of ideas across two domains. A GA can generate test cases while ensuring that the generated cases are not redundant, maximizing the coverage of the generated test cases. To assess the effectiveness of the test cases and test data, quantification, measurement and accurate modeling are required, which is done using an accurate suite of software test metrics; these metrics measure the number, complexity and quality of the test cases. Abhishek et al. applied an optimization study of test case generation based on genetic algorithms and generated test cases that are far more reliable [17], [18].
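A minimal sketch of search-based test data generation follows. The function under test and its branch predicate are invented for the example, and the branch-distance fitness is the standard device from search-based testing literature rather than the exact fitness used in the papers cited above; real tools instrument the program to measure how close each input comes to taking the target branch.

```python
import random

def function_under_test(x):
    if x * x - 60 * x + 900 == 0:      # target branch: only x == 30 takes it
        return "target reached"
    return "missed"

def branch_distance(x):
    # Distance of the branch predicate from being true (0 means covered).
    return abs(x * x - 60 * x + 900)

def generate_test_input(pop_size=40, generations=200):
    pop = [random.randint(-1000, 1000) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=branch_distance)
        if branch_distance(pop[0]) == 0:
            return pop[0]              # a covering input was found
        parents = pop[: pop_size // 2]
        pop = parents + [
            (random.choice(parents) + random.choice(parents)) // 2   # blend
            + random.randint(-3, 3)                                  # mutation
            for _ in range(pop_size - len(parents))
        ]
    return min(pop, key=branch_distance)
```

Random generation would need on the order of two thousand draws to hit the single covering input in this range; the distance-guided GA homes in on it because every input, covering or not, tells the search how close it came.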
By examining the most critical paths first, one obtains an effective way to approach testing, which in turn helps to refine effort and cost estimation in the testing phase. The experiments conducted so far are based on relatively small examples, and more research needs to be conducted with larger commercial examples. Yang et al. introduce an approach for generating test data for a specific single path based on genetic algorithms. The similarity between the target path and the execution path, with overlapping sub-paths, is taken as the fitness value to evaluate the individuals of a population and drive the GA to search for appropriate solutions. The authors conducted several experiments to examine the effectiveness of the designed fitness function, and evaluated its convergence ability and time consumption. The results show that the function performs better than two other typical fitness functions on the specific paths employed by the authors [19], [20].
Aladeen et al. [14] compared their approach to generating software test data for automatic path coverage using a genetic algorithm with that of Yang [20]. They found GAs useful in reducing the time required for lengthy testing by generating meaningful test cases for path testing. GAs still need to be developed further for structural testing, to reduce execution time by generating more suitable test cases.
Roy et al. propose a technique that uses a GA for automatic test data generation. A GA is a heuristic that mimics the evolution of natural species in searching for the optimal solution to a problem. In the test data generation application, the solution sought by the GA is test data that causes the execution of a given statement, branch, path or definition-use pair in the program under test. The technique was implemented in a tool called TGen, in which parallel processing was used to improve the performance of the search. To experiment with TGen, a random test data generator called Random was also implemented. Both TGen and Random were used to generate test data for statement and branch coverage of six programs [41].
Rajappa et al. proposed a graph-theory-based genetic approach to generating test cases for software testing. In this approach, a directed graph of all the intermediate states of the system for the expected behaviour is created, and the base population of the genetic algorithm is generated from the nodes of the graph. A pair of nodes, referred to as parents, is then selected from the population, and crossover and mutation are performed on them to obtain the optimum nodes. The process continues until all the nodes are covered, and the same process is followed to generate test cases for real-time systems. The technique is most accurate for network testing, or for other system testing where predictive-model-based tests are not optimized to produce the output [15]. Parveen and Tai have demonstrated that genetic algorithm techniques can be applied to find the most critical paths, improving software testing efficiency. Their genetic algorithm also outperforms exhaustive search and local search techniques; by examining the most critical paths first, one obtains a more effective way to approach testing, which in turn helps to refine effort and cost estimation in the testing phase [42]. K Singh used a genetic algorithm to schedule tasks for execution on a multiprocessor system. Genetic algorithms are well suited to multiprocessor scheduling problems: as more resources are made available to the GA, it finds better solutions in a short time, and it performs better than other traditional techniques. The GA thus appears to be the most flexible algorithm for multiprocessor problems, and it is able to adapt automatically to changes in the problem to be solved [24].
4.4 Other Software Metrics (Quality, Reliability and Maintenance)
Garvin describes quality from five different views: the transcendental view, the user view, the manufacturer's view, the product view and the value-based view. Quality must be monitored from the early phases to the final phase: analysis, design, implementation and maintenance. Many quality models have been given; some of the standard ones are McCall's model (1979), the FCMM model and Boehm's model. McCall's model contains 11 attributes, of which two, reliability and maintainability, are described here [33].
M Amoui et al. work on improving software quality, a major concern in the software development process. Despite previous attempts to evolve software for quality improvement, earlier methods are neither scalable nor fully automatable, so the authors approach the software evolution problem by reformulating it as a search problem. They apply software transformations, in the form of GoF design patterns, to UML design models and evaluate the quality of the transformed design according to object-oriented metrics, particularly 'Distance from the Main Sequence'. This search-based formulation of the problem enables a genetic algorithm to optimize the metrics and find the best sequence of transformations. The implementation results show that the genetic algorithm finds the optimal solution efficiently, especially when different genetic operators, adapted to the characteristics of the transformations, are used [34].
D M Thakore et al. address security issues using GAs. Assigning access specifiers is not an easy task, as it decides the overall security of the software; although many metric tools are available to measure security at an early stage, the assignment of access specifiers is based entirely on human judgment and understanding. The objective of the Secure Coupling Measurement Tool (SCMT) is to generate all possible solutions by applying a genetic algorithm. SCMT differs from other security measurement tools in that it filters the input design before applying metrics with the GA. SCMT uses coupling, a feature of OO design, to determine security at the design level. It takes a UML class diagram with basic constraints as input and generates alternative solutions. The tool also provides metrics at the code level to compute security there; the results of the two sets of metrics together give evidence of a secure design [35]. S H Aljahdali uses GAs as a powerful technique to estimate the parameters of well-known reliability models. Software reliability models are useful for estimating the probability of software failure over time. Several models have been proposed to predict software reliability growth (SRGM), but none has proven to perform well across different project characteristics in predicting the number of faults in the software during the development and testing phases. GAs, as powerful machine learning and optimization techniques, can estimate the parameters of well-known reliability growth models. Moreover, machine learning algorithms offer a way to overcome the uncertainties of modeling by combining multiple models, aiming at a more accurate prediction at the expense of increased uncertainty [36]. Baqais et al. [38] used GAs for estimating maintenance effort and cost. Maintenance is an important activity in the software development life cycle, and no software product can do without it. Estimating a software product's maintainability effort and cost is not easy, considering the various factors that influence the measurement. Abdulrahman et al. propose an Evolutionary Neural Network model to predict software maintainability. The model is based on a hybrid intelligent technique in which a neural network is trained for prediction and a genetic algorithm evolves the network topology until an optimal topology is reached. The model was applied to a popular open-source program, Android, with very good results: the correlation between actual and predicted points reaches 0.91 [37].
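In the spirit of the reliability-model parameter estimation described above, a GA can fit the parameters a and b of the Goel-Okumoto mean-value function m(t) = a(1 - e^(-bt)) to cumulative fault counts. This is a sketch under stated assumptions: the weekly fault data below are invented for the example, and the Goel-Okumoto model is used as a representative SRGM rather than the specific model of the cited work.

```python
import math
import random

FAULTS = [(1, 12), (2, 21), (3, 28), (4, 33), (5, 37), (6, 40), (7, 42)]

def sse(a, b):
    # Sum of squared errors between observed and modeled cumulative faults.
    return sum((m - a * (1 - math.exp(-b * t))) ** 2 for t, m in FAULTS)

def estimate(pop_size=40, generations=300):
    pop = [(random.uniform(10, 100), random.uniform(0.01, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: sse(*p))
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            (a1, b1), (a2, b2) = random.sample(parents, 2)
            a = (a1 + a2) / 2 + random.gauss(0, 1.0)    # crossover + mutation
            b = (b1 + b2) / 2 + random.gauss(0, 0.01)
            children.append((max(a, 1.0), max(b, 1e-3)))
        pop = parents + children
    return min(pop, key=lambda p: sse(*p))
```

The fitted a estimates the total number of faults eventually expected, so a - (observed faults so far) gives the residual fault content, which is the quantity reliability engineers actually want from such a fit.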
IV. CONCLUSION AND RESEARCH DIRECTIONS
In this paper, we have shown how GAs have been used to tackle many software engineering problems. GAs have been applied in various phases of software development, from the requirements and analysis phase through software testing, and have also been used to develop new metrics. This will certainly help mature the software engineering discipline. There is an urgent need to develop GA-based tools that become part of software engineering practice and help automate the software development process to an optimal level, so the GA community should come forward to aid the software engineering discipline and let the full potential of GAs be utilized in solving the problems faced by software professionals. Similar studies must be carried out with large data sets to improve test case generation techniques. GAs remain an emerging field within software engineering.
REFERENCES
[1]. H. Seyed, M Hossein, and S Jalili, "SCI-GA: Software Component Identification using Genetic
Algorithm”, Journal of Object Technology, 2013, pp. 1-3.
[2]. K Vijayalakshmi, N Ramaraj, and R Amuthakkannan, "Improvement of component selection process
using genetic algorithm for component-based software development", International Journal of
Information Systems and Change Management, 2008, pp. 63-80.
[3]. Y Singh, P K Bhatia, A Kaur, and O Sangwan, "Application of neural networks in software
engineering: A review", In International Conference on Information Systems, Technology and
Management, Springer Berlin Heidelberg, 2009, pp. 128-137.
[4]. M Harman, S A Mansouri, and Y Zhang, "Search based software engineering: A comprehensive
analysis and review of trends techniques and applications", Department of Computer Science, King’s
College London, Tech. Rep. TR-09-03, 2009.
[5]. M R Girgis, "Automatic Test Data Generation for Data Flow Testing Using a Genetic Algorithm", J.
UCS 11, 2005, PP.898-915.
[6]. G M Morris, D S Goodsell, R S. Halliday, Ruth Huey, William E Hart, R K Belew, and A J Olson,
"Automated docking using a Lamarckian genetic algorithm and an empirical binding free energy
function", Journal of computational chemistry 1998, pp.1639-1662.
[7]. H Mühlenbein, and D Schlierkamp-Voosen, "Predictive models for the breeder genetic algorithm
I. Continuous parameter optimization", Evolutionary Computation 1993, pp. 25-49.
[8]. J F Tang, L F Mu, C K Kwong, and X G. Luo, "An optimization model for software component
selection under multiple applications development", European Journal of Operational Research 212,
2011, PP. 301-311.
[9]. J Pande, C J Garcia, and D Pan, "Optimal component selection for component based software
development using pliability metric", ACM SIGSOFT Software Engineering Notes 38, no. 1, 2013,
pp. 1-6.
[10]. A Dixit, and P. C. Saxena, "Software component retrieval using genetic algorithms", In Computer and
Automation Engineering, 2009. ICCAE'09. International Conference on, IEEE, 2009, pp. 151-155.
[11]. S Parnami, K S Sharma, and S V Chande., "A survey on generation of test cases and test data using
artificial intelligence techniques", International Journal of Advances in Computer Networks and its
Security 2, no. 1, 2012, pp. 16-18.
[12]. A S Ghiduk, and M R Girgis, "Using genetic algorithms and dominance concepts for generating
reduced test data", Informatica 34, no. 3 2010.
[13]. D C Koboldt, K M Steinberg, D E Larson, R K. Wilson, and E R. Mardis, "The next-generation
sequencing revolution and its impact on genomics", Cell 155, no. 1, 2013, pp.27-38.
[14]. S M Mohi-Aldeen, R Mohamad, and S Deris, "Automatic Test Case Generation for Structural Testing
Using Negative Selection Algorithm", 2009.
[15]. V Rajappa, A Biradar, and S Panda, "Efficient software test case generation using genetic algorithm
based graph theory", In 2008 First International Conference on Emerging Trends in Engineering and
Technology, IEEE, 2008, pp. 298-303.
[16]. Y Fuqing, M Hong, and L Keqin, "Software Reuse and Software Component Technology [J]", Acta
Electronica Sinica 2, 1999.
[17]. S Sabharwal, R Sibal, and C Sharma, "Prioritization of test case scenarios derived from activity
diagram using genetic algorithm", In Computer and Communication Technology (ICCCT), 2010
International Conference on, IEEE,2010, pp. 481-485.
[18]. R P Pargas, M J Harrold, and R R Peck, "Test-data generation using genetic algorithms", Software
Testing Verification and Reliability 9, no. 4 1999, pp.263-282.
[19]. C Sharma, S Sabharwal, and R Sibal, "A survey on software testing techniques using genetic
algorithm", 2014.
[20]. A Kaur, and S Goyal, "A genetic algorithm for regression test case prioritization using code
coverage", International journal on computer science and engineering 3, no. 5, 2011, pp. 1839-1847.
[21]. R Krishnamoorthi, and S A S A Mary, "Regression test suite prioritization using genetic
algorithms", International Journal of Hybrid Information Technology 2, no. 3, 2009, pp.35-52.
[22]. A Bertolino, "Software testing research: Achievements, challenges, dreams", In 2007 Future of
Software Engineering, IEEE Computer Society, 2007, pp.85-103.
[23]. P McMinn, "Search-based software test data generation: A survey," Software Testing Verification and
Reliability 14, no. 2, 2004, pp.105-156.
[24]. K Singh, "Effective Software Testing using Genetic Algorithms", Journal of Global Research in
Computer Science 2, no. 4, 2011.
[25]. N Haghpanah, S Moaven, J Habibi, M Kargar, and S H Yeganeh, "Approximation algorithms for
software component selection problem", In 14th Asia-Pacific Software Engineering Conference
(APSEC'07), IEEE, 2007, pp. 159-166.
[26]. S Bhatia, A Bawa, and V K Attri, "A Review on Genetic algorithm to deal with Optimization of
Parameters of Constructive Cost Model", International Journal of Advanced Research in Computer and
Communication Engineering 4, no. 4, 2015.
[27]. B K Singh and A. K. Misra, "Software effort estimation by genetic algorithm tuned parameters of
modified constructive cost model for nasa software projects", International Journal of Computer
Applications 59, no. 9, 2012.
[28]. A Dhiman and C Diwaker ,"Optimization of COCOMO II effort estimation using genetic algorithm",
American International Journal of Research in Science, Technology, Engineering & Mathematics 3, no.
2, 2013.
[29]. I Maleki, A Ghaffari and M Masdari, "A new approach for software cost estimation with hybrid genetic
algorithm and ant colony optimization", International Journal of Innovation and Applied Studies 5, no.
1, 2014.
[30]. R Vankudoth, P Shireesha and T. Rajani, “A Model of System Software Components Using Genetic
Algorithm and Techniques”, International Journal of Advanced Research in Computer Science and
Software Engineering, 2016, pp. 301-306.
[31]. A Martens, H Koziolek, S Becker, and R Reussner, "Automatically improve software architecture
models for performance, reliability and cost using evolutionary algorithms", In Proceedings of the first
joint WOSP/SIPEW international conference on Performance engineering, ACM, 2010, pp.105- 116.
[32]. J. A McCall, P. K, Richards and G. F. Wallers, “Factors in software quality “, Griffiths Air Force Base,
N. Y: Rome Air Development Center Air Force Systems Command, 1977.
[33]. M Amoui, S Mirarab, S Ansari, and C Lucas, "A genetic algorithm approach to design evolution using
design pattern transformation", International Journal of Information Technology and Intelligent
Computing 1, no. 2, 2006, pp. 235-244.
[34]. D M Thakore and T Kamble, "Use of Genetic Algorithm in Quality Measurement", International
Journal of Computer Applications 60, no. 8, 2012, pp. 24-28.
[35]. S H Aljahdali, and M E El-Telbany, "Genetic algorithms for optimizing ensemble of models in
software reliability prediction", International Journal on Artificial Intelligence and Machine Learning
(AIML) ICGST 8, no. 1, 2008, pp. 5-13.
[38]. A A B Baqais, M Alshayeb, and Z A Baig, "Hybrid intelligent model for software maintenance
prediction", Proceedings of the World Congress on Engineering UK 2013.
[37]. M Jyoti, and L Bhambhu, "Modified Genetic Algorithm for Efficient Regression Test Cases",
International Journal of Advanced Research in Computer and Communication Engineering, 2015, pp.
206-209.
[38]. R Malhotra and D Tiwari, "Development of a framework for test case prioritization using genetic
algorithm", ACM SIGSOFT Software Engineering Notes 38, no. 3, 2013,pp.1-6.
[39]. S Yoo and M Harman, "Regression testing minimization, selection and prioritization: a
survey", Software Testing, Verification and Reliability 22, no. 2, 2012, pp. 67-120.
[40]. R P Pargas, M J Harrold and R R Peck, "Test-data generation using genetic algorithms", Software
Testing Verification and Reliability 9, no. 4, 1999, pp. 263-282.
[41]. P R Srivastava, and T Kim, "Application of genetic algorithm in software testing", International
Journal of software Engineering and its Applications 3, no. 4, 2009, pp. 87-96.