COMPARATIVE STUDY OF SOFTWARE ESTIMATION TECHNIQUES (ijseajournal)
Many information technology firms, among other organizations, have been working on how to estimate resources such as funds during software development. Software development life cycles involve many activities and skills needed to avoid risk, so the best-suited software estimation technique should be employed. Therefore, this research conducted a comparative study that considers the accuracy, usage, and suitability of existing methods, intended to serve project managers and project consultants throughout the software project development process. Both algorithmic and non-algorithmic techniques, such as linear regression, are examined: model, composite, and regression techniques underlie COCOMO, COCOMO II, SLIM, and linear multiple regression respectively, while non-algorithmic methods apply expertise-based and linear-based rules. However, these techniques need further refinement to reduce the errors experienced during software development. The paper therefore proposes a model for software estimation that can help information technology firms, researchers, and other organizations that rely on information technology in budgeting and decision-making.
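The COCOMO family named above estimates effort from size with a power law. A minimal sketch of basic COCOMO follows, using the standard published coefficients; COCOMO II adds scale factors and cost drivers that are not shown here:

```python
# Basic COCOMO sketch: effort E = a * KLOC^b, schedule T = c * E^d.
# Coefficients are the standard published basic-COCOMO values.
MODES = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def cocomo_basic(kloc, mode="organic"):
    """Return (effort in person-months, schedule in months)."""
    a, b, c, d = MODES[mode]
    effort = a * kloc ** b        # person-months
    schedule = c * effort ** d    # calendar months
    return effort, schedule

effort, months = cocomo_basic(32, "organic")
```

For a 32 KLOC organic project this yields roughly 91 person-months over about 14 months, illustrating why mode selection and sizing accuracy dominate the estimate.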
EARLY STAGE SOFTWARE DEVELOPMENT EFFORT ESTIMATIONS – MAMDANI FIS VS NEURAL N... (cscpconf)
Accurately estimating software size, cost, effort, and schedule is probably the biggest challenge facing software developers today. It has major implications for the management of software development, because both overestimates and underestimates directly damage software companies. Many models have been proposed over the years by various researchers for carrying out effort estimation, and several studies of early stage estimation underline its importance. New paradigms offer alternatives for estimating software development effort, in particular Computational Intelligence (CI), which exploits mechanisms of human interaction and processes domain knowledge with the intention of building intelligent systems (IS). Among IS techniques, Artificial Neural Networks and Fuzzy Logic are the two most popular soft computing approaches to software development effort estimation. In this paper, neural network models and a Mamdani FIS model are used to predict early stage effort on a student dataset. The Mamdani FIS was found to predict early stage effort more accurately than the neural network models.
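A Mamdani FIS of the kind compared above fuzzifies the input, fires rules, clips the consequents, and defuzzifies by centroid. The toy sketch below illustrates that pipeline; the rule base, membership functions, and universes are invented for illustration and do not reproduce the paper's model:

```python
# Toy Mamdani fuzzy inference sketch mapping size (KLOC) to effort
# (person-months). Rules and membership functions are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani_effort(kloc):
    # Rule strengths (each rule has a single antecedent here):
    w_small = tri(kloc, 0, 10, 30)   # IF size is small THEN effort is low
    w_large = tri(kloc, 20, 40, 60)  # IF size is large THEN effort is high
    # Aggregate clipped consequents with max over a discretized output
    # universe, then defuzzify by centroid.
    num = den = 0.0
    for i in range(241):             # effort 0..120 PM, step 0.5
        e = i * 0.5
        mu = max(min(w_small, tri(e, 0, 20, 50)),
                 min(w_large, tri(e, 40, 80, 120)))
        num += mu * e
        den += mu
    return num / den if den else 0.0
```

With these invented shapes, a 10 KLOC input lands near the centroid of the "low" triangle (about 23 PM) and a 40 KLOC input near 80 PM, so the output grows sensibly with size.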
Using Data Mining to Identify COSMIC Function Point Measurement Competence (IJECEIAES)
COSMIC Function Point (CFP) measurement errors lead to budget, schedule, and quality problems in software projects. It is therefore important to identify and plan requirements engineers' CFP training needs quickly and correctly. The purpose of this paper is to identify software requirements engineers' CFP measurement competence development needs using machine learning algorithms and requirements artifacts created by the engineers. The artifacts were provided by a large telco service and technology company ecosystem. First, a feature set was extracted from the requirements model at hand. To prepare the data for educational data mining, requirements and CFP audit documents were converted into a CFP data set based on the designed feature set. This data set was used to train and test machine learning models under two different experiment settings designed to reach statistically significant results. Ten different machine learning algorithms were used, and their performance was compared with a baseline and with each other to find the best-performing models on this data set. In conclusion, the REPTree, OneR, and Support Vector Machine (SVM) with Sequential Minimal Optimization (SMO) algorithms achieved the top performance in forecasting requirements engineers' CFP training needs.
Iceemas 119 - State of the art of metrics of aspect oriented programming (Mazen Ghareb)
The document summarizes Mazen Ismaeel Ghareb's conference presentation on aspect-oriented programming (AOP) metrics. It discusses AOP and various AOP metrics proposed in the literature. The methodology analyzes six common AOP metrics drawn from 23 papers. Results show that most AOP metrics improve aspects of software development over object-oriented programming, though the effects depend on the implementation and the problem. The conclusion is that the chosen metrics generally demonstrate that AOP techniques can enhance modularity, coupling, cohesion, and other qualities.
The document describes a proposed web application for automating project management tasks at an engineering institute. The application would allow students to form groups, get project approvals, submit work, and receive feedback and evaluations. It consists of two modules - one for online project work and another to evaluate student and project progress. The goal is to streamline project activities and provide a centralized platform for communication between students and guides.
Application of Genetic Algorithm in Software Engineering: A Review (IRJESJOURNAL)
Abstract. Software engineering is a comparatively new and constantly changing field. The challenge of meeting strict project schedules while delivering high-quality software requires that the field be automated to a large extent and that human intervention be minimized to an optimum level. To achieve this goal, researchers have explored the potential of machine learning approaches, which are adaptable and have the ability to learn. In this paper, we look at how genetic algorithms (GA) can be used to build tools for software development and maintenance tasks.
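For readers unfamiliar with the GA loop the review surveys (selection, crossover, mutation), here is a minimal sketch evolving a bit string toward all ones; the fitness function is a toy stand-in (think: maximizing branch coverage of a generated test suite), not any specific tool from the paper:

```python
# Minimal genetic algorithm sketch: tournament selection, one-point
# crossover, bit-flip mutation, maximizing the number of 1-bits.
import random

def evolve(n_bits=20, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)               # toy objective: count of ones
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                              # tournament of size 3
            return max(rng.sample(pop, 3), key=fitness)
        nxt = []
        while len(nxt) < pop_size:
            a, b = pick(), pick()
            cut = rng.randrange(1, n_bits)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # occasional bit-flip mutation
                i = rng.randrange(n_bits)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

Swapping the fitness function for a real objective (test coverage, schedule cost, module assignment quality) is what specializes this generic loop to the SE tasks the review discusses.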
The article proposes a new model for optimizing software effort and cost estimation based on code reusability. The model compares new projects to previously completed, similar projects stored in a code repository. By searching for and retrieving reusable code, functions, and methods from old projects, the model aims to reduce effort and cost estimates for new software development. The model is described as being based on the concept of estimation by analogy and using innovative search and retrieval techniques to achieve code reuse and thus decreased cost and effort estimates.
Test case prioritization using firefly algorithm for software testing (Journal Papers)
Firefly Algorithm is applied to optimize the ordering of test cases for software testing. Test cases are represented as fireflies, with their similarity distance calculated using string metrics determining the firefly brightness. The Firefly Algorithm prioritizes test cases by moving brighter fireflies, representing more dissimilar test cases, to the front of the test sequence. Experiments on benchmark programs show the Firefly Algorithm approach achieves better or equal average percentage of faults detected and time performance compared to existing works.
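As a rough illustration of the "dissimilar tests first" idea, the sketch below greedily orders test cases by Jaccard distance on their tokens. Note this greedy pass is a simplified stand-in for the search, not the Firefly metaheuristic the paper actually uses, and the string metric is an assumption:

```python
# Greedy dissimilarity-based test-case prioritization sketch.
# Schedules the test most dissimilar from the already-scheduled prefix
# first, using Jaccard distance over whitespace tokens.

def jaccard_distance(a, b):
    a, b = set(a.split()), set(b.split())
    union = a | b
    return 1.0 - len(a & b) / len(union) if union else 0.0

def prioritize(tests):
    remaining = list(tests)
    # Seed with the longest test case, then repeatedly add the test whose
    # minimum distance to everything already scheduled is largest.
    order = [remaining.pop(max(range(len(remaining)),
                               key=lambda i: len(remaining[i])))]
    while remaining:
        i = max(range(len(remaining)),
                key=lambda i: min(jaccard_distance(remaining[i], s)
                                  for s in order))
        order.append(remaining.pop(i))
    return order
```

On a suite like `["login user pass", "login user fail", "export csv report"]` the export test jumps ahead of the two near-duplicate login tests, which is the behavior the firefly brightness ordering is after.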
A Novel Optimization towards Higher Reliability in Predictive Modelling towar... (IJECEIAES)
Although the area of software engineering has made remarkable progress in the last decade, less attention has been paid to code reusability. Code reusability is a subset of software reusability, which is one of the signature topics in software engineering. We review existing systems and find that no standard research approach toward code reusability has been introduced in the last decade. Hence, this paper introduces a predictive framework for optimizing the performance of code reusability. For this purpose, we introduce a case study of a near real-time challenge and involve it in our modelling. We apply a neural network and the damped least-squares algorithm to perform optimization, with the sole target of computing and ensuring the highest possible reliability. The outcome of our model exhibits higher reliability and better computational response time.
The Impact of Software Complexity on Cost and Quality - A Comparative Analysi... (ijseajournal)
Early prediction of software quality is important for better software planning and control. In early development phases, design complexity metrics are considered useful indicators of software testing effort and of some quality attributes. Although many studies investigate the relationship between design complexity and cost and quality, it is unclear what we have learned beyond the scope of individual studies. This paper presents a systematic review of the influence of software complexity metrics on quality attributes. We aggregated Spearman correlation coefficients from 59 different data sets drawn from 57 primary studies using a tailored meta-analysis approach. We found that fault proneness and maintainability are the most frequently investigated attributes. The Chidamber & Kemerer metric suite is the most frequently used, but not all of its metrics are good quality-attribute indicators. Moreover, the impact of these metrics does not differ between proprietary and open source projects. The results provide some implications for building quality models across project types.
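The paper's meta-analysis is described as "tailored"; as background, the textbook way to pool correlation coefficients across studies is sample-size-weighted averaging in Fisher z-space, sketched below. This is illustrative of the general approach only, not the authors' exact procedure:

```python
# Fisher-z pooling of correlation coefficients across studies.
# Each study contributes (r, n); weights are n - 3, the inverse
# variance of the z-transformed correlation.
import math

def pool_correlations(studies):
    """studies: list of (r, n) pairs. Returns the pooled r."""
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)        # Fisher z-transform
        w = n - 3                # inverse-variance weight
        num += w * z
        den += w
    return math.tanh(num / den)  # back-transform to the r scale
```

Pooling, say, r = 0.5 from 103 subjects with r = 0.3 from 53 subjects gives roughly r = 0.44, pulled toward the larger study.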
International Journal of Engineering and Science Invention (IJESI) (inventionjournals)
This document discusses adopting aspect-oriented programming (AOP) in enterprise-wide computing. It provides a brief history of AOP, from its inception at Xerox PARC in the 1990s to the development of AspectJ in the late 1990s. It then reviews related work studying the benefits and challenges of using AOP, such as improved modularity and separation of concerns but also increased complexity. Many studies found quantitative benefits to maintenance from AOP but challenges in adoption. The document concludes by discussing uses of AOP in enterprises, noting both benefits like modularizing cross-cutting concerns, but also challenges such as difficulties aspectizing concurrency and failures.
Model-Based Performance Prediction in Software Development: A Survey (Mr. Chanuwan)
This document provides a survey of model-based approaches for predicting software performance early in the development lifecycle. It reviews approaches that use queueing networks, stochastic Petri nets, and other models. The approaches are evaluated based on how integrated the software and performance models are, how early performance analysis can be done in the lifecycle, and the level of automation support. The survey finds that while progress has been made, fully integrated solutions spanning the entire lifecycle are still needed. Promising future work includes approaches with more semantic integration of models and higher degrees of automation.
Automated exam question set generator using utility based agent and learning ... (Journal Papers)
This document proposes an Automated Exam Question Set Generator (AEQSG) that uses two intelligent agents - a Utility Based Agent (UBA) and a Learning Agent (LA). The UBA chooses exam questions based on user preferences or utilities, while the LA learns from past exam results to improve future question set generation. The AEQSG also applies Bloom's Taxonomy and Genetic Algorithms to generate question sets that meet guidelines while distributing questions by difficulty level. This approach aims to reduce educators' time spent creating exam question sets and improve their quality.
Software metrics: successes, failures and new directions (Andrws Vieira)
This document summarizes the history and status of software metrics in both academia and industry. It discusses that while academic research on software metrics has grown exponentially, industrial use of metrics has remained focused on simple counts like lines of code and defects. The document argues that traditional regression models used to relate metrics to quality are inadequate, and that capturing uncertainty and combining evidence is needed. It introduces Bayesian belief networks as an approach to building management tools using simple metrics while handling these issues.
Eric Nyberg's Presentation "From Jeopardy! To Cognitive Agents: Effective Learning in the Wild" on Cognitive Systems Institute Group Speaker Series July 9, 2015
The program being evaluated is officially named "Computer and Software Engineering" and offers both computer engineering and software engineering options. While the department head suggested using the electrical/computer/communications criteria since it was used previously, the correct approach is to use both the electrical/computer criteria and the software engineering criteria, as the program contains elements of both and the PEV must be qualified to evaluate both types of programs.
International Journal of Computational Engineering Research (IJCER) (ijceronline)
In the present paper, the applicability and capability of AI techniques for effort estimation prediction are investigated. Neuro-fuzzy models prove very robust, characterized by fast computation and the ability to handle distorted data. Given the non-linearity of the data, the approach is an efficient quantitative tool for predicting effort. A one-hidden-layer network, named OHLANFIS, was developed in the MATLAB simulation environment. The initial parameters of the OHLANFIS are identified using the subtractive clustering method, and the parameters of the Gaussian membership functions are determined optimally using a hybrid learning algorithm. The analysis shows that the effort estimation prediction model developed with the OHLANFIS technique performs better than the normal ANFIS model.
Function Point Software Cost Estimates using Neuro-Fuzzy technique (ijceronline)
Software estimation accuracy is among the greatest challenges for software developers. Because a neuro-fuzzy system can approximate non-linear functions with high precision, it is used here as a soft computing approach that generates a model by formulating relationships through training. The approach presented in this paper is independent of the nature and type of estimation. Function points serve as the algorithmic model, and an attempt is made to validate the soundness of the neuro-fuzzy technique using ISBSG and NASA project data.
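Function point counting, which the paper uses as its algorithmic model, can be sketched with the standard IFPUG complexity weights; the neuro-fuzzy layer itself is not reproduced here:

```python
# Function point counting sketch using the standard IFPUG weights.
WEIGHTS = {  # component: (simple, average, complex) weights
    "EI":  (3, 4, 6),    # external inputs
    "EO":  (4, 5, 7),    # external outputs
    "EQ":  (3, 4, 6),    # external inquiries
    "ILF": (7, 10, 15),  # internal logical files
    "EIF": (5, 7, 10),   # external interface files
}

def function_points(counts, gsc_total):
    """counts: {component: (n_simple, n_average, n_complex)};
    gsc_total: sum of the 14 general system characteristics (0-70)."""
    ufp = sum(n * w                        # unadjusted function points
              for comp, ns in counts.items()
              for n, w in zip(ns, WEIGHTS[comp]))
    vaf = 0.65 + 0.01 * gsc_total          # value adjustment factor
    return ufp * vaf
```

For example, two simple inputs, one average output, and one simple internal file give 18 unadjusted points; a GSC total of 35 leaves the adjustment factor at 1.0.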
An Approach to Improve Efficiencies through DevOps Adoption (IRJET Journal)
This document discusses adopting DevOps practices to improve organizational efficiencies. It begins with an abstract discussing how organizations waste resources and how DevOps aims to address this through lean principles and continuous feedback. It then discusses the history and concepts of DevOps, proposing a DevOps adoption model. It outlines factors that affect IT performance and cultural transformation. The document also describes the research design of a study conducted through interviews with DevOps professionals. It identifies four main challenges to DevOps adoption: lack of awareness, lack of support, implementing technologies, and adapting processes. The analysis focuses on the lack of awareness challenge, noting confusion around DevOps definitions and resistance to "buzzwords".
Reduced Software Complexity for E-Government Applications with ZEF Framework (TELKOMNIKA JOURNAL)
Dynamic change is unpredictable, grows continually, and can happen anytime and anywhere. One thing that is always changing is government policy, and such change affects the software behind information systems, causing replacement, modification, and enhancement. There is both commonality and variability among software features in the Indonesian Government. To manage this, we present an enhancement of Zuma's E-Government Framework (ZEF) to reduce software complexity. We enhance the ZEF Framework using SPLE and GORE approaches in order to improve on traditional software development, so that complexity can be kept down when changes occur continuously. The measurement of software complexity relates to system functionality and can be described with function points, because function points also capture logical software complexity. Preliminary results show reduced software complexity, in terms of information processing size, technical complexity adjustment factors, and function points, in e-government applications.
Academic Resources Architecture Framework Planning using ERP in Cloud Computing (IRJET Journal)
This document discusses an academic resources architecture framework for planning and using enterprise resource planning (ERP) systems in cloud computing. The framework is designed to meet the needs of schools, colleges, and universities by automating administrative tasks and streamlining processes. The framework includes three main service models - software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). The framework aims to improve transparency, productivity, and control through automation, leading to higher overall efficiency for educational institutions. It also discusses using cloud-based e-learning and learning analytics to improve current e-learning systems that lack appropriate infrastructure and integrated applications.
Exploring the Efficiency of the Program using OOAD Metrics (IRJET Journal)
This document proposes a methodology to analyze the efficiency of object-oriented programs using OOAD (Object Oriented Analysis and Design) metrics. The methodology involves compiling a program successively until it is error-free, recording the error rate at each compilation. These results are then compared to determine how many compilations were needed for the program to be error-free, indicating its efficiency. The methodology is experimentally validated on a sample Java program, with results showing the error rate decreasing with each compilation until the program is error-free after the 8th compilation, demonstrating good efficiency.
The document discusses several topics related to improving software cost estimation including investigating new sizing techniques based on requirements and design phases, analyzing complexity, assessing risk and return on investment, and evaluating existing models like function points. It also notes challenges like lack of standardized processes and unstable technologies. More research cooperation between academia and industry is needed to develop trusted models.
This document discusses elements that contribute to legacy program complexity. It identifies factors such as difficulty understanding old code, high cost of maintenance and replacement, large size, poor design, integration challenges with new technologies, lack of documentation, inflexibility, long processing times, unavailability of original staff, reliability issues, and bugs. The paper explores each of these elements in detail and argues that legacy programs are complex due to a combination of these interrelated factors such as large size, complex designs with many interconnected parts, and difficulty integrating old code and platforms with new technologies.
This document discusses different types of software metrics including process, product, and project metrics. It defines metrics as quantitative measures of attributes and discusses how they can be used as indicators to improve processes and projects. Process metrics measure attributes of the development process over long periods of time. Product metrics measure attributes of the software at different stages. Project metrics are used to monitor and control projects. The document also discusses size-oriented and function-oriented metrics for normalization and comparison purposes. It provides examples of calculating function points and deriving metrics like errors per function point.
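The size-oriented normalization described above amounts to dividing raw project counts by program size so that projects of different sizes become comparable. A tiny sketch, with invented figures:

```python
# Size-oriented metric normalization sketch (figures are invented).
def size_metrics(loc, errors, defects, cost_dollars):
    """Normalize raw project counts by size for cross-project comparison."""
    kloc = loc / 1000
    return {
        "errors_per_kloc":  errors / kloc,
        "defects_per_kloc": defects / kloc,
        "cost_per_loc":     cost_dollars / loc,
    }

m = size_metrics(loc=12_000, errors=134, defects=29, cost_dollars=168_000)
```

A 12 KLOC project with 134 recorded errors thus shows about 11.2 errors per KLOC, a figure that can be compared directly against a much larger project's rate.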
A Model To Compare The Degree Of Refactoring Opportunities Of Three Projects ... (acijjournal)
Refactoring is applied to software artifacts to improve their internal structure while preserving their external behavior. Refactoring is an uncertain process, and it is difficult to assign units of measurement to it; the amount of refactoring that can be applied to source code depends on the skills of the developer. In this research, we treat refactoring as a quantity on an ordinal scale of measurement and propose a model for determining the degree of refactoring opportunity in given source code. The model is applied to three projects collected from a company. UML diagrams are drawn for each project, and the source-code metrics useful for judging code quality are calculated for each UML diagram. Based on the nominal values of the metrics, each relevant UML diagram is placed on an ordinal scale. The machine learning tool Weka is used to analyze the data set, imported as an ARFF file, produced from the three projects.
Eric Nyberg's Presentation "From Jeopardy! To Cognitive Agents: Effective Learning in the Wild" on Cognitive Systems Institute Group Speaker Series July 9, 2015
The program being evaluated is officially named "Computer and Software Engineering" and offers both computer engineering and software engineering options. While the department head suggested using the electrical/computer/communications criteria since it was used previously, the correct approach is to use both the electrical/computer criteria and the software engineering criteria, as the program contains elements of both and the PEV must be qualified to evaluate both types of programs.
International Journal of Computational Engineering Research(IJCER)ijceronline
International Journal of Computational Engineering Research (IJCER) is dedicated to protecting personal information and will make every reasonable effort to handle collected information appropriately. All information collected, as well as related requests, will be handled as carefully and efficiently as possible in accordance with IJCER standards for integrity and objectivity.
In the present paper, the applicability and capability of AI techniques for effort estimation prediction have been investigated. Neuro-fuzzy models are found to be very robust, characterized by fast computation and capable of handling distorted data; given the non-linearity of the data, they are an efficient quantitative tool for predicting effort. A one-hidden-layer network, named OHLANFIS, has been developed in the MATLAB simulation environment.
The initial parameters of the OHLANFIS are identified using the subtractive clustering method, and the parameters of the Gaussian membership functions are optimally determined using the hybrid learning algorithm. The analysis shows that the effort estimation prediction model developed with the OHLANFIS technique performs better than the normal ANFIS model.
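As a rough illustration of the Gaussian membership functions such neuro-fuzzy models tune, the sketch below evaluates two hypothetical fuzzy sets for project size and combines two rule outputs Sugeno-style; all centers, spreads, and effort values are invented and are not taken from the OHLANFIS model:

```python
import math

def gaussian_mf(x, center, sigma):
    """Gaussian membership function: degree to which x belongs
    to a fuzzy set centered at `center` with spread `sigma`."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

# Hypothetical fuzzy sets for project size (KLOC): "small" and "large"
small = gaussian_mf(12.0, center=10.0, sigma=5.0)
large = gaussian_mf(12.0, center=50.0, sigma=20.0)

# Weighted (Sugeno-style) combination of two hypothetical rule outputs:
# if small -> effort 30 person-months; if large -> effort 200 person-months
effort = (small * 30 + large * 200) / (small + large)
print(round(effort, 1))
```

In a real ANFIS, the hybrid learning algorithm would adjust each `center` and `sigma` from training data rather than fixing them by hand.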
Function Point Software Cost Estimates using Neuro-Fuzzy techniqueijceronline
Software estimation accuracy is among the greatest challenges for software developers. Because a neuro-fuzzy system can approximate non-linear functions with greater precision, it is used here as a soft-computing approach to generate a model by formulating relationships based on its training. The approach presented in this paper is independent of the nature and type of estimation. Function points are used as the algorithmic model, and an attempt is made to validate the soundness of the neuro-fuzzy technique using ISBSG and NASA project data.
An Approach of Improve Efficiencies through DevOps AdoptionIRJET Journal
This document discusses adopting DevOps practices to improve organizational efficiencies. It begins with an abstract discussing how organizations waste resources and how DevOps aims to address this through lean principles and continuous feedback. It then discusses the history and concepts of DevOps, proposing a DevOps adoption model. It outlines factors that affect IT performance and cultural transformation. The document also describes the research design of a study conducted through interviews with DevOps professionals. It identifies four main challenges to DevOps adoption: lack of awareness, lack of support, implementing technologies, and adapting processes. The analysis focuses on the lack of awareness challenge, noting confusion around DevOps definitions and resistance to "buzzwords".
Reduced Software Complexity for E-Government Applications with ZEF FrameworkTELKOMNIKA JOURNAL
Situations of dynamic change are unpredictable and occur with increasing frequency; they can happen anytime and anywhere. One thing that is always changing is government policy. This condition impacts software for information systems, causing replacement, modification, and enhancement. There is some commonality and variability among software features in the Indonesian Government. Hence, to manage it, we present an enhancement of Zuma's E-Government Framework (ZEF) to reduce software complexity. We enhance the ZEF Framework using the SPLE and GORE approaches in order to improve traditional software development, so that complexity can be reduced when change happens continuously. The measurement of software complexity relates to the functionality of the system and can be described with function points, because function points also capture logical software complexity. The preliminary results of this study reduce software complexity measures such as information processing size, technical complexity adjustment factors, and function points in e-government applications.
Academic Resources Architecture Framework Planning using ERP in Cloud ComputingIRJET Journal
This document discusses an academic resources architecture framework for planning and using enterprise resource planning (ERP) systems in cloud computing. The framework is designed to meet the needs of schools, colleges, and universities by automating administrative tasks and streamlining processes. The framework includes three main service models - software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). The framework aims to improve transparency, productivity, and control through automation, leading to higher overall efficiency for educational institutions. It also discusses using cloud-based e-learning and learning analytics to improve current e-learning systems that lack appropriate infrastructure and integrated applications.
Exploring the Efficiency of the Program using OOAD MetricsIRJET Journal
This document proposes a methodology to analyze the efficiency of object-oriented programs using OOAD (Object Oriented Analysis and Design) metrics. The methodology involves compiling a program successively until it is error-free, recording the error rate at each compilation. These results are then compared to determine how many compilations were needed for the program to be error-free, indicating its efficiency. The methodology is experimentally validated on a sample Java program, with results showing the error rate decreasing with each compilation until the program is error-free after the 8th compilation, demonstrating good efficiency.
The document discusses several topics related to improving software cost estimation including investigating new sizing techniques based on requirements and design phases, analyzing complexity, assessing risk and return on investment, and evaluating existing models like function points. It also notes challenges like lack of standardized processes and unstable technologies. More research cooperation between academia and industry is needed to develop trusted models.
This document discusses elements that contribute to legacy program complexity. It identifies factors such as difficulty understanding old code, high cost of maintenance and replacement, large size, poor design, integration challenges with new technologies, lack of documentation, inflexibility, long processing times, unavailability of original staff, reliability issues, and bugs. The paper explores each of these elements in detail and argues that legacy programs are complex due to a combination of these interrelated factors such as large size, complex designs with many interconnected parts, and difficulty integrating old code and platforms with new technologies.
This document discusses different types of software metrics including process, product, and project metrics. It defines metrics as quantitative measures of attributes and discusses how they can be used as indicators to improve processes and projects. Process metrics measure attributes of the development process over long periods of time. Product metrics measure attributes of the software at different stages. Project metrics are used to monitor and control projects. The document also discusses size-oriented and function-oriented metrics for normalization and comparison purposes. It provides examples of calculating function points and deriving metrics like errors per function point.
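The function-point calculation mentioned above can be sketched as follows. The weights are the standard average-complexity function-point weights and the adjustment follows the usual VAF = 0.65 + 0.01 × (total degree of influence); the counts and the defect figure are invented for illustration:

```python
# Unadjusted function points: weighted sum of counted function types.
# Weights are the standard "average" complexity weights; counts are
# invented for illustration.
weights = {"inputs": 4, "outputs": 5, "inquiries": 4,
           "internal_files": 10, "external_files": 7}
counts  = {"inputs": 20, "outputs": 12, "inquiries": 8,
           "internal_files": 4, "external_files": 2}

ufp = sum(weights[k] * counts[k] for k in weights)

# Value adjustment factor from 14 general system characteristics,
# each rated 0..5; here we assume a total degree of influence of 30.
vaf = 0.65 + 0.01 * 30
fp = ufp * vaf

errors_found = 52                       # hypothetical defect count
print(fp, round(errors_found / fp, 3))  # function points, errors per FP
```

Normalizing defect counts by function points, as in the last line, is what allows comparisons between projects of very different sizes.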
A Model To Compare The Degree Of Refactoring Opportunities Of Three Projects ...acijjournal
Refactoring is applied to software artifacts to improve their internal structure while preserving their external behavior. Refactoring is an uncertain process, and it is difficult to assign units of measurement to it; the amount of refactoring that can be applied to source code depends on the skills of the developer. In this research, we perceive refactoring as a quantified object on an ordinal scale of measurement. We propose a model for determining the degree of refactoring opportunities in given source code. The model is applied to three projects collected from a company. UML diagrams are drawn for each project, and the values of source-code metrics that are useful in determining code quality are calculated for each UML diagram. Based on the nominal values of the metrics, each relevant UML diagram is represented on an ordinal scale. A machine learning tool, Weka, is used to analyze the dataset produced by the three projects, imported in the form of an ARFF file.
A MODEL TO COMPARE THE DEGREE OF REFACTORING OPPORTUNITIES OF THREE PROJECTS ...acijjournal
This document presents a model for quantifying and comparing the degree of refactoring opportunities in three software projects. The model involves drawing UML diagrams for the projects, calculating source code metrics for each UML diagram, representing the diagrams on an ordinal scale based on the metrics, and using a machine learning tool (Weka) to analyze the resulting dataset. The tool uses a Naive Bayesian classifier to generate a confusion matrix for each project, allowing evaluation of the model's performance at classifying refactoring opportunities as low, medium, or high. The model is applied to three projects from a company to test its ability to measure and compare refactoring opportunities in code.
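The confusion matrix used to evaluate such a classifier can be illustrated with a small sketch; the low/medium/high labels follow the model described above, while the example classifications are invented:

```python
from collections import Counter

LABELS = ["low", "medium", "high"]

def confusion_matrix(actual, predicted):
    """Rows = actual class, columns = predicted class."""
    pairs = Counter(zip(actual, predicted))
    return [[pairs[(a, p)] for p in LABELS] for a in LABELS]

# Hypothetical refactoring-opportunity classifications per UML diagram
actual    = ["low", "low", "medium", "high", "medium", "high"]
predicted = ["low", "medium", "medium", "high", "medium", "medium"]

matrix = confusion_matrix(actual, predicted)
accuracy = sum(matrix[i][i] for i in range(3)) / len(actual)
print(matrix, accuracy)
```

Diagonal cells count correct classifications; off-diagonal cells show which classes the model confuses, which is exactly what a Weka-generated confusion matrix reports.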
This document summarizes a research paper that studied how to integrate agile software development methods like Extreme Programming (XP) into traditional stage-gate project management models. It discusses how agile methods have evolved for smaller projects but must work within larger product development contexts. The paper presents a case study of two large software projects that used XP within stage-gate management. It finds that integrating XP is possible if the interfaces with the agile subproject and management attitudes towards agility are properly managed.
Remote interpreter API model for supporting computer programming adaptive lea...TELKOMNIKA JOURNAL
This document describes a study that developed a remote interpreter API model to support adaptive learning for computer programming courses. The model allows students to write and run program code directly within a learning management system (LMS). A web API server runs the program code remotely to reduce the computational load on the LMS server. The API returns output and error information to provide feedback on students' code. Testing showed the API had minimal impact on server resources and fast response times when running Python and PHP code simultaneously. The model provides an effective way to assess students' psychomotor programming skills within an LMS in an adaptive learning environment.
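The core idea of such a remote interpreter API, running submitted code in a separate interpreter process and returning its output and errors as feedback, can be sketched as follows. This is a minimal illustration, not the paper's actual API, and it omits the sandboxing and resource limits a real deployment would need:

```python
import subprocess
import sys

def run_student_code(source, timeout=5):
    """Execute submitted Python source in a separate interpreter
    process and return (stdout, stderr) for feedback, as a remote
    interpreter API endpoint might."""
    proc = subprocess.run(
        [sys.executable, "-c", source],
        capture_output=True, text=True, timeout=timeout,
    )
    return proc.stdout, proc.stderr

out, err = run_student_code("print(sum(range(10)))")
print(out.strip(), bool(err))
```

Running the code in a child process keeps a student's infinite loop or crash from affecting the server, and the captured stderr is what the LMS would display as error feedback.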
SBGC provides IEEE software projects for students in various domains including Java, J2ME, J2EE, .NET and MATLAB. It offers two categories of projects - projects with new ideas/papers and selecting from their project list. They ensure projects are implemented satisfactorily and students understand all aspects. SBGC provides latest 2012-2013 projects for various engineering and technology students as well as MBA students. It offers project support including abstracts, reports, presentations and certificates.
An Adjacent Analysis of the Parallel Programming Model Perspective: A SurveyIRJET Journal
This document provides an overview and analysis of parallel programming models. It begins with an abstract discussing the growing demand for parallel computing and challenges with existing parallel programming frameworks. It then reviews several relevant studies on parallel programming models and architectures. The document goes on to describe several key parallel programming models in more detail, including the Parallel Random Access Machine (PRAM) model, Unrestricted Message Passing (UMP) model, and Bulk Synchronous Parallel (BSP) model. It discusses aspects of each model like architecture, communication methods, and associated cost models. The overall goal is to compare benefits and limitations of different parallel programming models.
A SURVEY ON ACCURACY OF REQUIREMENT TRACEABILITY LINKS DURING SOFTWARE DEVELO...ijiert bestjournal
A number of routing protocols have been proposed for data transmission in WSNs. Initially, single-path routing schemes with a number of variations were proposed. Still, single-path routing had drawbacks: it was unable to provide reliability and high throughput, and security was not considered during routing. Recently, to remove these drawbacks, a new technique called multipath routing has been proposed. In this paper we discuss the different multipath routing protocols and their variants. Initially, multipath routing was proposed to guarantee packet delivery to the sink in case of link or node failure; other protocols have since been proposed for reliability, energy saving, security, and high throughput. Some multipath routing protocols also address load balancing and security during packet transmission.
The peer-reviewed International Journal of Engineering Inventions (IJEI) was started with a mission to encourage contribution to research in science and technology and to motivate researchers in challenging areas of science and technology.
Integrated Analysis of Traditional Requirements Engineering Process with Agil...zillesubhan
In the past few years, the agile software development approach has emerged as a most attractive software development approach. (A typical CASE environment consists of a number of CASE tools operating on a common hardware and software platform, and there are a number of different classes of users of a CASE environment; some users, such as software developers and managers, wish to use CASE tools to support them in developing application systems and monitoring the progress of a project.) The agile approach has quickly caught the attention of a large number of software development firms. However, it pays particular attention to the development side of a software project while neglecting critical aspects of the requirements engineering process. In fact, there is no standard requirements engineering process in this approach, and requirements engineering activities vary from situation to situation. As a result, a large number of problems emerge that can lead software development projects to failure. One major drawback of the agile approach is that it is suitable only for small projects with limited team size and hence cannot be adopted for large projects. We claim that this approach can be used for large projects if the traditional requirements engineering approach is combined with the agile manifesto; in fact, this combination can also help resolve a large number of the problems that exist in agile development methodologies. In software development the most important thing is to know the customer's requirements clearly, also through modeling (data modeling, functional modeling, behavior modeling). Using UML we are able to build an efficient system, starting from an abstract model and going into detail with different UML diagrams toward the desired goal. Each UML diagram serves a different goal toward implementing the whole project.
This document describes an online job recruitment system built using PHP. It allows job seekers to register, search for jobs, and manage their profiles. Employers can register, post jobs to the system, and manage job listings. The system has administrative, employer, and job seeker modules. It aims to make the job search and recruitment process easier and more accessible for all users. A feasibility study was conducted and the system was found to be technically, economically, and behaviorally feasible. The system will use PHP for the front end, MySQL for the database, and run on a Windows server environment.
The document discusses software evolution and maintenance. It covers evolutionary software development, the staged model of software lifespan, the phased model of software change, research and teaching approaches, and software maintenance. Key topics include iterative development, concept location, impact analysis, reasoning about evolution, and the end of software evolution. The purpose is to provide an overview of these topics and discuss current research and future directions.
Similar to Upslis faculty tenure lecture presentation
Transform Your Communication with Cloud-Based IVR SolutionsTheSMSPoint
Discover the power of Cloud-Based IVR Solutions to streamline communication processes. Embrace scalability and cost-efficiency while enhancing customer experiences with features like automated call routing and voice recognition. Accessible from anywhere, these solutions integrate seamlessly with existing systems, providing real-time analytics for continuous improvement. Revolutionize your communication strategy today with Cloud-Based IVR Solutions. Learn more at: https://thesmspoint.com/channel/cloud-telephony
What is Master Data Management by PiLog Groupaymanquadri279
PiLog Group's Master Data Record Manager (MDRM) is a sophisticated enterprise solution designed to ensure data accuracy, consistency, and governance across various business functions. MDRM integrates advanced data management technologies to cleanse, classify, and standardize master data, thereby enhancing data quality and operational efficiency.
Do you want Software for your Business? Visit Deuglo
Deuglo has top software developers in India who are experts in software development and help design and create custom software solutions.
Deuglo follows a seven-step method for delivering its services to customers, called the software development life cycle (SDLC) process:
Requirement — collecting the requirements is the first phase of the SDLC process.
Feasibility Study — after completing the requirements phase, they assess whether the project is feasible.
Design — in this phase, they start designing the software.
Coding — when the design is completed, the developers start coding the software.
Testing — once coding is done, the testing team starts testing.
Installation — after testing is complete, the application is deployed to the live server and launched.
Maintenance — after development is complete and customers start using the software, it is maintained.
UI5con 2024 - Keynote: Latest News about UI5 and it’s EcosystemPeter Muessig
Learn about the latest innovations in and around OpenUI5/SAPUI5: UI5 Tooling, UI5 linter, UI5 Web Components, Web Components Integration, UI5 2.x, UI5 GenAI.
Recording:
https://www.youtube.com/live/MSdGLG2zLy8?si=INxBHTqkwHhxV5Ta&t=0
Zoom is a comprehensive platform designed to connect individuals and teams efficiently. With its user-friendly interface and powerful features, Zoom has become a go-to solution for virtual communication and collaboration. It offers a range of tools, including virtual meetings, team chat, VoIP phone systems, online whiteboards, and AI companions, to streamline workflows and enhance productivity.
A Study of Variable-Role-based Feature Enrichment in Neural Models of CodeAftab Hussain
Understanding variable roles in code has been found to be helpful by students
in learning programming -- could variable roles help deep neural models in
performing coding tasks? We do an exploratory study.
- These are slides of the talk given at InteNSE'23: The 1st International Workshop on Interpretability and Robustness in Neural Software Engineering, co-located with the 45th International Conference on Software Engineering, ICSE 2023, Melbourne Australia
Most important New features of Oracle 23c for DBAs and Developers. You can get more idea from my youtube channel video from https://youtu.be/XvL5WtaC20A
SOCRadar's Aviation Industry Q1 Incident Report is out now!
The aviation industry has always been a prime target for cybercriminals due to its critical infrastructure and high stakes. In the first quarter of 2024, the sector faced an alarming surge in cybersecurity threats, revealing its vulnerabilities and the relentless sophistication of cyber attackers.
SOCRadar’s Aviation Industry, Quarterly Incident Report, provides an in-depth analysis of these threats, detected and examined through our extensive monitoring of hacker forums, Telegram channels, and dark web platforms.
Introducing Crescat - Event Management Software for Venues, Festivals and Eve...Crescat
Crescat is industry-trusted event management software, built by event professionals for event professionals. Founded in 2017, we have three key products tailored for the live event industry.
Crescat Event for concert promoters and event agencies. Crescat Venue for music venues, conference centers, wedding venues, concert halls and more. And Crescat Festival for festivals, conferences and complex events.
With a wide range of popular features such as event scheduling, shift management, volunteer and crew coordination, artist booking and much more, Crescat is designed for customisation and ease-of-use.
Over 125,000 events have been planned in Crescat and with hundreds of customers of all shapes and sizes, from boutique event agencies through to international concert promoters, Crescat is rigged for success. What's more, we highly value feedback from our users and we are constantly improving our software with updates, new features and improvements.
If you plan events, run a venue or produce festivals and you're looking for ways to make your life easier, then we have a solution for you. Try our software for free or schedule a no-obligation demo with one of our product specialists today at crescat.io
Odoo ERP software
Odoo ERP software, a leading open-source software for Enterprise Resource Planning (ERP) and business management, has recently launched its latest version, Odoo 17 Community Edition. This update introduces a range of new features and enhancements designed to streamline business operations and support growth.
The Odoo Community serves as a cost-free edition within the Odoo suite of ERP systems. Tailored to accommodate the standard needs of business operations, it provides a robust platform suitable for organisations of different sizes and business sectors. Within the Odoo Community Edition, users can access a variety of essential features and services essential for managing day-to-day tasks efficiently.
This blog presents a detailed overview of the features available within the Odoo 17 Community edition, and the differences between Odoo 17 community and enterprise editions, aiming to equip you with the necessary information to make an informed decision about its suitability for your business.
OpenMetadata Community Meeting - 5th June 2024OpenMetadata
The OpenMetadata Community Meeting was held on June 5th, 2024. In this meeting, we discussed about the data quality capabilities that are integrated with the Incident Manager, providing a complete solution to handle your data observability needs. Watch the end-to-end demo of the data quality features.
* How to run your own data quality framework
* What is the performance impact of running data quality frameworks
* How to run the test cases in your own ETL pipelines
* How the Incident Manager is integrated
* Get notified with alerts when test cases fail
Watch the meeting recording here - https://www.youtube.com/watch?v=UbNOje0kf6E
Graspan: A Big Data System for Big Code AnalysisAftab Hussain
We built a disk-based parallel graph system, Graspan, that uses a novel edge-pair centric computation model to compute dynamic transitive closures on very large program graphs.
We implement context-sensitive pointer/alias and dataflow analyses on Graspan. An evaluation of these analyses on large codebases such as Linux shows that their Graspan implementations scale to millions of lines of code and are much simpler than their original implementations.
These analyses were used to augment the existing checkers; these augmented checkers found 132 new NULL pointer bugs and 1308 unnecessary NULL tests in Linux 4.4.0-rc5, PostgreSQL 8.3.9, and Apache httpd 2.2.18.
- Accepted in ASPLOS ‘17, Xi’an, China.
- Featured in the tutorial, Systemized Program Analyses: A Big Data Perspective on Static Analysis Scalability, ASPLOS ‘17.
- Invited for presentation at SoCal PLS ‘16.
- Invited for poster presentation at PLDI SRC ‘16.
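The dynamic transitive closure at the heart of Graspan can be illustrated with a naive in-memory fixed-point sketch; Graspan's actual edge-pair-centric, disk-based computation is far more sophisticated and scales to graphs that do not fit in memory:

```python
def transitive_closure(edges):
    """Naive fixed-point computation of a graph's transitive closure:
    repeatedly add edge (a, c) whenever edges (a, b) and (b, c) exist,
    until no new edge appears. This is only a toy illustration of the
    idea Graspan implements at scale."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        new = {(a, d) for (a, b) in closure for (c, d) in closure
               if b == c and (a, d) not in closure}
        if new:
            closure |= new
            changed = True
    return closure

edges = {("x", "y"), ("y", "z")}
print(sorted(transitive_closure(edges)))
```

Pointer/alias and dataflow analyses can be phrased as closures like this over program graphs, which is why scaling the closure computation scales the analyses.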
Neo4j - Product Vision and Knowledge Graphs - GraphSummit ParisNeo4j
Dr. Jesús Barrasa, Head of Solutions Architecture for EMEA, Neo4j
Discover the latest innovations from Neo4j, including the latest cloud integrations and product improvements that make Neo4j an essential choice for developers building applications with interconnected data and generative AI.
3. Academic Background
B.S. in Library and Information Science, School of Library and Information Science, University of the Philippines – UP Diliman
4. Academic Background
Master of Science in Computer Science, Department of Computer Science, College of Engineering, University of the Philippines – UP Diliman (June 26, 2016)
5. Academic Background
Certified Librarian – Board for Librarians (November 2007)
One-Year Full-Time Certificate in Information Technology, Specialization: Applications Development, University of the Philippines Information Technology Training Center – UP Diliman
• UPITTC Scholar
6. Professional Experiences in the Academe
• Faculty, University of the Philippines School of Library and Information Studies (UP SLIS), June 2013 – Present
• QMS Lead, eUP Project, March 2016 – December 2016
• Program Developer Associate, eUP Project, June 2015 – July 2017
• Project Manager, UPITDC – CHED Project, January 7, 2013 – March 7, 2013
• Project Manager, UP ITDC – Technical and Professional Services for the Rehabilitation of CHED’s Internet, Website and Email Services, April 10–27, 2012
• Software Analyst, UP ITDC – CATT CSharpMigration_ChartGui Project of Fujitsu Ten Solutions Philippines, UP ITTC, February 2012 – April 2012
• Project Manager, Youth Conference in IT (YCIT) 2010, February 17–18, 2010
7. Professional Experiences in the Academe
• Sr. Applications Development Training Officer, University of the Philippines IT Training Center (UPITTC), February 2010 – May 2013
• Deputy Training Manager, University of the Philippines IT Training Center (UPITTC), February 2009 – May 2013
• Deputy Quality Management Representative, ISO 9001-2008 QMS, University of the Philippines IT Training Center, February 2010 – April 2013
Speaker and Trainer
• Software Quality Assurance
• Systems Thinking
• Project Management
• Embedded Indexing
• Understanding Digital Preservation
• Cloud Computing: Web Technologies
• Careers in IT
8. Professional Experiences in the Corporate Industry
• IT Project Manager, Telus House McKinley West Buildout, Telus International Philippines, January 2016 – Present
9. A Control Structure - Token Based Metric for Software Functional Cohesion, Entropy and Re-engineering
10. A Control Structure - Token Based Metric for Software Functional Cohesion, Entropy and Re-engineering
14. Theoretical Framework
In software engineering, software configuration management (SCM) is the task of tracking and controlling changes in the software.
15. Theoretical Framework
A field under Software Quality Assurance (SQA) which is tasked with measuring and maintaining the quality of the software. (Evoke Technologies)
16. Background of the Study
• When companies use technology such as
software in process control and management,
two of many possible objectives for
continuous growth they have are
(1) efficiency in resource utilization and
productivity, and
(2) an overhead amount of time and related cost
in updating and managing this technology.
17. Background of the Study
Sometimes pursuing the second objective can
run counter to achieving the first.
18. Background of the Study
Proper, proactive enterprise management
ensures that the overhead cost does not
outweigh overall system (enterprise)
efficiency.
19. Background of the Study
Unfortunate as it may be, it is readily
recognizable when a company's software
eventually becomes a hindrance, sometimes
even a cause of paralysis, in achieving
overall enterprise efficiency.
20. Introduction
A work on software engineering by Ivar
Jacobson describes software entropy as
follows:
The second law of thermodynamics, in principle,
states that a closed system's disorder cannot
be reduced; it can only remain unchanged or
increase.
A measure of this disorder is entropy.
21. Introduction
This law also seems plausible for software
systems: as a system is modified, its disorder,
or entropy, always increases.
This is known as software entropy.
22. Topic
This paper proposes a quantification of
software entropy in terms of measuring
control-structure-to-variable cohesion.
23. Introduction
Within software development there are similar theories;
Lehman (1985) suggested a number of laws that aid
in the quantification of software entropy:
• A computer program that is used will be modified.
• When a program is modified, its complexity will
increase, provided that one does not actively work
against this.
The process of code refactoring can result in stepwise
reductions in software entropy.
25. Cohesion
• Cohesion is the measure of the strength of
functional association of elements within a
procedure.
We strive for the highest possible cohesion,
i.e., we make the elements of a procedure as
strongly related to one another as possible.
26. Why High Cohesion
• High cohesion usually means low coupling –
the degree to which each program module
relies on the others
• It ensures that the functional breakdown of
the program reflects the functional
organization of the original problem
• High cohesion has been shown to be a good
predictor of maintainability
32. Topics Covered
• Rationale for the quantification of software entropy
• Related studies and research on the quantification of software entropy
– Chidamber and Kemerer 1991, quantifying method complexity via
Cyclomatic Complexity
– Chidamber and Kemerer 1994, Lack of Cohesion of Methods (LCOM)
– Bieman and Ott 1994, examined functional cohesion of procedures using data
slice abstraction
– Measurements derived from these studies are Yourdon and Constantine's
• Strong Functional Cohesion (SFC)
• Weak Functional Cohesion (WFC)
– Bieman and Kang 1997, introduced Measuring Design-Level Cohesion using
association-based and slice-based approaches
• Code-level Cohesion Measurement
A CONTROL STRUCTURE-TOKEN-BASED METRIC FOR SOFTWARE
FUNCTIONAL COHESION, ENTROPY AND RE-ENGINEERING
(My Paper)
34. Related Works
• As a related study, Bieman and Kang's
Measuring Design-Level Cohesion showed,
through analytical and empirical analysis, that
design-level measurement corresponds closely
with code-level cohesion measurement.
• Accordingly, the Input Output Dependence
Graph (IODG), adapted from Lakhotia's
variable dependence graph, defines the
relationship between the input and output
components of a module that underlies
cohesion measurement.
35. Related Works
• Aside from deriving a calculation for cohesion, Stevens, Myers, and
Constantine introduced module cohesion (SMC Cohesion), a
measurement of cohesiveness represented on an ordinal scale
• In this approach, modules are categorized as
– Coincidental cohesion
– Logical cohesion
– Temporal cohesion
– Procedural cohesion
– Communicational cohesion
– Sequential cohesion
– Functional cohesion
• Coincidental cohesion is regarded as the weakest and
functional cohesion as the strongest
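To make the two ends of that ordinal scale concrete, here is an illustrative sketch (hypothetical functions, not from the paper):

```python
def mean(values):
    # Functional cohesion (strongest): every statement contributes to
    # computing the single output of the procedure.
    total = 0.0
    for v in values:
        total += v
    return total / len(values)

def misc_utilities(values, label):
    # Coincidental cohesion (weakest): the elements perform unrelated
    # tasks and are merely packaged in one procedure.
    count = len(values)
    tag = label.upper()
    return count, tag
```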
37. The Objective
• The contribution of this study, therefore,
focuses on building a metric to enhance the
real-world implementation of sound
engineering theories and practices by
formulating and quantitatively measuring the
concept known as software entropy through
control-structure-to-variable cohesion.
38. The Objective
• This aims to help software designers,
developers, and system managers pinpoint
when their updates to the lines of code
introduce entropy that undermines the
stability and robustness of their current
software.
39. The Objective
• This could also help in rationalizing software
updates, partial or total re-engineering,
and/or migration.
40. The Methodology
• This study will run both the IODG method
and the SMC ordinal cohesion scale, but
– unlike IODG, which accounts for input and
output variables only, and
– unlike the data-slice approach, which
accounts for variables only,
• this research will consider all variables and
control structures in a module when
computing cohesion
41. The Methodology
• The computed cohesion will then be
compared against the SMC ordinal cohesion
scale to determine the COHESION METRIC
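The two steps above can be sketched as follows. This is a hypothetical approximation, not the paper's actual formulas: the regex tokenizer, the cohesion ratio, and the threshold mapping onto the SMC scale are all illustrative assumptions.

```python
import re

SMC_SCALE = ["coincidental", "logical", "temporal", "procedural",
             "communicational", "sequential", "functional"]
KEYWORDS = {"if", "while", "do", "else", "int", "return"}

def cohesion_ratio(source):
    """Fraction of distinct variable tokens that appear in at least one
    control-structure condition of the module (crude approximation)."""
    words = set(re.findall(r"\b[a-z_]\w*\b", source))
    variables = words - KEYWORDS
    guarded = set()
    for cond in re.findall(r"\b(?:if|while)\s*\(([^)]*)\)", source):
        guarded |= set(re.findall(r"\b[a-z_]\w*\b", cond)) & variables
    return len(guarded) / len(variables) if variables else 0.0

def smc_level(ratio):
    # Map the ratio onto the seven ordinal categories (assumed mapping).
    return SMC_SCALE[min(int(ratio * len(SMC_SCALE)), len(SMC_SCALE) - 1)]
```

For example, on the module `"int f(int x) { if (x > 0) { x = x - 1; } return x; }"` this sketch yields a ratio of 0.5, which the assumed mapping places at the "procedural" level.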
45. Definition of Terms
• Token
Let component y be a variable or a constant.
• Control Structure
Let control structure c be either an if, a
while, or a do-while.
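A minimal sketch of these two definitions, using a hypothetical regex tokenizer (a real implementation would parse the language properly):

```python
import re

# Per the definitions above: c ranges over if, while, do-while.
CONTROL_STRUCTURES = {"if", "while", "do"}
# Assumed keyword list so remaining words count as tokens y
# (variables or constants).
OTHER_KEYWORDS = {"int", "return", "else", "for"}

def extract(source):
    words = re.findall(r"\b\w+\b", source)
    controls = [w for w in words if w in CONTROL_STRUCTURES]
    tokens = [w for w in words
              if w not in CONTROL_STRUCTURES | OTHER_KEYWORDS]
    return controls, tokens
```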
57. Data Slice Profile for IF
• Functional cohesion is
measured based on the
procedure's output and on the
components of the procedure
that contribute to that
output.
58. Data Slice Profile for IF
• Using this approach, it is
ensured that maximum
functional cohesiveness is
achieved.
• For procedures with multiple
outputs, the components
contributing to the different
outputs are considered
together with how they are
bound.
59. Results and Findings
• To apply the Data Slice Abstraction, the table
above shows several data tokens contributing
to a data slice, namely the output z,
• where the data tokens are the components or
statements in the tables above.
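The slicing idea above can be sketched as a fixed-point computation. This is a hypothetical simplification, not the paper's procedure: each statement is pre-parsed into a (target, sources) pair, and we collect every token that transitively contributes to the output z.

```python
def data_slice(statements, output="z"):
    """statements: list of (target, [source tokens]) assignment pairs.
    Returns the set of tokens in the data slice for `output`."""
    contributing = {output}
    changed = True
    while changed:
        changed = False
        for target, sources in statements:
            if target in contributing:
                new = set(sources) - contributing
                if new:
                    contributing |= new
                    changed = True
    return contributing
```

For example, with `[("a", ["x"]), ("b", ["y"]), ("z", ["a"])]` the slice for z is `{"z", "a", "x"}`; b and y lie outside it because they never feed the output.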
64. Conclusions
• Clearly, by incorporating these 9 types of relationships
among control structures and tokens, we augment the
information we can derive regarding the cohesiveness
of the elements within a module m.
• However, knowing when and how to refactor code
remains a matter of the designer's and/or
programmer's skill and discernment.
• This paper measures cohesion within a procedure and
compares it to the SMC Cohesion ordinal scale.
• However, the entire code base must undergo repeated
computation to determine the exact SMC Cohesion
level.
65. Future Works
• Automation of the computation of the
functional cohesion
• An Assembly-Level Procedural Cohesion
(Hardware)
66. Acknowledgements
This research work is a result of the
partnership of the University of the
Philippines Information Technology
Development Center (UP ITDC) and
Fujitsu Ten Limited represented by
Fujitsu Ten Solutions Philippines, Inc.