Predicting the time required to build software is a complex task for software engineering managers. Many factors can directly affect the productivity of the development team, and factors related to the complexity of the system to be developed drastically change the time needed to complete work in software factories. This work proposes a hybrid system based on artificial neural networks and fuzzy systems to assist in constructing a rule-based expert system that supports the prediction of the hours required for software development according to the complexity of the elements it contains. The set of fuzzy rules obtained by the system supports the management and control of software development by providing a base of interpretable, rule-based estimates. The model was tested on a real database, and its results were promising as an aid mechanism for predicting software construction time.
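As a concrete illustration of what such an interpretable fuzzy rule base can look like, here is a minimal Sugeno-style sketch in Python. The membership breakpoints, the rule consequents (40/120/300 hours), and the 0-10 complexity scale are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a fuzzy rule base mapping element complexity (0-10)
# to estimated development hours. All numbers are invented for illustration.

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_hours(complexity):
    """Weighted-average (Sugeno-style) defuzzification over three rules."""
    rules = [
        (tri(complexity, 0, 0, 5), 40),    # IF complexity is LOW  THEN ~40 h
        (tri(complexity, 2, 5, 8), 120),   # IF complexity is MED  THEN ~120 h
        (tri(complexity, 5, 10, 10), 300), # IF complexity is HIGH THEN ~300 h
    ]
    num = sum(w * h for w, h in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(round(estimate_hours(5.0), 1))  # → 120.0
```

A real neuro-fuzzy system would learn the membership functions and consequents from historical project data rather than fixing them by hand.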
Transitioning IT Projects to Operations Effectively in Public Sector: A Case... (ijmpict)
Operational effectiveness is measured by application availability to end-users and the extent to which users can conveniently use the application to perform their business functions. This paper demonstrates how variations in the project transition process can affect operational effectiveness. This explanatory case study uses various projects in a South Australian government agency as candidates for evaluation. Across the applications in the production environment, end-users had varying levels of satisfaction. The research analyses factors influencing the operational efficiency of projects in transition from project delivery into operations. The evidence clearly demonstrates the criticality of the transition of applications from the project delivery phase to the operations phase. The research analyses findings specific to government agencies and presents recommendations, which can be useful to public sector agencies for improving the availability of IT applications in operations.
ITERATIVE AND INCREMENTAL DEVELOPMENT ANALYSIS STUDY OF VOCATIONAL CAREER INF... (ijseajournal)
The software development process offers various models, each with corresponding phases to be followed in delivering quality products and projects. Despite the expertise and skills of systems analysts, designers, and programmers, system failure is inevitable when a suitable development process model is not followed. This paper focuses on the Iterative and Incremental Development (IID) model and justifies its role in the analysis and design of software systems. The paper adopted a qualitative research approach that justified and harnessed the relevance of IID in the context of systems analysis and design, using the Vocational Career Information System (VCIS) as a case study. The paper views IID as a change-driven software development process model. The results show system specifications, functional specifications, and design specifications that can be used in implementing the VCIS with the IID model. The paper concludes that in systems analysis and design it is imperative to choose a suitable development process that reflects an engineering mind-set, with heavy emphasis on good analysis and design for quality assurance.
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU... (ijseajournal)
Rework is a known vicious circle in software development: it plays a central role in generating delays, extra costs, and diverse risks introduced after software delivery, and it ultimately has a negative impact on the quality of the software developed. To address the rework issue, this paper examines rework in software development as it occurs in practice by analysing a development process in an organisation in Mauritius where rework is a major issue. Strategies to reduce rework are then analysed and discussed. The paper ultimately recommends software configuration management as the best strategy for reducing the rework problem in software development.
IRJET: Factors in Selection of Construction Project Management Software i... (IRJET Journal)
The document discusses factors to consider when selecting construction project management software in India. The authors conducted interviews with 15 experts in the construction industry, with experience ranging from 5 to 30 years, to understand the software selection process. Based on the literature review and interviews, the document proposes an eight-step model for software selection: 1) identify software options, 2) review organization policies, 3) analyze the project's needs, 4) analyze the client's needs, 5) inquire into the purpose of planning, 6) analyze software performance and price, 7) check available skills, and 8) select and use the software. The model categorizes factors as either project-specific or general to guide effective software selection.
Contributors to Reduce Maintainability Cost at the Software Implementation Phase (Waqas Tariq)
This document discusses factors that can reduce software maintenance costs during the implementation phase, noting that maintenance accounts for the highest costs among the software development phases. The objective is to define criteria for assessing software quality characteristics and to assist during implementation. This helps reduce maintenance costs by creating criteria groups that support writing standard code, developing a model for applying the criteria, and increasing understandability. Student groups will study code standardization, write programs, and test software maintenance on those programs to validate the model and the proposed criteria.
Relational Analysis of Software Developer’s Quality Assures (IOSR Journals)
This document discusses relational analysis of software developer quality and measures. It begins by introducing the importance of software architecture and development models in ensuring project success. It then discusses measuring processes, products, and resources in software engineering. Internal attributes such as size and complexity can be measured from the product alone, while external attributes such as reliability require executing the code. The research aims to measure internal attributes of the process. It outlines different types of process and product metrics used to measure properties and quality. Finally, it discusses specific defect and lines-of-code metrics used during implementation to estimate defects and code size.
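The defect and lines-of-code metrics mentioned above are commonly combined into a defect-density figure; a minimal sketch follows, where the module names and counts are invented example data, not values from the paper.

```python
# Defect density combines the two implementation-phase measures mentioned
# above: defect counts and lines of code (LOC). Example data is invented.

def defect_density(defects, loc):
    """Defects per thousand lines of code (KLOC)."""
    return defects / (loc / 1000.0)

modules = {"parser": (4, 2500), "ui": (9, 6000)}  # name -> (defects, LOC)
for name, (d, loc) in modules.items():
    print(f"{name}: {defect_density(d, loc):.2f} defects/KLOC")
```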
The modern business environment requires organizations to be flexible and open to change if they are to gain and retain their competitive edge. A competitive business environment requires modernizing existing legacy systems into self-adaptive ones. Reengineering offers an approach to transform a legacy system into an evolvable system. Software reengineering is a leading system evolution technique which helps in effective cost control, quality improvement, and time and risk reduction. However, successful improvement of a legacy system through reengineering requires portfolio analysis of the legacy application around various quality and functional parameters, including the reliability and modularity of its functions, level of usability and maintainability, policies and standards of software architecture, and availability of required documents. Portfolio analysis around these parameters helps examine the legacy application for quality and functional gaps within the application [1].
An Elite Model for COTS Component Selection Process (IJEACS)
This document presents a multi-agent approach for selecting commercial off-the-shelf (COTS) software components. It proposes a semi-automated model called ABCS that uses multiple agents to identify suitable candidate components based on requirements. The agents each handle sub-tasks like matching requirements, evaluating security, cost-benefit analysis, and integration testing. They coordinate to produce a weighted list of candidates from which experts can select the most suitable component. The model aims to reduce the time and improve the knowledge involved in COTS component selection.
Insights on Research Techniques towards Cost Estimation in Software Design (IJECEIAES)
This document summarizes research on techniques for cost estimation in software design. It begins by describing common cost estimation techniques such as Constructive Cost Modeling (COCOMO) and Function Point Analysis. It then analyzes research trends in cost estimation, effort estimation, and fault prediction based on literature from 2010 to the present. Fewer than 50 papers were found on overall cost estimation, fewer than 25 on effort estimation, and only 9 on fault prediction. The document then reviews existing research addressing general cost estimation, enhancements of Function Point Analysis, statistical modeling approaches, cost estimation for embedded systems, and estimation for fourth-generation languages and NASA projects. Most techniques use COCOMO or extend existing models with techniques such as fuzzy logic, neural networks, or statistical methods.
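The basic COCOMO equations named above can be sketched as follows. The coefficients are the published basic-COCOMO "organic mode" constants; the 32-KLOC project size is an illustrative assumption.

```python
# Basic COCOMO effort/duration equations (organic-mode constants).
# The 32-KLOC project size is an invented example.

def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Return (effort in person-months, duration in months)."""
    effort = a * kloc ** b       # E = a * KLOC^b
    duration = c * effort ** d   # D = c * E^d
    return effort, duration

effort, duration = cocomo_basic(32)
print(f"effort ~ {effort:.1f} person-months, duration ~ {duration:.1f} months")
```

Semidetached and embedded modes use different (a, b, c, d) constants, and extensions like COCOMO II multiply in cost drivers; the fuzzy and neural approaches the review mentions typically replace or tune these constants from historical data.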
STATE-OF-THE-ART IN EMPIRICAL VALIDATION OF SOFTWARE METRICS FOR FAULT PRONEN... (IJCSES Journal)
With the sharp rise in software dependability requirements and failure costs, high quality is in great demand. However, guaranteeing high quality in software systems that have grown in size and complexity, under the constraints imposed on their development, has become an increasingly difficult, time- and resource-consuming activity. Consequently, it becomes imperative to deliver software that has no serious faults. Object-oriented (OO) products, the de facto standard of software development, can have faults that are hard to find or whose change impacts are hard to pinpoint. The earlier faults are identified and fixed, the lower the costs and the higher the quality. Software metrics are used to assess product quality, and many OO metrics have been proposed and developed. Furthermore, many empirical studies have validated the relationship between metrics and class fault proneness (FP). The challenge is which metrics are related to class FP and what activities are performed. This study therefore brings together the state-of-the-art in fault-proneness prediction using CK and size metrics. We conducted a systematic literature review of relevant published empirical validation articles, and the results are analysed and presented. They indicate that 29 relevant empirical studies exist and that measures such as complexity, coupling, and size are strongly related to FP.
ANALYSIS OF SOFTWARE QUALITY USING SOFTWARE METRICS (ijcsa)
Software metrics are directly linked to measurement in software engineering. Correct measurement is a precondition in any engineering field, and software engineering is no exception; as the size and complexity of software increase, manual inspection becomes a harder task. Most software engineers worry about the quality of software and how to measure and enhance it. The overall objective of this study was to assess and analyse software metrics used to measure the software product and process.
In this study, the researcher used a collection of literature from various electronic databases, available since 2008, to understand the software metrics in use. The study identifies software quality as a measure of how software is designed and how well the software conforms to that design. Some of the variables considered for software quality are correctness, product quality, scalability, completeness, and absence of bugs. However, quality standards differ from one organization to another, so it is better to apply software metrics, and the most common current software metrics tools, to measure software quality and reduce the subjectivity of fault assessment. The central contribution of this study is an overview of software metrics that illustrates development in this area, together with a critical analysis of the main metrics found in the literature.
Knowledge-based or rule-based expert systems are widely used in engineering applications and in problem-solving. Rapid development today has brought with it environmental problems that cause the loss or destruction of natural resources. Environmental impact assessment (EIA) has been acknowledged as a powerful planning and decision-making tool for assessing new development projects. It requires qualified personnel with special expertise and responsibility in their domain. Rule-based EIA systems incorporate expert knowledge and act as advice-giving systems. Such a system has advantages over human experts and can significantly reduce the complexity of a planning task like EIA.
Extending OpenUP for Autonomic Computing (6vpssantos)
This document proposes extending the OpenUP software process to better support the development of autonomous software systems with a focus on eliciting non-functional requirements (NFRs). It introduces two new artifacts: 1) an NFR Description to document identified NFRs and resolve any conflicts or ambiguities, and 2) Misuse Cases to help uncover additional hidden NFRs. A case study on a Brazilian Emergency System is presented to illustrate applying the extended OpenUP process with the new artifacts during requirements elicitation.
HYBRID PRACTICES IN GLOBAL SOFTWARE DEVELOPMENT: A SYSTEMATIC LITERATURE REVIEW (ijseajournal)
Although agile methods in their purest form fit several companies, performing them in environments with distributed teams developing large software applications has been a challenge. Contractual items, for projects developed for external organizations, introduce additional complexity for purely agile approaches. The majority of global teams and companies use hybrid development practices that combine different development methods and frameworks. This research provides results from an empirical field study on how hybrid practices are adopted in Global Software Development (GSD) projects. A systematic literature review was conducted to capture the status of combining agile with plan-driven approaches in GSD projects. The results were limited to peer-reviewed conference papers and journal articles published between 2001 and 2020. The study selected 37 papers from five bibliographic databases, and in the end 16 practices were summarized and described as hybrid in GSD projects. Based on these findings, the authors conclude that the contribution of this study is not limited to identifying how hybrid development practices are applied in GSD; it also gives practitioners a basis for adapting their own development methods.
This document summarizes a research paper that examines the use of data mining techniques to predict software aging-related bugs from imbalanced datasets. The paper compares the performance of general data mining techniques versus techniques developed for imbalanced datasets on a real-world dataset of aging bugs found in MySQL software. The results show that techniques designed for imbalanced datasets, such as SMOTEbagging and MSMOTEboosting, performed better than general techniques at correctly predicting the minority class of data points related to aging bugs. The paper concludes that imbalanced dataset techniques are more useful for predicting rare aging bugs from imbalanced software bug datasets.
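The core idea behind the SMOTE-style oversampling that underlies SMOTEbagging and MSMOTEboosting can be illustrated with a toy sketch: synthesize new minority-class points by interpolating between existing minority samples. Real SMOTE restricts interpolation to a sample's k nearest minority neighbours; the two-feature "aging bug" records below are invented.

```python
# Toy SMOTE-like oversampling: create synthetic minority-class samples by
# interpolating between pairs of real minority points. Example data invented.
import random

def smote_like(minority, n_new, seed=0):
    """Generate n_new synthetic samples on segments between minority pairs."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a, b = rng.sample(minority, 2)  # pick two real minority points
        t = rng.random()                # interpolation factor in [0, 1)
        synthetic.append(tuple(x + t * (y - x) for x, y in zip(a, b)))
    return synthetic

aging_bugs = [(0.9, 120.0), (0.8, 150.0), (0.95, 110.0)]  # minority class
print(smote_like(aging_bugs, 2))
```

The bagging/boosting variants the paper compares wrap this resampling step inside an ensemble, rebalancing each member's training set before fitting.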
EVALUATION OF SOFTWARE DEGRADATION AND FORECASTING FUTURE DEVELOPMENT NEEDS I... (ijseajournal)
This article is an extended version of a previously published conference paper. In this research, JHotDraw (JHD), a well-tested and widely used open-source Java-based graphics framework developed with best software engineering practice, was selected as the test suite. Six versions of this software were profiled and data collected dynamically, from which four metrics, namely (1) entropy, (2) software maturity index, and (3)-(4) the COCOMO effort and duration metrics, were used to analyze software degradation and maturity level; the results were then used as input to time series analysis to predict the effort and duration that may be needed to develop future versions. The novel idea is that historical evolution data is used to project and forecast resource requirements for future developments. The technique presented in this paper gives software development decision makers a viable tool for planning and decision making.
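The software maturity index mentioned above is commonly defined (per IEEE Std 982.1) as SMI = (M_T - (F_a + F_c + F_d)) / M_T, where M_T is the number of modules in the current version and F_a, F_c, F_d are the modules added, changed, and deleted. A minimal sketch, with invented version figures rather than JHotDraw data:

```python
# Software Maturity Index: the closer to 1, the more stable the release.
# Module counts below are invented example data.

def smi(total, added, changed, deleted):
    return (total - (added + changed + deleted)) / total

print(round(smi(total=320, added=12, changed=25, deleted=3), 3))  # → 0.875
```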
Issues of Embedded System Component Based Development in Mesh Networks (IRJET Journal)
This document discusses issues related to embedded system component-based development in mesh networks. It begins with an abstract discussing domain engineering and embedded component-based development. It then discusses the embedded component mining process, which involves exploration, excogitation, and exploitation phases. Next, it discusses embedded component-based development and benefits. It introduces the MoteView 3-tier architecture for embedded mesh networks. Finally, it discusses techniques for designing embedded component-based applications and subtle relationships between component-based development and object-oriented methodologies.
This paper presents findings from an evaluation study carried out to review the UAE national ID card software. The paper consults the relevant literature to explain many of the concepts and frameworks discussed herein. The findings of the evaluation work, primarily based on the ISO 9126 standard for system quality measurement, highlight many practical areas that, if taken into account, are argued to increase the chances of success of similar system implementation projects.
SECURING SOFTWARE DEVELOPMENT STAGES USING ASPECT-ORIENTATION CONCEPTS (ijseajournal)
The document summarizes research on securing software development stages using aspect-orientation concepts. It proposes a model called the Aspect-Oriented Software Security Development Life Cycle (AOSSDLC) which incorporates security activities into each stage of the software development life cycle. The model aims to efficiently integrate security as a cross-cutting concern using aspect orientation. It is concluded that aspect orientation allows security features to be installed without changing the existing software structure, providing benefits over other approaches.
This document discusses challenges faced during software component selection for component-based software development. It identifies key challenges as performance, time, component size, fault tolerance, reliability, functionality, compatibility, and available component subsets. Specifically, it states that performance, coupling, cohesion and interfaces impact each other, that using more commercial off-the-shelf components reduces time and cost, and that selecting components based on higher-level programming languages and fault tolerance can help address several of these challenges.
Positive Developments but Challenges Still Ahead: A Survey Study on UX Profe... (Journal Papers)
This survey study summarizes previous research on UX professionals' work practices and identifies key issues: (1) UX professionals' knowledge and practices, (2) organizational integration challenges, and (3) involvement in local communities. The study surveys 422 UX professionals in 5 countries about these issues. Results show that professionals have strong UX knowledge and use common methods/tools, but organizational integration challenges remain such as lack of resources and user involvement. Involvement in local communities is still limited despite their presence. Overall progress is seen, but more work is needed to address longstanding challenges.
An Approach of Improve Efficiencies through DevOps Adoption (IRJET Journal)
This document discusses adopting DevOps practices to improve organizational efficiencies. It begins with an abstract discussing how organizations waste resources and how DevOps aims to address this through lean principles and continuous feedback. It then discusses the history and concepts of DevOps, proposing a DevOps adoption model. It outlines factors that affect IT performance and cultural transformation. The document also describes the research design of a study conducted through interviews with DevOps professionals. It identifies four main challenges to DevOps adoption: lack of awareness, lack of support, implementing technologies, and adapting processes. The analysis focuses on the lack of awareness challenge, noting confusion around DevOps definitions and resistance to "buzzwords".
Developing Reusable Software Components for Distributed Embedded Systems (eSAT Publishing House)
This document discusses using Failure Mode and Effects Analysis (FMEA) to analyze causes of longer lead times in software processes at small and medium enterprises (SMEs). It first reviews the software development process for a web application project. It then describes the steps taken in an FMEA: 1) potential failure modes were identified, 2) impacts of each failure were assessed, 3) failures were ranked by severity, 4) likelihoods of occurrences for each failure were ranked, and 5) rankings were assigned to identify which failures were detected most frequently. This FMEA analysis identified specific failure modes contributing to longer lead times and their impacts, allowing SMEs to prioritize addressing high risk failures.
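The ranking arithmetic behind FMEA steps 3-5 reduces to the Risk Priority Number, RPN = severity x occurrence x detection (each commonly rated 1-10), with the highest-RPN failure modes addressed first. A short sketch, with invented failure modes and ratings rather than the paper's data:

```python
# FMEA Risk Priority Number ranking. Failure modes and ratings are invented.

failure_modes = [
    # (name, severity, occurrence, detection), each rated 1-10
    ("unclear requirements", 8, 6, 7),
    ("late code review", 5, 7, 3),
    ("environment downtime", 6, 2, 2),
]

# Sort by RPN = severity * occurrence * detection, highest risk first.
ranked = sorted(failure_modes, key=lambda f: f[1] * f[2] * f[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {s * o * d}")
```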
This document describes an online job recruitment system built using PHP. It allows job seekers to register, search for jobs, and manage their profiles. Employers can register, post jobs to the system, and manage job listings. The system has administrative, employer, and job seeker modules. It aims to make the job search and recruitment process easier and more accessible for all users. A feasibility study was conducted and the system was found to be technically, economically, and behaviorally feasible. The system will use PHP for the front end, MySQL for the database, and run on a Windows server environment.
A Guideline Tool for Ongoing Product Evaluation in Small and Medium-Sized Ent... (IJECEIAES)
As consumer demand for user friendly software increases, usability evaluation is crucial to develop software systems which are easy to learn and use. However, implementation of usability evaluation is challenging for small and medium-sized enterprises (SMEs) due to factors such as lack of technical expertise, knowledge and experience of methods and standards. This results in neglect, or poorly executed evaluations of projects, resulting in software that disappoints and frustrates clients. To overcome this loss of competitiveness, we propose here a visual incorporation tool derived from ISO standards that would assist software development teams in SMEs in understanding and implementing usability evaluations. It shows fundamental Usability Engineering (UE) and Software Engineering (SE) activities and artifacts relevant to the usability evaluation and software development process, with potential incorporation points being highlighted. Dependencies and relationships are shown by links between activities and artifacts. Additionally, convergent artifacts of both disciplines were identified and shown. Evaluation of the proposed tool was based on the questionnaire results of software development practitioners from SMEs.
A Review Paper: Optimal Test Cases for Regression Testing Using Artificial In... (IJECEIAES)
This document provides a review of optimal test cases for regression testing using artificial intelligence techniques. It discusses regression testing and techniques used for test case selection and prioritization, including retest all, regression test selection, and test case prioritization. Metrics for evaluating the efficiency of these techniques are described, including average percentage of faults detected, average percentage block coverage, and average percentage decision coverage. The document also reviews various artificial intelligence techniques that have been used for regression testing, such as neural networks, fuzzy logic, genetic algorithms, and machine learning. It provides examples of studies that have applied these techniques to select optimal test cases and improve the efficiency of regression testing.
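The average percentage of faults detected (APFD) metric mentioned above has a closed form: APFD = 1 - (TF1 + ... + TFm) / (n * m) + 1 / (2n), where n is the number of test cases, m the number of faults, and TFi the position of the first test in the prioritized order that reveals fault i. A minimal sketch, with an invented fault-detection matrix:

```python
# APFD for a prioritized test suite: higher is better (earlier fault detection).
# The detection positions below are invented example data.

def apfd(first_detect_positions, n_tests):
    m = len(first_detect_positions)
    return 1 - sum(first_detect_positions) / (n_tests * m) + 1 / (2 * n_tests)

# 5 tests; 4 faults first detected by the tests at positions 1, 1, 2, and 3
print(round(apfd([1, 1, 2, 3], n_tests=5), 3))  # → 0.75
```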
The document discusses the objectives, feasibility study, and implementation specifications for an Income Tax Department Management System project. The objectives are to overcome paper-based problems and easily manage records of PAN card holders and employees. A feasibility study assesses the technical, operational, and economic feasibility of the proposed system. The implementation will use ASP.NET on Windows with a SQL Server database. Hardware requirements include a Pentium PC with 512MB RAM and 80GB hard drive.
Monitoring and Visualisation Approach for Collaboration Production Line Envir...Waqas Tariq
This paper proposes SPMonitor, a tool for monitoring and visualizing the run-time execution of production processes. SPMonitor dynamically visualizes and monitors workflows running in a system. It displays versatile information about currently executed workflows, providing a better understanding of the processes and the general functionality of the domain. Moreover, SPMonitor enhances cooperation between different stakeholders by offering extensive communication and problem-solving features that allow the actors concerned to react more efficiently to anomalies that may occur during workflow execution. The ideas discussed are validated through the study of a real case related to Airbus assembly lines.
Insights on Research Techniques towards Cost Estimation in Software Design IJECEIAES
This document summarizes research on techniques for cost estimation in software design. It begins by describing common cost estimation techniques like Constructive Cost Modeling (COCOMO) and Function Point Analysis. It then analyzes research trends in cost estimation, effort estimation, and fault prediction based on literature from 2010 to the present. Fewer than 50 papers were found related to overall cost estimation, fewer than 25 for effort estimation, and only 9 for fault prediction. The document then reviews existing research addressing general cost estimation, enhancement of Function Point Analysis, statistical modeling approaches, cost estimation for embedded systems, and estimation for fourth-generation languages and NASA projects. Most techniques use COCOMO or extend existing models with techniques like fuzzy logic, neural networks, or statistical modeling.
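The basic COCOMO model referred to above estimates effort and duration from program size alone; a minimal sketch using Boehm's published coefficients (the 32 KLOC project size and mode choice are illustrative):

```python
# Basic COCOMO (Boehm, 1981): effort = a * KLOC^b person-months,
# duration = c * effort^d months; (a, b, c, d) depend on the development mode.
MODES = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def cocomo_basic(kloc, mode="organic"):
    a, b, c, d = MODES[mode]
    effort = a * kloc ** b        # person-months
    duration = c * effort ** d    # months
    return effort, duration

effort, duration = cocomo_basic(32, "organic")
print(f"effort {effort:.1f} PM, duration {duration:.1f} months")
```

The extensions surveyed (fuzzy logic, neural networks) typically replace or tune these fixed coefficients rather than change the overall form of the model.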
STATE-OF-THE-ART IN EMPIRICAL VALIDATION OF SOFTWARE METRICS FOR FAULT PRONEN...IJCSES Journal
With the sharp rise in software dependability requirements and failure costs, high quality has been in great demand. However, guaranteeing high quality in software systems, which have grown in size and complexity under the constraints imposed on their development, has become an increasingly difficult, time- and resource-consuming activity. Consequently, it becomes inevitable to deliver software that has no serious faults. In this context, object-oriented (OO) products, the de facto standard of software development, can have faults that are hard to find, or whose change impacts are hard to pinpoint. The earlier faults are identified and fixed, the lower the costs and the higher the quality. To assess product quality, software metrics are used, and many OO metrics have been proposed and developed. Furthermore, many empirical studies have validated the relationship between metrics and class fault proneness (FP). The challenge is knowing which metrics are related to class FP and what validation activities have been performed. Therefore, this study brings together the state of the art in FP prediction that utilizes CK and size metrics. We conducted a systematic literature review over relevant published empirical validation articles. The results obtained are analysed and presented: 29 relevant empirical studies exist, and measures such as complexity, coupling and size were found to be strongly related to FP.
ANALYSIS OF SOFTWARE QUALITY USING SOFTWARE METRICSijcsa
Software metrics have a direct link with measurement in software engineering. Correct measurement is a precondition in any engineering field, and software engineering is no exception; as the size and complexity of software increase, manual inspection becomes a harder task. Most software engineers worry about the quality of software and how to measure and enhance it. The overall objective of this study was to assess and analyze the software metrics used to measure software products and processes.
In this study, the researcher used a collection of literature from various electronic databases, available since 2008, to understand software metrics. The study identifies software quality as a means of measuring how software is designed and how well the software conforms to that design. Some of the variables sought in software quality are correctness, product quality, scalability, completeness and absence of bugs. However, the quality standard used by one organization differs from that of others; for this reason it is better to apply software metrics, and the current most common software metrics tools, to measure the quality of software and reduce subjectivity during its assessment. The central contribution of this study is an overview of software metrics that illustrates developments in this area, together with a critical analysis of the main metrics found in the literature.
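One of the most widely cited product metrics in such surveys is McCabe's cyclomatic complexity, which has a simple graph-theoretic form; a minimal sketch (the edge and node counts are illustrative):

```python
# McCabe cyclomatic complexity of a control-flow graph:
# V(G) = E - N + 2P (E edges, N nodes, P connected components)
def cyclomatic(edges, nodes, components=1):
    return edges - nodes + 2 * components

# illustrative graph: 9 edges, 7 nodes -> 4 linearly independent paths
print(cyclomatic(9, 7))  # -> 4
```

Metric tools compute E and N from parsed source code; higher V(G) values flag routines that are harder to test and inspect.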
Knowledge- or rule-based expert systems are widely used in engineering applications and in problem-solving. Rapid development today has brought with it environmental problems that cause the loss or destruction of natural resources. Environmental impact assessment (EIA) has been acknowledged as a powerful planning and decision-making tool to assess new development projects. It requires qualified personnel with special expertise and responsibility in their domain. Rule-based EIA systems incorporate experts' knowledge and act as an advice-giving system. Such a system has advantages over human experts and can significantly reduce the complexity of a planning task like EIA.
Extending open up for autonomic computing english_finalversionupload 6vpssantos
This document proposes extending the OpenUP software process to better support the development of autonomous software systems with a focus on eliciting non-functional requirements (NFRs). It introduces two new artifacts: 1) an NFR Description to document identified NFRs and resolve any conflicts or ambiguities, and 2) Misuse Cases to help uncover additional hidden NFRs. A case study on a Brazilian Emergency System is presented to illustrate applying the extended OpenUP process with the new artifacts during requirements elicitation.
HYBRID PRACTICES IN GLOBAL SOFTWARE DEVELOPMENT: A SYSTEMATIC LITERATURE REVIEWijseajournal
Although agile methods in their purest form fit several companies, it has been a challenge to apply them in environments with distributed teams developing large software applications. Contractual items, for projects under development for external organizations, introduce additional complexities for pure agile-based approaches. The majority of global teams and companies use hybrid development practices that combine different development methods and frameworks. This research provides results from an empirical field study on how hybrid practices are adopted in Global Software Development (GSD) projects. A systematic literature review was conducted to capture the status of combining agile with plan-driven methods in GSD projects. The results were limited to peer-reviewed conference papers or journal articles published between 2001 and 2020. The present study selected 37 papers from five different bibliographic databases. In the end, 16 practices were summarized and described as hybrid by GSD projects. Based on the findings, the authors conclude that the contribution of this study is not limited to identifying how hybrid development practices are applied in GSD, but also gives practitioners a basis for adapting their development methods.
This document summarizes a research paper that examines the use of data mining techniques to predict software aging-related bugs from imbalanced datasets. The paper compares the performance of general data mining techniques versus techniques developed for imbalanced datasets on a real-world dataset of aging bugs found in MySQL software. The results show that techniques designed for imbalanced datasets, such as SMOTEbagging and MSMOTEboosting, performed better than general techniques at correctly predicting the minority class of data points related to aging bugs. The paper concludes that imbalanced dataset techniques are more useful for predicting rare aging bugs from imbalanced software bug datasets.
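The SMOTE family of techniques named above works by synthesizing new minority-class samples through interpolation; a minimal sketch of that core idea (the minority points are toy data, and production implementations such as imbalanced-learn add proper k-NN handling and class-ratio control):

```python
import random

def smote(minority, k=2, n_new=4, seed=0):
    """Minimal SMOTE-style sketch: synthesize minority-class points by
    interpolating between a sample and one of its k nearest minority
    neighbours."""
    rng = random.Random(seed)

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not base),
                            key=lambda p: dist(base, p))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(b + gap * (n - b) for b, n in zip(base, nb)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.3)]
print(smote(minority))  # 4 new points inside the minority region
```

Because every synthetic point lies on a segment between two real minority samples, the oversampled class stays inside its original region instead of being duplicated verbatim, which is what helps the boosted/bagged variants in the study.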
EVALUATION OF SOFTWARE DEGRADATION AND FORECASTING FUTURE DEVELOPMENT NEEDS I...ijseajournal
This article is an extended version of a previously published conference paper. In this research, JHotDraw (JHD), a well-tested and widely used open-source Java-based graphics framework developed with best software engineering practice, was selected as a test suite. Six versions of this software were profiled and data collected dynamically, from which four metrics, namely (1) entropy, (2) software maturity index, (3) COCOMO effort and (4) COCOMO duration, were used to analyze software degradation and maturity level, and the obtained results were used as input to time series analysis in order to predict the effort and duration that may be needed for the development of future versions. The novel idea is that historical evolution data is used to project, predict and forecast resource requirements for future developments. The technique presented in this paper will empower software development decision makers with a viable tool for planning and decision making.
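Of the metrics listed above, the software maturity index has a simple closed form (as defined in IEEE Std 982.1); the module counts below are illustrative, not JHotDraw figures:

```python
# Software Maturity Index: SMI = (M_T - (F_a + F_c + F_d)) / M_T,
# where M_T is the number of modules in the current version and
# F_a, F_c, F_d are modules added, changed, and deleted since the last one.
def smi(total, added, changed, deleted):
    return (total - (added + changed + deleted)) / total

print(round(smi(320, 12, 25, 6), 3))  # closer to 1.0 means a more stable release
```

Tracking SMI across the six profiled versions yields exactly the kind of per-release series that can be fed into the time series analysis the article describes.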
Issues of Embedded System Component Based Development in Mesh NetworksIRJET Journal
This document discusses issues related to embedded system component-based development in mesh networks. It begins with an abstract discussing domain engineering and embedded component-based development. It then discusses the embedded component mining process, which involves exploration, excogitation, and exploitation phases. Next, it discusses embedded component-based development and benefits. It introduces the MoteView 3-tier architecture for embedded mesh networks. Finally, it discusses techniques for designing embedded component-based applications and subtle relationships between component-based development and object-oriented methodologies.
This paper presents findings from the evaluation study carried out to review the UAE national ID card software. The paper consults the relevant literature to explain many of the concepts and frameworks discussed herein. The findings of the evaluation work, which was primarily based on the ISO 9126 standard for system quality measurement, highlighted many practical areas that, if taken into account, are argued to increase the chances of success of similar system implementation projects.
SECURING SOFTWARE DEVELOPMENT STAGES USING ASPECT-ORIENTATION CONCEPTSijseajournal
The document summarizes research on securing software development stages using aspect-orientation concepts. It proposes a model called the Aspect-Oriented Software Security Development Life Cycle (AOSSDLC) which incorporates security activities into each stage of the software development life cycle. The model aims to efficiently integrate security as a cross-cutting concern using aspect orientation. It is concluded that aspect orientation allows security features to be installed without changing the existing software structure, providing benefits over other approaches.
This document discusses challenges faced during software component selection for component-based software development. It identifies key challenges as performance, time, component size, fault tolerance, reliability, functionality, compatibility, and available component subsets. Specifically, it states that performance, coupling, cohesion and interfaces impact each other, that using more commercial off-the-shelf components reduces time and cost, and that selecting components based on higher-level programming languages and fault tolerance can help address several of these challenges.
Positive developments but challenges still ahead a survey study on ux profe...Journal Papers
This survey study summarizes previous research on UX professionals' work practices and identifies key issues: (1) UX professionals' knowledge and practices, (2) organizational integration challenges, and (3) involvement in local communities. The study surveys 422 UX professionals in 5 countries about these issues. Results show that professionals have strong UX knowledge and use common methods/tools, but organizational integration challenges remain such as lack of resources and user involvement. Involvement in local communities is still limited despite their presence. Overall progress is seen, but more work is needed to address longstanding challenges.
An Approach of Improve Efficiencies through DevOps AdoptionIRJET Journal
This document discusses adopting DevOps practices to improve organizational efficiencies. It begins with an abstract discussing how organizations waste resources and how DevOps aims to address this through lean principles and continuous feedback. It then discusses the history and concepts of DevOps, proposing a DevOps adoption model. It outlines factors that affect IT performance and cultural transformation. The document also describes the research design of a study conducted through interviews with DevOps professionals. It identifies four main challenges to DevOps adoption: lack of awareness, lack of support, implementing technologies, and adapting processes. The analysis focuses on the lack of awareness challenge, noting confusion around DevOps definitions and resistance to "buzzwords".
Developing reusable software components for distributed embedded systemseSAT Publishing House
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars and students of related fields of Engineering and Technology.
This document discusses using Failure Mode and Effects Analysis (FMEA) to analyze causes of longer lead times in software processes at small and medium enterprises (SMEs). It first reviews the software development process for a web application project. It then describes the steps taken in an FMEA: 1) potential failure modes were identified, 2) impacts of each failure were assessed, 3) failures were ranked by severity, 4) likelihoods of occurrences for each failure were ranked, and 5) rankings were assigned to identify which failures were detected most frequently. This FMEA analysis identified specific failure modes contributing to longer lead times and their impacts, allowing SMEs to prioritize addressing high risk failures.
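The ranking step of an FMEA combines the three ratings into a Risk Priority Number, RPN = severity × occurrence × detection; a small sketch (the failure modes and 1-10 ratings are illustrative, not the study's data):

```python
# FMEA sketch: each failure mode gets severity, occurrence, and detection
# ratings (1-10); the highest RPN is addressed first.
failure_modes = [
    # (failure mode, severity, occurrence, detection) -- illustrative ratings
    ("unclear requirements",     8, 6, 7),
    ("late defect discovery",    7, 5, 8),
    ("environment setup delays", 4, 7, 3),
]

ranked = sorted(((name, s * o * d) for name, s, o, d in failure_modes),
                key=lambda x: x[1], reverse=True)
for name, rpn in ranked:
    print(f"{name}: RPN={rpn}")
```

Sorting by RPN is what lets an SME prioritize the few failure modes that contribute most to longer lead times.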
This document describes an online job recruitment system built using PHP. It allows job seekers to register, search for jobs, and manage their profiles. Employers can register, post jobs to the system, and manage job listings. The system has administrative, employer, and job seeker modules. It aims to make the job search and recruitment process easier and more accessible for all users. A feasibility study was conducted and the system was found to be technically, economically, and behaviorally feasible. The system will use PHP for the front end, MySQL for the database, and run on a Windows server environment.
A Guideline Tool for Ongoing Product Evaluation in Small and Medium-Sized Ent...IJECEIAES
The document describes a web application called Atharva Fest that was developed to simplify the process of managing intercollege events. Some key points:
- The app was built using ReactJS for the frontend and MongoDB and NodeJS for the backend. It allows admins to register colleges, create and manage events, view student registrations, and generate certificates for winners. Teachers can manage events assigned by admins.
- The goals of the app were to reduce errors, provide a user-friendly interface for planning events online, and allow students to view info and register for events through a browser on any device.
- The document reviews similar existing event management systems and their limitations before describing the proposed app's features like online registration
The document describes a new methodology for eliciting software requirements for smart handheld devices. The methodology is focused on users' work processes. It involves observing users' activities, identifying stakeholders, discussing requirements with stakeholders, finding inspiration from other software, and brainstorming user needs and goals. As an example, the methodology is applied to develop an iPad-based software for improving the learning performance of playgroup students.
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...IJNSA Journal
This study examines the traditional approach to software development within the United Kingdom Government and the accreditation process. Initially we look at the Waterfall methodology that has been used for several years. We discuss the pros and cons of Waterfall before moving onto the Agile Scrum methodology. Agile has been adopted by the majority of Government digital departments including the Government Digital Services. Agile, despite its ability to achieve high rates of productivity organized in short, flexible, iterations, has faced security professionals’ disbelief when working within the U.K. Government. One of the major issues is that we develop in Agile but the accreditation process is conducted using Waterfall resulting in delays to go live dates. Taking a brief look into the accreditation process that is used within Government for I.T. systems and applications, we focus on giving the accreditor the assurance they need when developing new applications and systems. A framework has been produced by utilising the Open Web Application Security Project’s (OWASP) Application Security Verification Standard (ASVS). This framework will allow security and Agile to work side by side and produce secure code.
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...IJNSA Journal
The document discusses software development methodologies used by the UK government, specifically comparing the traditional Waterfall methodology to the more modern Agile Scrum methodology. It notes that while Agile has been adopted for development, the accreditation process still follows Waterfall, creating delays. The document then proposes a security framework based on OWASP's Application Security Verification Standard that could allow secure development within Agile sprints and provide assurance for accreditors.
Using Fuzzy Clustering and Software Metrics to Predict Faults in large Indust...IOSR Journals
This document describes a study that uses fuzzy clustering and software metrics to predict faults in large industrial software systems. The study uses fuzzy c-means clustering to group software components into faulty and fault-free clusters based on various software metrics. The study applies this method to the open-source JEdit software project, calculating metrics for 274 classes and identifying faults using repository data. The results show 88.49% accuracy in predicting faulty classes, demonstrating that fuzzy clustering can be an effective technique for fault prediction in large software systems.
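The fuzzy c-means procedure used in the study can be sketched in plain Python; the two-dimensional "metric" vectors and cluster count below are illustrative toy data, not the JEdit measurements:

```python
import random

def fcm(points, c=2, m=2.0, iters=50, seed=1):
    """Minimal fuzzy c-means sketch: returns per-point membership degrees
    in each of c clusters (each row sums to 1) and the cluster centres."""
    rng = random.Random(seed)
    # random initial memberships, normalised per point
    u = [[rng.random() for _ in range(c)] for _ in points]
    u = [[v / sum(row) for v in row] for row in u]
    centres = []
    for _ in range(iters):
        # cluster centres: means weighted by u^m
        centres = []
        for i in range(c):
            w = [u[k][i] ** m for k in range(len(points))]
            centres.append(tuple(
                sum(wk * p[d] for wk, p in zip(w, points)) / sum(w)
                for d in range(len(points[0]))))
        # membership update from distances to the centres
        for k, p in enumerate(points):
            dists = [max(sum((a - b) ** 2 for a, b in zip(p, cen)) ** 0.5, 1e-12)
                     for cen in centres]
            for i in range(c):
                u[k][i] = 1.0 / sum((dists[i] / dj) ** (2 / (m - 1))
                                    for dj in dists)
    return u, centres

# toy metric vectors: two low-metric (fault-free-like) and two high-metric
# (fault-prone-like) components
metrics = [(1.0, 2.0), (1.2, 1.8), (8.0, 9.0), (8.3, 9.2)]
u, centres = fcm(metrics)
```

Unlike hard clustering, each component keeps a degree of membership in both the "faulty" and "fault-free" clusters, which is the property the study exploits for fault prediction.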
DESIGN PATTERNS IN THE WORKFLOW IMPLEMENTATION OF MARINE RESEARCH GENERAL INF...AM Publications
This paper proposes the use of design patterns in a marine research general information platform. The development of the platform involves the design of a complicated system architecture. The creation and execution of research workflow nodes and the design of a visualization library suited to marine users play an important role in the overall software architecture. This paper studies the requirements characteristic of marine research fields and implements a series of frameworks to solve these problems based on object-oriented and design pattern techniques. These frameworks clarify the relationships between modules and layers of the software, which communicate through unified abstract interfaces, reducing the coupling between modules and layers. Building these frameworks is significant in advancing the reusability of the software and strengthening the extensibility and maintainability of the system.
With the emergence of virtualization and cloud computing technologies, several services are housed on virtualization platforms. Virtualization is the technology that many cloud service providers rely on for efficient management and coordination of the resource pool. As essential services are also housed on cloud platforms, it is necessary to ensure continuous availability by implementing all necessary measures. Windows Active Directory is one such service, developed by Microsoft for Windows domain networks. It is included in Windows Server operating systems as a set of processes and services for the authentication and authorization of users and computers in a Windows domain network. The service is required to run continuously without downtime. As a result, there are chances of the accumulation of errors or garbage leading to software aging, which in turn may lead to system failure and its associated consequences. In this work, the software aging patterns of the Windows Active Directory service are studied. Software aging of Active Directory needs to be predicted properly so that rejuvenation can be triggered to ensure continuous service delivery. In order to predict the appropriate time, a model that uses a time series forecasting technique is built.
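The abstract does not specify which forecasting model is used; as a simplified illustration of the idea, a least-squares trend over resource-usage observations can estimate when a rejuvenation should be scheduled (all numbers and the memory budget are illustrative assumptions):

```python
def linear_trend(series):
    """Least-squares line y = a + b*t through equally spaced observations."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    b = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
         / sum((t - t_mean) ** 2 for t in range(n)))
    a = y_mean - b * t_mean
    return a, b

# hourly resident memory (MB) of a long-running service, drifting upward
usage = [510, 522, 529, 541, 554, 563, 571, 584]
a, b = linear_trend(usage)
limit = 1024                          # assumed memory budget (MB)
hours_left = (limit - usage[-1]) / b  # crude time-to-exhaustion estimate
print(f"slope {b:.1f} MB/h, rejuvenate within ~{hours_left:.0f} h")
```

A real aging model would use richer time series techniques and more indicators than memory alone, but the goal is the same: trigger rejuvenation before the predicted failure point.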
The adoption of cloud environments for various applications has led to security and privacy concerns over users' data. Protecting user data and privacy on such platforms is an area of concern.
Many cryptography strategies have been presented to provide secure sharing of resources on cloud platforms. These methods try to achieve a secure authentication strategy realizing features such as self-blindable access tickets, group signatures, anonymous access tickets, minimal disclosure of tickets and revocation, but each one varies in its realization of these features. Each feature requires a different cryptographic mechanism, which induces computational complexity and affects the deployment of these models in practical applications. Most of these techniques are designed for a particular application environment and adopt public-key cryptography, which incurs high cost due to computational complexity.
To address these issues, this work presents secure and efficient privacy preservation for mining data on a public cloud platform by adopting a party- and key-based authentication strategy. The proposed SCPPDM (Secure Cloud Privacy Preserving Data Mining) is deployed on the Microsoft Azure cloud platform. Experiments were conducted to evaluate computational complexity. The outcome shows the proposed model achieves significant performance in terms of computation overhead and cost.
A survey of predicting software reliability using machine learning methodsIAESIJAI
In light of technical and technological progress, software has become an urgent need in every aspect of human life, including the medical sector and industrial control. Therefore, it is imperative that software always works flawlessly. The information technology sector has witnessed rapid expansion in recent years, as software companies can no longer rely only on cost advantages to stay competitive in the market; programmers must provide reliable, high-quality software. In order to estimate and predict software reliability using machine learning and deep learning, this paper introduces a brief overview of the important scientific contributions on the subject of software reliability, along with the researchers' findings of highly efficient methods and techniques for predicting it.
The article proposes a new model for optimizing software effort and cost estimation based on code reusability. The model compares new projects to previously completed, similar projects stored in a code repository. By searching for and retrieving reusable code, functions, and methods from old projects, the model aims to reduce effort and cost estimates for new software development. The model is described as being based on the concept of estimation by analogy and using innovative search and retrieval techniques to achieve code reuse and thus decreased cost and effort estimates.
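Estimation by analogy, the concept the model builds on, can be sketched as a nearest-neighbour search over past projects; the repository entries, feature set, and efforts below are illustrative assumptions, not the article's data:

```python
def estimate_by_analogy(new_project, history, k=2):
    """Effort estimate for a new project: the mean effort of its k most
    similar completed projects (Euclidean distance over feature vectors)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    nearest = sorted(history, key=lambda p: dist(new_project, p["features"]))[:k]
    return sum(p["effort"] for p in nearest) / k

# illustrative repository: (KLOC, team size, reused-code fraction) -> effort (PM)
history = [
    {"features": (10, 4, 0.1), "effort": 120},
    {"features": (12, 5, 0.4), "effort": 95},
    {"features": (40, 9, 0.0), "effort": 520},
]
print(estimate_by_analogy((11, 4, 0.3), history))  # averages the two similar projects
```

The article's contribution is to extend this retrieval step to reusable code, functions, and methods, so that a high reuse fraction further lowers the effort and cost estimate.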
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...ijseajournal
In the software development phase, software artifacts are not in consistent states: some class artifacts are fully developed, some are half developed, some are largely developed, some are only minimally developed, and some are not developed yet. At this stage, allowing too many software requirement changes may delay project delivery and increase the development budget, while rejecting too many changes may increase customer dissatisfaction. Software change effort estimation is one of the most challenging and important activities that help software project managers accept or reject changes during the development phase. This paper extends our previous work on developing a software requirement change effort estimation model prototype tool for the software development phase. The significant achievements of the tool are demonstrated through extensive experimental validation using several case studies. The experimental analysis shows improvement in estimation accuracy over current change effort estimation models.
IRJET-Towards a Methodology for the Development of Plug-InIRJET Journal
This document proposes a methodology for developing plug-ins for Microsoft Word using the software development life cycle (SDLC) approach. It outlines the key phases of plug-in development: 1) requirements and planning to determine goals and tasks, 2) design and architecture to layout the system, and 3) construction through coding and integrating modules. It also discusses testing integrated modules and software functionality, and provides an example of a content improvement plug-in that checks spelling, grammar, and tone in documents. The methodology aims to make plug-in development easier through a structured process.
Review on Algorithmic and Non Algorithmic Software Cost Estimation Techniquesijtsrd
Effective software cost estimation is among the most challenging and important activities in software development. Developers want a simple and accurate method of effort estimation. Estimating cost before starting work is a prediction, and predictions are not always accurate. Software effort estimation is a very critical task in software engineering, and a suitable estimation technique is crucial to control quality and efficiency. This paper gives a review of the various available software effort estimation methods, focusing mainly on algorithmic and non-algorithmic models. These existing methods for software cost estimation are illustrated and their aspects discussed. No single technique is best for all situations, and thus a careful comparison of the results of several approaches is most likely to produce realistic estimates. This paper provides a detailed overview of existing software cost estimation models and techniques, presents the strengths and weaknesses of various cost estimation methods, and focuses on some of the relevant reasons that cause inaccurate estimation. Pa Pa Win | War War Myint | Hlaing Phyu Phyu Mon | Seint Wint Thu, "Review on Algorithmic and Non-Algorithmic Software Cost Estimation Techniques", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3, Issue-5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd26511.pdf Paper URL: https://www.ijtsrd.com/engineering/-/26511/review-on-algorithmic-and-non-algorithmic-software-cost-estimation-techniques/pa-pa-win
COMPARATIVE STUDY OF SOFTWARE ESTIMATION TECHNIQUES ijseajournal
Many information technology firms among other organizations have been working on how to perform estimation of the sources such as fund and other resources during software development processes. Software development life cycles require lot of activities and skills to avoid risks and the best software estimation technique is supposed to be employed. Therefore, in this research, a comparative study was conducted, that consider the accuracy, usage, and suitability of existing methods. It will be suitable for the project managers and project consultants during the whole software project development process. In this project technique such as linear regression; both algorithmic and non-algorithmic are applied. Model, composite and regression techniques are used to derive COCOMO, COCOMO II, SLIM and linear multiple respectively. Moreover, expertise-based and linear-based rules are applied in non-algorithm methods. However, the technique needs some advancement to reduce the errors that are experienced during the software development process. Therefore, this paper in relation to software estimation techniques has proposed a model that can be helpful to the information technology firms, researchers and other firms that use information technology in the processes such as budgeting and decision-making processes.
AN ITERATIVE HYBRID AGILE METHODOLOGY FOR DEVELOPING ARCHIVING SYSTEMSijseajournal
With the massive growth of the organizations files, the needs for archiving system become a must. A lot of time is consumed in collecting requirements from the organization to build an archiving system. Sometimes the system does not meet the organization needs. This paper proposes a domain-based requirement engineering system that efficiently and effectively develops different archiving systems based on new
suggested technique that merges the two best used agile methodologies: extreme programming (XP) and SCRUM. The technique is tested on a real case study. The results shows that the time and effort consumed during analyzing and designing the archiving systems decreased significantly. The proposed methodology also reduces the system errors that may happen at the early stages of the development of the system.
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTION AND SOFTWARE DEVELOPMENT

International Journal of Artificial Intelligence and Applications (IJAIA), Vol.9, No.6, November 2018
DOI: 10.5121/ijaia.2018.9602

Paulo Vitor de Campos Souza, Augusto Junio Guimaraes, Vanessa Souza Araujo, Thiago Silva Rezende, Vinicius Jonathan Silva Araujo

CEFET-MG, Avenue Amazonas 5253, Belo Horizonte, Brazil
Faculty Una Betim, Avenue Governador Valadares, 640 - Centro, Betim - MG, 32510-010
ABSTRACT
Predicting the time to build software is a very complex task for software engineering managers. Many factors can directly interfere with the productivity of the development team, and factors related to the complexity of the system to be developed drastically change the time software factories need to complete the work. This work proposes a hybrid system based on artificial neural networks and fuzzy systems to assist in the construction of a rule-based expert system that supports the prediction of the hours needed to develop a piece of software according to the complexity of its elements. The set of fuzzy rules obtained by the system helps the management and control of software development by providing a base of interpretable estimates. The model was tested on a real database, and its results were promising for the construction of a mechanism to aid the predictability of software construction.
KEYWORDS
Fuzzy neural networks, effort forecasting, use case point, expert systems
1. INTRODUCTION
Software development tracks the growth of technology worldwide. The continuing need for systems that facilitate human life grows as the world population gains more access to electronic resources. The construction of systems is an area that has been increasing its participation in the market, mainly as companies and governments become aware that operational efficiency must be present in the several areas in which they act [1]. However, managing the construction of software is not a simple task. Factors such as team expertise, the complexity of the use cases involved, or even the lack of standardization of work among developers can disrupt the efficient construction of software. Numerous factors undermine the time estimates made by software managers, hampering financial and productivity estimates [2]. Studies have been carried out to collect data on the multiple criteria present in software development. These studies have begun evaluations to understand the nature of this problem and to build
mechanisms that allow managers to reliably set the deadline for delivering a software product to their client [3]. This paper proposes to use a real database collected from Turkish software development companies [4] in a hybrid model of fuzzy neural networks, to construct a system of rules that assists in building expert systems for solving problems of software development time estimation [5]. The fuzzy rules are constructed through the first layer of the hybrid model, which uses concepts from the ANFIS algorithm [6] to create equally spaced membership functions for each input variable. This approach creates Takagi-Sugeno rules of the IF/THEN type [7], allowing the database to be interpreted more straightforwardly by the managers involved in the process. The second layer uses fuzzy logic neurons, which aggregate the neurons of the first layer. Because this is a problem where the number of neurons generated can be high due to the number of characteristics evaluated, a bootstrap-based smoothing technique is applied to define the essential neurons of the problem. The final weights of the second layer are used by a neural aggregation network that has a single artificial neuron [5]. The synaptic weights are generated using the concept of the extreme learning machine [8], so there is no need to continually update the internal parameters of the fuzzy neural network.
The paper is organized as follows: Section II presents the central concepts that guide the research, such as the definitions of fuzzy neural networks and concepts related to the complexity of software construction. Section III focuses on the database, how it has already been used in the literature, and the contribution of this work; it also presents the concepts and the way the fuzzy neural network model acts in solving the problem. Section IV presents the methodologies, parameters, and configurations of the tests and their respective results. Finally, Section V presents the conclusions of the work.
2. LITERATURE REVIEW
2.1. Software development
Software development has significantly evolved through the introduction of new technologies. The use of mobile phones, direct access to information, and agility in business require that this production be increasingly dynamic and cohesive to meet the emerging needs of its target audience. Nowadays, software production is an area of study for many professors and researchers in software engineering, and new techniques for the inspection, evaluation, and improvement of activities in the construction of applications are a current concern in software factories [1]. However, there is a specific barrier to efficiency in software production and customer satisfaction. Surveys carried out by [9] and [10] show that the majority of customers are dissatisfied with the services provided by software factories; the complaints that stand out most concern lack of organization, products out of compliance and, above all, delays in deliveries. Software development suffers from hard-to-measure external variables such as the lack of technological resources to develop a specific task requested by the contractor, lack of skilled labor, lack of tools in a given language, and the high volatility of customer requests during the project execution phase, among others. The complexity of the system itself is a factor that cannot be disregarded. The time to produce a use case varies according to several external agents, but its complexity interferes directly in this type of evaluation [11]. Besides the use cases, the complexity of the actors involved in the problem, the connections between internal and external systems, and the use of techniques such as refactoring and reuse stand out. Several techniques were developed to favor agile development over traditional development. These alternatives can improve delivery time, but its forecasting still remains a
complex factor, depending solely on the experience of the project manager, who must know very well the potential of his team and the level of complexity of the system for which he is responsible [2].
2.2. Use Case Points
Several techniques have been proposed to facilitate the management of software construction, such as SLOC (Source Lines of Code) [11] and FP (Function Points) [12]. These methods evaluate a system according to features present in its lines of code or function points, establishing a proportional relation between them. In this paper, we highlight the approach proposed by [13], which measures the complexity of the software by linking it to the characteristics present in the use cases of the system. Some factors are relevant to this approach. At first, the actors of the system are assigned graded labels, such as simple, medium and complex. This assessment depends heavily on the synergy between the business and systems analysis teams, so as not to attribute undue influence to the actors who will be using the system. The next step is to set the Unadjusted Actor Weight (UAW), which is calculated by summing the weights of all actors [13]. The same procedure is performed with the use cases, characterizing them as simple, medium or complex depending on the number of steps executed in their main and alternate flows. In short, the fewer steps performed to complete an activity in a use case, the less complex it is for the system and consequently to develop. In the same way as for the actors, the Unadjusted Use-Case Weight (UUCW) is calculated as the sum of the weights of all use cases. Then the UAW is added to the UUCW to produce the Unadjusted Use-Case Points (UUCP) [13]. Finally, the points are adjusted according to the Technical Complexity Factors (TCF), such as the number of people involved, client engagement, and the level of knowledge of the team, among others, and the Environmental Factors (EF), which are linked to external threats such as crises and socio-economic or political events. The environmental factors and their weights, as well as the technical complexity factors, were imported from Function Point theory. Thus, UCP = UUCP * TCF * EF [13]. Figure 1 shows an applied example of a UCP evaluation. Note that the weights and values vary according to the nature of the problem being solved and the experts involved in the analysis. This means that a UCP analysis that may seem complicated for a team inexperienced in banking applications will be simpler for a team that has already developed systems for another financial institution. This allows the weights of some analysis parameters to differ between the UCP analyses of two companies that work with software development.
2.3. Software effort estimation
There are several comparative studies of techniques for defining or estimating effort in software construction, as in [15]. Empirical studies using artificial neural networks were also applied in the prediction of UCP [16]. It should be noted that studies related to software testing effort were also conducted using UCP [17].
2.4. Intelligence models used in predicting use-case points
Intelligent models use the concepts of artificial intelligence to solve problems of various natures.
These problems involve situations that businesses and human beings constantly experience.

Figure 1: Example UCP [14]

They can act in pattern classification, identifying images on the web
[18], license plates [19], saliency map estimation [20], image recognition [21, 22, 23], image recognition in the medical domain [24], text recognition [25], and fault identification [26], among others. In data mining for development forecasting, recent work was carried out by [27, 28, 29, 30], who use intelligent techniques to help managers and developers understand the patterns involved in software development. Fuzzy rule-based systems were used in [31] to estimate the effort in software production using the UCP concepts. In the work of [32], the concepts of linear regression and the perceptron were combined to solve effort estimation problems. The works of [33] had the same focus, with an artificial neural network based on cascade concepts. Finally, the model proposed by [34] found the software effort with a hybrid two-layer model, where the first layer was composed of fuzzy neurons and the second layer of an artificial neuron. It should be noted that the model addressed in [31] also generated fuzzy rules, but since its training approach was used with few dimensions, it differs from the proposal of this work, which intends to use all dimensions of the problem. In the following sections, the intelligent model architectures that were used to support this type of effort prediction are presented, with emphasis on the models proposed in [31] and [32].
Figure 2: Effort model architecture [35]
2.5. Fuzzy Neural Network
Fuzzy neural networks are neural networks formed of fuzzy neurons [36]. Thus, a fuzzy neural network can be defined as a fuzzy system that is trained by an algorithm provided by a neural network. The use of hybrid systems allows the joint use of the advantageous properties that each of the approaches brings to solving the problem. Fuzzy systems are known for their ability to transform data into information that is more coherent for human understanding, especially if the model's responses will serve to construct expert systems. Artificial neural networks, in turn, can adapt to various types of training and learning techniques and perform activities commonly practiced by humans in a coherent way. Given this analogy, the union of the neural network with fuzzy logic is intended to soften the deficiencies of each of these systems, producing a system that is more efficient, robust and easy to understand [37].
2.6. ANFIS
The Fuzzy Inference System (FIS) is a computational model based on the concepts of fuzzy set theory, capable of generating fuzzy rules in the IF/THEN style and of fuzzy reasoning. Its structure has three fundamental parts: a rule base, a database, and a reasoning mechanism [6]. This type of model can perform a non-linear mapping from its input space to the output space. This mapping is accomplished by many IF/THEN fuzzy rules, where each one describes the local behavior of the mapping. In these rules, the antecedent implements a multidimensional partition of the input space according to the type of algorithm chosen (grid, decision tree or clustering).
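The equally spaced grid partition described above can be sketched in a few lines. This is a minimal illustration assuming Gaussian membership functions on a normalized [0, 1] input range; the function names and the sigma choice (half the spacing between centers) are illustrative assumptions, not fixed by the paper:

```python
import numpy as np

def equally_spaced_gaussians(n_mf, lo=0.0, hi=1.0):
    """Centers and a shared sigma for n_mf equally spaced Gaussian
    membership functions on [lo, hi]. The sigma choice is illustrative."""
    centers = np.linspace(lo, hi, n_mf)
    sigma = (hi - lo) / (2.0 * max(n_mf - 1, 1))
    return centers, sigma

def gaussian_mf(x, c, sigma):
    """Membership degree of x in the Gaussian fuzzy set with center c."""
    return float(np.exp(-0.5 * ((x - c) / sigma) ** 2))

centers, sigma = equally_spaced_gaussians(2)    # M = 2, as in the paper's tests
degrees = [gaussian_mf(0.25, c, sigma) for c in centers]
```

Each input value is thus turned into a vector of membership degrees, one per fuzzy set, which is exactly what the first layer of the network described below outputs.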
3. FUZZY NEURAL NETWORK SOFTWARE EFFORT PREDICTION USING USE CASE POINT CONCEPTS
3.1. Network architecture
The fuzzy neural network described in this section is composed of three layers and derives from the work of [38]. In the first layer, fuzzification is performed through the concepts of the ANFIS model. The membership functions adopted in the first layer are equally spaced and can be of the Gaussian or triangular type. The second layer uses logical neurons of the and-neuron type, different from the or-neuron adopted in [5]. These neurons have weights and activation
functions determined at random, and use t-norms and s-norms to aggregate the neurons of the first layer. To define the weights that connect the second layer to the output layer, the concept of an extreme learning machine [8] is used to act on an output neuron with a suitable activation function. The and-neuron is used to construct the second layer of fuzzy neural networks for solving pattern recognition problems and to bring interpretability to the model. Figure 3 illustrates the feedforward topology of the fuzzy neural networks considered in this paper. The first layer is composed of neurons whose activation functions are membership functions of fuzzy sets defined for the input variables using ANFIS. For each input variable xij, L fuzzy sets Ajl, l = 1, ..., L are defined, whose membership functions are the activation functions of the corresponding neurons. Thus, the outputs of the first layer are the membership degrees associated with the input values, i.e., ajl = µAjl(xij) for j = 1, ..., N and l = 1, ..., L, where N is the number of inputs and L is the number of fuzzy sets for each input produced by ANFIS [5]. The second layer is composed of L fuzzy and-neurons. Each neuron performs a weighted aggregation of some of the first-layer outputs. This aggregation is performed using the weights wjl (for j = 1, ..., N and l = 1, ..., L). For each input variable j, only one first-layer output ajl is defined as input of the l-th neuron; so that w is sparse, each neuron of the second layer is associated with an input variable. Finally, the output layer is composed of one neuron whose activation function is the leaky ReLU [39]. The output of the model is:
y = fLeakyReLU( Σ_{j=0}^{l} vj zj )   (1)
where z0 = 1, v0 is the bias, and zj and vj, j = 1, ..., l are the output of each fuzzy neuron of the
second layer and their corresponding weight, respectively. Figure 3 presents an example of the FNN architecture proposed in this paper.
This is an improved version of the ReLU function [40], because a small linear component is inserted for negative inputs of the neuron. This change allows small variations to be noticed, so that neurons that would be relevant to the model are not discarded. The function is expressed by [39]:
fLeakyReLU (x, α) = max (αx, x) (2)
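Equation (2) translates directly into code. The value α = 0.01 is a common default chosen here for illustration; the paper does not state the α it uses:

```python
def leaky_relu(x, alpha=0.01):
    """f(x, alpha) = max(alpha * x, x), equation (2); valid for 0 < alpha < 1,
    so negative inputs keep a small linear slope instead of being zeroed."""
    return max(alpha * x, x)
```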
The logical neurons used in the second layer of the model are of the and-neuron type, where the input signals are individually combined with the weights and then a global aggregation is performed. The and-neuron used in this work can be expressed as [41]:

z = T_{i=1}^{N} (ai s wi)   (3)

where T is a t-norm and s is an s-norm. Fuzzy rules can be extracted from and-neurons according to
the following example:
Rule 1: If xi1 is A1^1 with certainty w11 and xi2 is A2^1 with certainty w21, then y1 is v1.
Rule 2: If xi1 is A1^2 with certainty w12 and xi2 is A2^2 with certainty w22, then y2 is v2.
Rule 3: If xi1 is A1^3 with certainty w13, then y3 is v3.
Rule 4: If xi2 is A2^3 with certainty w23, then y4 is v4.   (4)
These rules allow the creation of a base for building expert systems [37].
Figure 3: FNN architecture
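The and-neuron aggregation of equation (3) can be sketched as follows. The paper does not fix a particular t-norm/s-norm pair, so the product (t-norm) and probabilistic sum (s-norm) used here are illustrative choices:

```python
def s_norm(a, b):
    """Probabilistic sum, an illustrative s-norm choice."""
    return a + b - a * b

def t_norm(a, b):
    """Product, an illustrative t-norm choice."""
    return a * b

def and_neuron(inputs, weights):
    """Equation (3) sketch: each membership degree a_i is first combined
    with its weight w_i by the s-norm, and the pairs are then aggregated
    by the t-norm into a single activation in [0, 1]."""
    out = 1.0
    for a, w in zip(inputs, weights):
        out = t_norm(out, s_norm(a, w))
    return out
```

Note the characteristic behavior: a weight of 0 leaves an input unchanged (s_norm(a, 0) = a), while a weight of 1 forces that term to 1, effectively removing the input's influence from the conjunction.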
3.2 Training Fuzzy Neural Network
The membership functions in the first layer of the FNN are Gaussian or triangular. The number of neurons created by the input data partition grows exponentially with the number of membership functions and the number of features present in the problem database. The number of neurons L in the first layer is defined according to the input data and the number of membership functions (M), defined parametrically. The second layer performs the aggregation of the L first-layer neurons through the and-neurons.
After the construction of the L and-neurons, the bolasso algorithm [42] is executed with LARS (a regression algorithm for high-dimensional data that is proficient in measuring the regression coefficients exactly, while also selecting a subset of candidate regressors to be incorporated in the final model) to select the most significant neurons (called Ls). The final network architecture is thus defined through a feature-selection technique based on l1 regularization and resampling. The learning algorithm assumes that the output of the hidden layer composed of the candidate neurons can be written as [43]:

y(xi) = z(xi) v   (5)
where v = [v0, v1, v2, ..., vL] is the weight vector of the output layer and z(xi) = [z0, z1(xi), z2(xi), ..., zL(xi)] is the output vector of the second layer, with z0 = 1. In this context, z(xi) is considered the non-linear mapping of the input space into a space of fuzzy characteristics of dimension L [43].
The LARS algorithm can be used to perform model selection, since for a given value of λ only a fraction (or none) of the regressors have corresponding nonzero weights. If λ = 0, the problem becomes unrestricted regression, and all weights are nonzero. As λ increases from 0 to a given value λmax, the number of nonzero weights decreases to zero. For the problem considered in this paper, the regressors zl are the outputs of the significant neurons [42].
Bolasso procedure is summarized in Algorithm 1.

Algorithm 1: Bolasso - bootstrap-enhanced least absolute shrinkage operator
(b1) Let n be the number of examples (lines) in X.
(b2) Sample n examples of (X, Y), uniformly and with replacement, called here (Xsamp, ysamp).
(b3) Determine which weights are nonzero given a λ value.
(b4) Repeat steps b2-b3 for a specified number of bootstraps b.
(b5) Take the intersection of the nonzero-weight indexes of all bootstrap replications and select the resulting variables.
(b6) Refit using the selected variables via non-regularized least squares regression (if requested).
(b7) Repeat the procedure for each value of b bootstraps and λ (done more efficiently by collecting interim results).
(b8) Determine "optimal" values for λ and b.
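Steps (b1)-(b5) of Algorithm 1 can be sketched in Python. To keep the example self-contained, a tiny coordinate-descent Lasso stands in for LARS (an assumption of this sketch, not the paper's implementation); the selected support is the intersection of the nonzero weights across bootstrap resamples:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Tiny coordinate-descent Lasso, standing in for LARS to keep the
    sketch dependency-free. Assumes columns of X have comparable scale."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]      # residual excluding feature j
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft threshold
    return w

def bolasso_support(X, y, lam=0.2, n_bootstraps=16, seed=0):
    """Steps (b1)-(b5): intersect the nonzero-weight indexes obtained from
    Lasso fits on bootstrap resamples drawn with replacement."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    support = None
    for _ in range(n_bootstraps):
        idx = rng.integers(0, n, size=n)        # (b2) sample with replacement
        w = lasso_cd(X[idx], y[idx], lam)       # (b3) nonzero weights at lam
        nonzero = set(np.flatnonzero(np.abs(w) > 1e-8))
        support = nonzero if support is None else support & nonzero  # (b5)
    return sorted(support)

# toy data: only the first of four candidate regressors drives y
rng = np.random.default_rng(1)
X = rng.standard_normal((120, 4))
y = 2.0 * X[:, 0]
selected = bolasso_support(X, y, lam=0.2, n_bootstraps=8)
```

In the paper's setting, the columns of X would be the and-neuron outputs zl, so the intersection step is what prunes the exponentially many candidate neurons down to the significant Ls.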
Subsequently, following the determination of the network topology, the weight vector of the output layer is estimated. In this paper, this vector is computed with the Moore-Penrose pseudoinverse [43]:

v = Z+ y   (6)

where Z+ is the Moore-Penrose pseudoinverse of Z, which gives the minimum-norm least-squares solution for the output weights. The training is synthesized in Algorithm 2. It has three parameters:
1- the grid size, ρ;
2- the number of bootstrap replications, bt;
3- the consensus threshold, λ.
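Equation (6) is the usual extreme-learning-machine step and can be sketched with NumPy; the toy Z (a bias column z0 = 1 plus one regressor) and y are illustrative:

```python
import numpy as np

def output_weights(Z, y):
    """v = Z+ y: minimum-norm least-squares estimate of the output-layer
    weights via the Moore-Penrose pseudoinverse, as in equation (6)."""
    return np.linalg.pinv(Z) @ y

# toy check: y is generated by the linear map v = [3, 2], which the
# pseudoinverse solution should recover exactly
Z = np.column_stack([np.ones(5), np.arange(5.0)])
y = 3.0 + 2.0 * np.arange(5.0)
v = output_weights(Z, y)
```

Because this is a closed-form solve, no iterative update of the network's internal parameters is needed, which is the point of the ELM-style training mentioned in the introduction.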
4. REGRESSION MODELS OF USE CASE POINT PROBLEMS
4.1 Assumptions and Initial Test Configurations
The tests performed in this paper seek to find a predictor model for the definition of the effort in hours for the construction of software using the fuzzy neural network.

Algorithm 2: Forecasting Effort - FNN training
1) Define the number of membership functions, M.
2) Define the number of bootstrap replications, bt.
3) Define the consensus threshold, λ.
4) Calculate the L neurons of the first layer using ANFIS.
5) Construct L fuzzy neurons with Gaussian or triangular membership functions whose center and sigma values derive from ANFIS.
6) Define the weights and biases of the fuzzy neurons randomly.
7) Construct L and-neurons with random weights and biases on the second layer of the network, aggregating the L fuzzy neurons of the first layer.
8) For all K entries, calculate the mapping zk(xk) using the and-neurons.
9) Select the significant Ls using the lasso bootstrap according to the settings of bt and λ.
10) Estimate the weights of the output layer (6).
11) Calculate the output y using the leaky ReLU (2).

The accuracy of the model in training and testing is assessed by comparing the values obtained by the model with the expected results. In this context, they are evaluated through the mean squared error (MSQE), defined as:

MSQE = (1/K) Σ_{k=1}^{K} (yk - ŷk)^2

where yk is the expected output and ŷk the model prediction for sample k.
To perform the training, 30 repetitions were performed with the samples made available by the software construction effort study. 70% of the samples were allocated for training and the remaining 30% for the test phase of the model. To avoid trends tied to the characteristics of each example, all the samples destined for the training and testing of the fuzzy neural network were randomly sampled, ensuring that the model results do not depend on the order of the data. All samples involved in the test were normalized to mean zero and variance 1. The activation function of the third-layer neuron is of the leaky ReLU type. The number of bootstrap replications and the decision consensus for the use of bolasso are, respectively, 16 and 0.7. These values were found to be optimal using a 5-fold cross-validation technique in preliminary tests. For fuzzy neural networks using equally spaced Gaussian or triangular functions, the number of membership functions was defined as 2 by the same k-fold process used to find the configuration parameters of the bolasso method.
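The evaluation protocol above can be sketched as follows. The function names are illustrative, and the normalization statistics are taken from the training split, a common convention the paper does not spell out:

```python
import numpy as np

def split_and_normalize(X, y, train_frac=0.7, seed=0):
    """Shuffle the samples, hold out 70% for training and 30% for testing,
    and z-score the features (zero mean, unit variance) using statistics
    from the training split only."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))            # random sampling avoids order bias
    n_train = int(train_frac * len(X))
    tr, te = order[:n_train], order[n_train:]
    mu, sd = X[tr].mean(axis=0), X[tr].std(axis=0)
    return (X[tr] - mu) / sd, y[tr], (X[te] - mu) / sd, y[te]

def msqe(y_true, y_pred):
    """Mean squared error used to score the predictions."""
    return float(np.mean((y_true - y_pred) ** 2))
```

Repeating this split with 30 different seeds and averaging the MSQE reproduces the repetition scheme described in the text.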
4.2. Database used in the tests.
The dataset was collected by [4] from three software companies that provided the data (D1, D2, and D3) and covers the following problem areas: insurance, government, banking, and other domains. This database provides data such as the methodology used in software production, the complexity weights for use cases and actors, and other metrics relevant to the evaluation of effort. To be applied in the fuzzy neural network, the columns containing textual information were transformed into numerical values.
The principal dimensions used in this evaluation are: donator, methodology, simple actors, average actors, complex actors, actor weight, simple use cases, medium use cases, complex use cases, use case weight, technical complexity, and the real effort in hours. The methodology was defined as 0 for traditional methods and 1 for agile methodologies.
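The textual-to-numerical transformation mentioned above can be sketched as follows. The column names and example rows are hypothetical, following the dimension list in the text; the actual encoding applied to the dataset of [4] may differ:

```python
# Map the textual methodology column to the binary encoding described above:
# 0 for traditional methods, 1 for agile methodologies.
METHODOLOGY_CODES = {"traditional": 0, "agile": 1}

# hypothetical rows illustrating the dimensions listed in the text
rows = [
    {"methodology": "agile", "simple_actors": 2, "real_effort_hours": 410.0},
    {"methodology": "traditional", "simple_actors": 1, "real_effort_hours": 980.0},
]

# replace the textual value with its numeric code, keeping other columns
encoded = [
    {**row, "methodology": METHODOLOGY_CODES[row["methodology"]]}
    for row in rows
]
```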
4.3. Prediction tests on efforts in software construction.
Table I presents the results of the model proposed in this paper.
After the tests, it was verified that the model behaves very efficiently in the prediction of efforts
related to the construction of software. In the comparison between the two models, the one that
uses triangular functions had the better result. The approach used all the dimensions provided
in the database and obtained results close to those of the regression models proposed by [4], which
used a subgroup of features of that database. This approach yielded a final network with an
adequate number of neurons.
The following figures present the results of the prediction performed by the fuzzy neural network,
demonstrating relevant characteristics of the answers obtained by the model. Figure 4 shows the
training results and Figure 5 the test results.
4.4. Interpretability of the problem based on fuzzy rules.
Figure 6 below shows the representation of the ANFIS used in the first layer of the model. The
following are examples of rules from the bases used. Each membership function can receive
values to be consistent with the analyzed context.
If (Donator is complex) and (Methodology is Agile) and (SimpleActor is large) and
(MediumActor is small) and (ComplexActor is small) and (WeightActor is high) and
(usecaseSimple is large) and (usecaseMedium is small) and (usecaseComplex is large)
with certainty -0.2482 then (effort is 1762.18) (8)
Figure 4: Predict Train Results example
Figure 5: Predict Test Results
Figure 6: Anfis Results
(Weight Use Case is high) and (Technical complexity is low) and (complexity factors is low) and
(realeffort20hours is high) with certainty -0.0307 then (effort is 167.863).

If (Donator is small) and (Methodology is traditional) and (Simple Actor is large) and
(Medium Actor is small) and (Complex Actor is small) and (Weight Actor is low) and
(use case Simple is large) and (use case Medium is small) and (use case Complex is small) and
(Weight Use Case is high) and (Technical complexity is high) and (complexity factors is high) and
(realeffort20hours is low) with certainty
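The antecedents of such rules are evaluated through membership functions combined by a t-norm. A minimal sketch, assuming product as the t-norm and illustrative [0, 1]-scaled inputs; the function parameters and input values are not from the paper:

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular membership: rises from a to peak b, then falls to c."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def gauss_mf(x, center, sigma):
    """Gaussian membership with the given center and spread."""
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

# two equally spaced membership functions per input over [0, 1], matching
# the paper's configuration of 2 fuzzy sets per dimension
small = lambda x: tri_mf(x, -1.0, 0.0, 1.0)
large = lambda x: tri_mf(x, 0.0, 1.0, 2.0)

# firing strength of one rule via the product t-norm over its antecedents,
# e.g. "If (SimpleActor is large) and (MediumActor is small) ..."
inputs = {"SimpleActor": 0.8, "MediumActor": 0.1}
strength = large(inputs["SimpleActor"]) * small(inputs["MediumActor"])
# with a Takagi-Sugeno style consequent, the rule contributes
# strength * consequent_effort to the weighted network output
```

Here `strength` is 0.8 * 0.9 = 0.72; the interpretability comes from being able to read off which antecedents drove the rule's contribution.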
5. CONCLUSION
The results presented by the fuzzy neural network model are promising for the prediction of effort
in software construction. In projects with more than 5000 service hours, an average deviation of
around 150 hours can be a fundamental aid to managers' predictions. The advantage proposed by this
work was to use all the attributes present in the UCP, since as new projects are analyzed, the
data from new evaluations can change the influence of the weights and correlations involved in the
variables made available for the test. The fuzzy rules generated this way can be better adapted than
those of works that performed feature selection, especially if the nature of the discarded
information becomes relevant to the problem. The fuzzy neural network approach allows
knowledge to be extracted from the database to foster training and knowledge for software
development firms. Traditional approaches also provide software effort estimates, but
their results are not interpretable because they behave as black boxes.
It can be verified that the construction of fuzzy rules allows managers, employees, and
developers to find the logical relationships between the data set and the problem to be solved.
With the fuzzy rules found, training, hiring, and workforces can be allocated to the important
contexts to prevent software development times from becoming excessive. Future work may explore
other models of fuzzy neural networks that use other neurons or clustering methods, and other
activation functions can be applied to verify the accuracy of the model.
ACKNOWLEDGMENT
The authors thank CEFET-MG and UNA.
Conflict of Interest: All authors declare no conflict of interest.
REFERENCES
[1] S. Krusche, B. Scharlau, Å. Cajander, J. Hughes, 50 years of software engineering: challenges,
results, and opportunities in its education, in: Proceedings of the 23rd Annual ACM Conference on
Innovation and Technology in Computer Science Education, ACM, 2018, pp. 362–363.
[2] C. Ghezzi, M. Jazayeri, D. Mandrioli, Fundamentals of software engineering, Prentice Hall PTR,
2002.
[3] M. Harman, The current state and future of search based software engineering, in: 2007 Future of
Software Engineering, IEEE Computer Society, 2007, pp. 342–357.
[4] R. Silhavy, P. Silhavy, Z. Prokopova, Analysis and selection of a regression model for the use case
points method using a stepwise approach, Journal of Systems and Software 125 (2017) 1–14.
[5] P. V. de Campos Souza, L. C. B. Torres, Regularized fuzzy neural network based on or neuron for time
series forecasting, in: G. A. Barreto, R. Coelho (Eds.), Fuzzy Information Processing, Springer
International Publishing, Cham, 2018, pp. 13–23.
[6] J.-S. Jang, ANFIS: adaptive-network-based fuzzy inference system, IEEE Transactions on Systems,
Man, and Cybernetics 23 (3) (1993) 665–685.
[7] T. Takagi, M. Sugeno, Derivation of fuzzy control rules from human operator's control actions,
IFAC Proceedings Volumes 16 (13) (1983) 55–60.
[8] G.-B. Huang, Q.-Y. Zhu, C.-K. Siew, Extreme learning machine: theory and applications,
Neurocomputing 70 (1-3) (2006) 489–501.
[9] G. Melnik, F. Maurer, Comparative analysis of job satisfaction in agile and non-agile software
development teams, in: International Conference on Extreme Programming and Agile Processes in
Software Engineering, Springer, 2006, pp. 32–42.
[10] K. El Emam, A. G. Koru, A replicated survey of it software project failures,
IEEE software (5) (2008) 84–90.
[11] R. S. Pressman, Engenharia de software, Vol. 6, Makron Books, São Paulo, 1995.
[12] C. E. Vazquez, G. S. Simões, R. M. Albert, Análise de pontos de função: medição,
estimativas e gerenciamento de projetos de software, Editora Érica, São Paulo 3.
[13] G. Karner, Resource estimation for objectory projects, Objective Systems SF AB 17.
[14] K. Iskandar, F. L. Gaol, B. Soewito, H. L. H. S. Warnars, R. Kosala, Software size measurement of
knowledge management portal with use case point, in: Computer, Control, Informatics and its
Applications (IC3INA), 2016 International Conference on, IEEE, 2016, pp. 42–47.
[15] G. R. Finnie, G. E. Wittig, J.-M. Desharnais, A comparison of software effort estimation techniques:
using function points with neural networks, case-based reasoning and regression models, Journal of
Systems and Software 39 (3) (1997) 281–289.
[16] H. Park, S. Baek, An empirical validation of a neural network model for software effort estimation,
Expert Systems with Applications 35 (3) (2008) 929–937.
[17] S. Nageswaran, Test effort estimation using use case points, in: Quality
Week, Vol. 6, 2001, pp. 1–6.
[18] G.-S. Liu, R.-Q. Wang, F. Yin, J.-M. Ogier, C.-L. Liu, Fast genre classification of web images using
global and local features, CAAI Transactions on Intelligence Technology 3 (3) (2018) 161–168.
[19] P. Shivakumara, M. Asadzadehkaljahi, D. Tang, T. Lu, U. Pal, M. H. Anisi,
CNN-RNN based method for license plate recognition, CAAI Transactions on Intelligence Technology 3
(3) (2018) 169–175.
[20] T. Oyama, T. Yamanaka, Influence of image classification accuracy on saliency map estimation,
arXiv preprint arXiv:1807.10657.
[21] Y. Zhou, Q. Sun, J. Liu, Robust optimisation algorithm for the measurement matrix in compressed
sensing, CAAI Transactions on Intelligence Technology 3 (3) (2018) 133–139.
[22] Q. Deng, S. Wu, J. Wen, Y. Xu, Multi-level image representation for large-scale image-based
instance retrieval, CAAI Transactions on Intelligence Technology 3 (1) (2018) 33–39.
[23] G. Qi, Q. Zhang, F. Zeng, J. Wang, Z. Zhu, Multi-focus image fusion via morphological
similarity-based dictionary construction and sparse representation, CAAI Transactions on
Intelligence Technology 3 (2) (2018) 83–94.
[24] A. K. Pujitha, J. Sivaswamy, Solution to overcome the sparsity issue of annotated data in medical
domain, CAAI Transactions on Intelligence Technology 3 (3) (2018) 153–160.
[25] H. An, D. Wang, Z. Pan, M. Chen, X. Wang, Text segmentation of health examination item based on
character statistics and information measurement, CAAI Transactions on Intelligence Technology 3
(1) (2018) 28–32.
[26] A. Lemos, W. Caminhas, F. Gomide, Multivariable gaussian evolving fuzzy modeling system, IEEE
Transactions on Fuzzy Systems 19 (1) (2011) 91–104.
[27] V. H. M. Garcia, E. R. Trujillo, J. I. R. Molano, Knowledge management model to support software
development, in: International Conference on Data Mining and Big Data, Springer, 2018, pp. 533–543.
[28] F. A. Batarseh, A. J. Gonzalez, Predicting failures in agile software development through data
analytics, Software Quality Journal 26 (1) (2018) 49–66.
[29] H. Gall, C. Alexandru, A. Ciurumelea, G. Grano, C. Laaber, S. Panichella,
S. Proksch, G. Schermann, C. Vassallo, J. Zhao, Data-driven decisions and actions in today's software
development, in: The Essence of Software Engineering, Springer, 2018, pp. 137–168.
[30] R. Kumar, K. Prasad, A. S. Rao, Defect Prediction in Software Development & Maintenance,
Partridge Publishing, 2018.
[31] A. B. Nassif, L. F. Capretz, D. Ho, Estimating software effort based on use case point model using
sugeno fuzzy inference system, in: Tools with Artificial Intelligence (ICTAI), 2011 23rd IEEE
International Conference on, IEEE, 2011, pp. 393–398.
[32] A. B. Nassif, D. Ho, L. F. Capretz, Towards an early software estimation using log-linear regression
and a multilayer perceptron model, Journal of Systems and Software 86 (1) (2013) 144–160.
[33] A. B. Nassif, L. F. Capretz, D. Ho, Software effort estimation in the early stages of the software life
cycle using a cascade correlation neural network model, in: 2012 13th ACIS International
Conference on Software Engineering, Artificial Intelligence, Networking and Parallel & Distributed
Computing (SNPD 2012), IEEE, 2012, pp. 589–594.
[34] M. Azzeh, A. B. Nassif, A hybrid model for estimating software project effort from use case points,
Applied Soft Computing 49 (2016) 981–989.
[35] A. B. Nassif, M. Azzeh, L. F. Capretz, D. Ho, Neural network models for software development
effort estimation: a comparative study, Neural Computing and Applications 27 (8) (2016) 2369–2381.
[36] W. Pedrycz, F. Gomide, Fuzzy systems engineering: toward human-centric computing, John Wiley &
Sons, 2007.
[37] W. M. Caminhas, H. Tavares, F. A. Gomide, W. Pedrycz, Fuzzy set based neural networks: Structure,
learning and application, JACIII 3 (3) (1999) 151–157.
[38] P. V. C. Souza, Regularized fuzzy neural networks for pattern classification problems, International
Journal of Applied Engineering Research 13 (5) (2018) 2985–2991.
[39] A. L. Maas, A. Y. Hannun, A. Y. Ng, Rectifier nonlinearities improve neural network acoustic
models, in: Proc. icml, Vol. 30, 2013, p. 3.
[40] V. Nair, G. E. Hinton, Rectified linear units improve restricted boltzmann machines, in: Proceedings
of the 27th international conference on machine learning (ICML-10), 2010, pp. 807–814.
[41] W. Pedrycz, Neurocomputations in relational systems, IEEE Transactions on Pattern Analysis &
Machine Intelligence (3) (1991) 289–297.
[42] F. R. Bach, Bolasso: model consistent lasso estimation through the bootstrap, in: Proceedings of the
25th International Conference on Machine Learning, ACM, 2008, pp. 33–40.
[43] P. V. de Campos Souza, G. R. L. Silva, L. C. B. Torres, Uninorm based regularized fuzzy neural
networks, in: 2018 IEEE Conference on Evolving and Adaptive Intelligent Systems (EAIS), 2018, pp.
1–8. doi:10.1109/ EAIS.2018.8397176.