Reusability is one of the best ways to increase development productivity and application maintainability. A developer should first search for well-tested, reusable software components. Application software developed by one programmer can prove useful to others as a component, which shows that even code specific to one application's requirements can be reused in other projects with similar requirements. The main aim of this paper is to propose a way to build reusable modules: a process that takes source code as input and helps decide whether a particular software artefact should be reused or not.
Reusability Metrics for Object-Oriented System: An Alternative Approach (Waqas Tariq)
Object-oriented metrics play an important role in ensuring the desired quality and have been widely applied to practical software projects. The growing benefits of object-oriented software development are driving the development of new measurement techniques, and assessing reusability is increasingly a necessity. Reusability is a key element in reducing cost and improving software quality. Generic programming achieves reusability through C++ templates, which help in developing reusable software modules and in identifying the effectiveness of this reuse strategy. The advantage of defining metrics for templates is the possibility of measuring the reusability of software components and identifying the most effective reuse strategy. Such metrics are particularly useful when an organization is adopting a new technology for which established practices have yet to be developed. Many researchers have worked on reusability metrics [2, 9, 3, 4]. In this paper we propose four new independent metrics, Number of Template Children (NTC), Depth of Template Tree (DTT), Method Template Inheritance Factor (MTIF), and Attribute Template Inheritance Factor (ATIF), to measure the reusability of object-oriented systems.
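The abstract names the metrics but does not define them here. By analogy with the classical NOC and DIT metrics, a minimal sketch (our assumption, not the paper's formal definitions) might count a template class's direct template children and the longest chain of template subclasses below it:

```python
# Illustrative sketch of template-based reusability metrics. Assumptions:
# NTC counts the direct template children of a template class, and DTT is
# the length of the longest template-inheritance chain below it.

def ntc(cls, children):
    """Number of Template Children: direct template subclasses of cls."""
    return len(children.get(cls, []))

def dtt(cls, children):
    """Depth of Template Tree: longest chain of template subclasses below cls."""
    kids = children.get(cls, [])
    if not kids:
        return 0
    return 1 + max(dtt(k, children) for k in kids)

# Hypothetical template hierarchy: Container -> {Vector, List}, Vector -> SmallVector
hierarchy = {
    "Container": ["Vector", "List"],
    "Vector": ["SmallVector"],
}

print(ntc("Container", hierarchy))  # 2 direct template children
print(dtt("Container", hierarchy))  # 2: Container -> Vector -> SmallVector
```

Under this reading, a high NTC or DTT signals that a template is being specialized often, i.e. actively reused.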
Measuring effort for modifying software package as reusable package using pac... (eSAT Journals)
Abstract: In any engineering field, the data associated with knowledge is important for making decisions when solving problems during system development. Specification mining can support the analysis of collected data to help the project management team fulfill its responsibilities. In this paper, 'Package Specification Mining' is designed around a package reusability quality factor. It helps estimate the effort required to modify a package into a reusable one so that it can be used in new software development. This methodology may reduce risks in various domains of software engineering. Keywords: Specification Mining, Reusability, Effort Estimation, Coupling, Project Management
IJRET: International Journal of Research in Engineering and Technology is an international, peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching, and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars, and students of related fields of Engineering and Technology.
A Model To Compare The Degree Of Refactoring Opportunities Of Three Projects ... (acijjournal)
Refactoring is applied to software artifacts to improve their internal structure while preserving their external behavior. Refactoring is an uncertain process, and it is difficult to give units for its measurement. The amount of refactoring that can be applied to source code depends on the skills of the developer. In this research, we treat refactoring as a quantity measured on an ordinal scale. We propose a model for determining the degree of refactoring opportunities in given source code. The model is applied to three projects collected from a company. UML diagrams are drawn for each project, and the source-code metrics that are useful in determining code quality are calculated for each UML diagram. Based on the nominal values of the metrics, each relevant UML diagram is represented on an ordinal scale. A machine learning tool, Weka, is used to analyze the dataset produced by the three projects, imported in the form of an ARFF file.
A STRUCTURAL APPROACH TO IMPROVE SOFTWARE DESIGN REUSABILITY (cscpconf)
Software reuse has become a very promising area with many benefits, such as reducing cost and time and, most importantly, increasing software quality. However, the concept of reuse is not limited to the implementation level; it can be applied in earlier stages of the software development life cycle, such as the design stage. Adopting reuse at this stage brings benefits such as increased productivity, saved time, and reduced development cost.
Accordingly, this paper presents the concept of reuse at the design level in more detail. It also proposes an approach to improve the reusability of software designs using the directed-graph concept. This yields designs that can be treated as reusable components and adapted in many software systems.
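One plausible reading of the directed-graph idea (an assumption for illustration, not the paper's exact algorithm) is to model modules as nodes and "uses" relationships as edges; a module whose transitive dependency closure is small can be lifted out and reused with little baggage:

```python
# Sketch: rank design-level components by the size of their transitive
# dependency closure in a directed dependency graph. The design data below
# is hypothetical.

def closure(node, deps):
    """All modules that `node` transitively depends on."""
    seen, stack = set(), [node]
    while stack:
        for nxt in deps.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Edges point from a module to the modules it uses.
design = {"UI": ["Logic"], "Logic": ["Storage"], "Storage": [], "Logger": []}

# Modules with an empty closure (Storage, Logger) are the cheapest to reuse.
print(sorted(design, key=lambda m: len(closure(m, design))))
```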
AN APPROACH FOR TEST CASE PRIORITIZATION BASED UPON VARYING REQUIREMENTS (IJCSEA Journal)
Software testing is a process performed continuously by the development team during the software life cycle, with the aim of detecting faults as early as possible. Regression testing is the most suitable technique for this, in which a number of test cases are re-executed. Since the number of test cases can be very large, it is preferable to prioritize them based on certain criteria. In this paper, a prioritization strategy is proposed that orders test cases based on requirements analysis. With regression testing, if the requirements vary in the future, the software can be modified in such a way that the remaining parts of the software are not affected. The proposed system improves the testing process and its efficiency with respect to quality, cost, effort, and user satisfaction, and the results of the proposed method are evaluated with the help of a performance evaluation metric.
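A common core of requirements-based prioritization can be sketched as follows (the weights and coverage data are hypothetical, and the paper's exact criteria will differ): score each test case by the priorities of the requirements it covers, then run test cases in descending score order.

```python
# Minimal sketch of requirements-based test case prioritization.

def prioritize(test_cases, req_priority):
    """Order test cases by the total priority of the requirements they cover."""
    score = lambda covered: sum(req_priority[r] for r in covered)
    return sorted(test_cases, key=lambda tc: score(test_cases[tc]), reverse=True)

req_priority = {"R1": 9, "R2": 4, "R3": 7}            # customer-assigned weights
test_cases = {"T1": ["R2"], "T2": ["R1", "R3"], "T3": ["R1"]}

print(prioritize(test_cases, req_priority))  # T2 (16) before T3 (9) before T1 (4)
```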
PRODUCT QUALITY EVALUATION METHOD (PQEM): TO UNDERSTAND THE EVOLUTION OF QUAL... (ijseajournal)
Promoting quality within the context of agile software development is extremely important and useful, not only to improve the knowledge and decision-making of project managers, product owners, and quality assurance leaders, but also to support communication between teams. In this context, quality needs to be visible in a synthetic and intuitive way in order to facilitate the decision of accepting or rejecting each iteration within the software life cycle. This article introduces a novel solution called the Product Quality Evaluation Method (PQEM), which can be used to evaluate a set of quality characteristics for each iteration within a software product life cycle. PQEM is based on the Goal-Question-Metric approach, the ISO/IEC 25010 standard, and an extension of testing coverage used to obtain the quality coverage of each quality characteristic. The outcome of PQEM is a single multidimensional value that represents, as an aggregated measure, the quality level reached by each iteration of a product. Even though a single value is not the usual way of measuring quality, we believe it can be useful for easily understanding the quality level of each iteration. An illustrative example of the PQEM method was carried out with two iterations of a web and mobile application in the healthcare environment. A single measure makes it possible to observe how the level of quality evolves as the product evolves through its iterations.
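The aggregation idea can be sketched roughly as follows. Here quality coverage per ISO/IEC 25010 characteristic is taken as passed over planned acceptance tests, and the iteration's single value is the mean across characteristics; both formulas are our assumptions, since the paper defines its own aggregation.

```python
# Hedged sketch of a PQEM-style aggregated quality value for one iteration.

def quality_coverage(results):
    """results: {characteristic: (passed, total)} -> {characteristic: ratio}"""
    return {c: passed / total for c, (passed, total) in results.items()}

def pqem_value(results):
    """Aggregate per-characteristic coverage into one number (mean, here)."""
    cov = quality_coverage(results)
    return sum(cov.values()) / len(cov)

iteration = {"Usability": (8, 10), "Reliability": (9, 10), "Security": (7, 10)}
print(round(pqem_value(iteration), 2))  # 0.8
```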
A FRAMEWORK FOR ASPECTUAL REQUIREMENTS VALIDATION: AN EXPERIMENTAL STUDY (ijseajournal)
Requirements engineering is a discipline of software engineering concerned with the identification and handling of user and system requirements. Aspect-Oriented Requirements Engineering (AORE) extends existing requirements engineering approaches to cope with the tangling and scattering that result from crosscutting concerns. Crosscutting concerns are considered potential aspects and can lead to the phenomenon known as the "tyranny of the dominant decomposition". Requirements-level aspects are responsible for scattered and tangled descriptions of requirements in the requirements document. Validation of requirements artefacts is an essential task in software development: it ensures that requirements are correct and valid in terms of completeness and consistency, thereby reducing development and maintenance cost and establishing an approximately correct estimate of the effort and completion time of the project. In this paper, we present a framework to validate the aspectual requirements and the crosscutting relationships of concerns that result from the requirements engineering phase. The proposed framework comprises high-level and low-level validation applied to the software requirements specification (SRS). The high-level validation validates the concerns with stakeholders, whereas the low-level validation validates the aspectual requirements by requirements engineers and analysts using a checklist. The approach has been evaluated in an experimental study of two AORE approaches: a viewpoint-based approach, AORE with ArCaDe, and a lexical-analysis approach based on Theme/Doc. The results obtained from the study demonstrate that the proposed framework is an effective validation model for AORE artefacts.
Reengineering framework for open source software using decision tree approach (IJECEIAES)
Software engineering is an approach to software development. Once software is developed and delivered, it needs maintenance. Changes in software arise from new end-user requirements, the identification of bugs, or failure to achieve system objectives. It has been observed that successive maintenance reduces software quality and degrades system performance. Reengineering is an approach to retaining software quality and improving the maintainability of a software system. But the question arises: when should the software be reengineered? This paper proposes a framework for the software reengineering process using a decision tree approach, which helps decision makers decide whether to maintain or reengineer a software system.
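The maintain-versus-reengineer decision can be illustrated with a toy decision tree. The split attributes and thresholds below are invented for illustration; the paper derives its tree from project data.

```python
# Toy decision tree for the maintain-vs-reengineer decision.

def decide(metrics):
    """metrics: dict with 'maintenance_cost_ratio' (annual maintenance cost
    divided by estimated rebuild cost) and 'defect_trend' ('rising'/'stable')."""
    if metrics["maintenance_cost_ratio"] > 0.5:
        return "reengineer"                       # maintenance already too costly
    if metrics["defect_trend"] == "rising":
        # moderate cost plus worsening quality also tips the decision
        return "reengineer" if metrics["maintenance_cost_ratio"] > 0.3 else "maintain"
    return "maintain"

print(decide({"maintenance_cost_ratio": 0.6, "defect_trend": "stable"}))  # reengineer
print(decide({"maintenance_cost_ratio": 0.2, "defect_trend": "rising"}))  # maintain
```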
Maintaining the quality of software is a major challenge in the software development process. Software inspections, which use methods such as structured walkthroughs and formal code reviews, involve careful examination of every aspect and stage of software development. In agile software development, refactoring helps to improve software quality; refactoring is a technique for improving a program's internal structure without changing its behaviour. After extensive study of ways to improve software quality, our research proposes an object-oriented software metric tool called "MetricAnalyzer". The tool has been tested on different codebases and has proven very useful.
Harnessing deep learning algorithms to predict software refactoring (TELKOMNIKA JOURNAL)
During software maintenance, software systems need to be modified by adding or changing source code. These changes are required to fix errors or to adopt new requirements raised by stakeholders or the marketplace. Identifying the targeted piece of code for refactoring is a real challenge for software developers: the whole refactoring process relies mainly on developers' skills and intuition. In this paper, a deep learning algorithm is used to develop a refactoring prediction model that highlights the classes requiring refactoring. More specifically, the gated recurrent unit algorithm is used, with proposed pre-processing steps, for refactoring prediction at the class level. The effectiveness of the proposed model is evaluated using a very common dataset of seven open-source Java projects. The experiments are conducted before and after balancing the dataset to investigate the influence of data sampling on the performance of the prediction model. The experimental analysis reveals a promising result in the field of code refactoring prediction.
Software quality is an important issue in the development of successful software applications. Many methods have been applied to improve software quality, and refactoring is one of them. However, the effect of refactoring on software quality attributes in general is ambiguous.
The goal of this paper is to find out the effect of various refactoring methods on quality attributes and to classify them based on their measurable effect on particular software quality attributes. The paper focuses on studying the reusability, complexity, maintainability, testability, adaptability, understandability, fault proneness, stability, and completeness attributes of software. This, in turn, will assist developers in determining whether to apply a certain refactoring method to improve a desired quality attribute.
‘O’ Model for Component-Based Software Development Process (ijceronline)
Technological advancement has made users more dependent on information technology, and hence on software, which provides the platform for implementing information technology. Component-Based Software Engineering (CBSE) has been adopted by the software community to counter the challenges posed by the fast-growing demand for large and complex software systems. One of the essential reasons for adopting CBSE is the rapid development of complicated software systems within well-defined boundaries of time and budget. CBSE provides mechanisms for assembling software from already existing, autonomously developed reusable components. This paper proposes a novel CBSE model, named the O model, informed by the available CBSE lifecycles.
Computer literacy and competitive pressure among end users are increasing day by day, and with them the need for end-user programming in software packages that deliver rapid, flexible, user-driven information processing solutions. End-User Development (EUD) out-sources development effort to the end user by enabling software developers to create information systems that can be adapted even by technically inexperienced end users, and such systems are in great demand. If end users decide to pay the price and add significant programmability to their system, there are additional costs to consider before they can enjoy the payoff. It is therefore important to obtain an accurate, early estimate of software size for calculating the effort and cost of software systems incorporating EUD features. With the evolution of object orientation, use cases emerged as a dominant method for structuring requirements; they were integrated into the Unified Modeling Language (UML) and the Unified Process and became the standard for software engineering requirements modelling. The Use Case Point (UCP) method estimates project size by assigning points to use cases in the same way that Function Point Analysis (FPA) assigns points to functions. This paper discusses the concept of end-user programming and an advancement of UCP that adds end-user development/programming as an additional Effort Estimation Factor (EEF).
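The standard UCP computation (Karner's method) can be sketched briefly; the abstract's proposed end-user development factor is modeled here as one extra multiplicative factor, which is our assumption rather than the paper's formula.

```python
# Sketch of Use Case Point (UCP) sizing with an optional extra EUD factor.

def ucp(uaw, uucw, tf, ef, eef=1.0):
    """uaw: unadjusted actor weight sum, uucw: unadjusted use case weight sum,
    tf/ef: technical and environmental factor totals, eef: hypothetical
    end-user-development factor (1.0 = no adjustment)."""
    tcf = 0.6 + 0.01 * tf          # technical complexity factor
    ecf = 1.4 - 0.03 * ef          # environmental complexity factor
    return (uaw + uucw) * tcf * ecf * eef

size = ucp(uaw=12, uucw=100, tf=40, ef=10)   # 112 * 1.0 * 1.1 = 123.2 UCP
effort_hours = size * 20                     # 20 person-hours per UCP is a common rate
print(round(size, 1), round(effort_hours))
```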
Software Refactoring Under Uncertainty: A Robust Multi-Objective Approach (Wiem Mkaouer)
Refactoring large systems involves several sources of uncertainty related to the severity levels of code smells to be corrected and the importance of the classes in which the smells are located. Due to the dynamic nature of software development, these values cannot be accurately determined in practice, leading to refactoring sequences that lack robustness. To address this problem, we introduced a multi-objective robust model, based on NSGA-II, for the software refactoring problem that tries to find the best trade-off between quality and robustness. We evaluated our approach using six open source systems and demonstrated that it is significantly better than state-of-the-art refactoring approaches in terms of robustness in 100% of experiments based on a variety of real-world scenarios. Our suggested refactoring solutions were found to be comparable in terms of quality to those suggested by existing approaches and to carry an acceptable robustness price. Our results also revealed an interesting feature about the trade-off between quality and robustness that demonstrates the practical value of taking robustness into account in software refactoring tasks.
Determination of Software Release Instant of Three-Tier Client Server Softwar... (Waqas Tariq)
The quality of any software system depends mainly on how much time is spent on testing, which testing methodologies are used, how complex the software is, the effort put in by the developers, and the type of testing environment, all subject to cost and time constraints. The more time developers spend on testing, the more errors can be removed, leading to more reliable software, but the testing cost also increases. On the contrary, if the testing time is too short, the software cost can be reduced, provided the customers accept the risk of buying unreliable software; however, this increases the cost during the operational phase, since it is more expensive to fix an error in operation than during testing. It is therefore essential to decide when to stop testing and release the software to customers, based on cost and reliability assessment. In this paper we present a mechanism for deciding when to stop the testing process and release the software to the end user, by developing a software cost model with a risk factor. Based on the proposed method, we specifically address how to decide when to stop testing and release software based on a three-tier client-server architecture, which helps software developers ensure on-time delivery of a product that achieves a predefined level of reliability at minimal cost. A numerical example is cited to illustrate the experimental results, showing significant improvements over conventional statistical models based on NHPP.
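The release-time trade-off can be illustrated numerically with the common Goel-Okumoto NHPP mean value function m(t) = a(1 - e^(-bt)): faults fixed during testing are cheap, faults left for the field are expensive, and testing time itself costs money. The cost constants below are invented, and the paper's three-tier model with a risk factor differs.

```python
# Hedged sketch: find the release time that minimizes expected total cost
# under a Goel-Okumoto fault-detection model.
import math

def expected_cost(t, a=100, b=0.2, c_test=1.0, c_field=5.0, c_time=0.5):
    """Cost of fixing faults found by time t, fixing the rest in the field,
    plus a per-unit cost of testing time."""
    found = a * (1 - math.exp(-b * t))       # expected faults detected by t
    return c_test * found + c_field * (a - found) + c_time * t

# Scan a grid of candidate release times for the cheapest one.
best_t = min(range(1, 101), key=expected_cost)
print(best_t, round(expected_cost(best_t), 1))
```

Testing past the optimum wastes time cost on ever fewer remaining faults; stopping earlier leaves too many expensive field fixes.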
One of the core quality assurance features, combining fault prevention and fault detection, is often known as the testability approach. Many assessment techniques and quantification methods have evolved for software testability prediction, which identify testability weaknesses or factors and thereby help reduce test effort. This paper examines the measurement techniques that have been proposed for software testability assessment at various phases of the object-oriented software development life cycle. The aim is to find the metrics suite best able to improve software quality through testability support. The ultimate objective is to lay the groundwork for reducing testing effort by improving software testability and its assessment, using well-planned guidelines for object-oriented software development together with suitable metrics.
SPSNYC - Authentication, Authorization, and Identity – More than meets the eye… (Scott Hoag)
In today’s complex market place of corporate partnerships and relationships, sharing information is pertinent to ensuring that business operations are conducted in a secure computing environment with trusted entities being provided access to protected information.
In this session, Dan and Scott will discuss the basics of authentication and authorization in relation to the SharePoint platform. Further, we will be discussing the technical underpinnings of the SharePoint platform’s processing of a user’s identity dependent on identity provider and authorization settings.
As part of this session we will demonstrate different authentication and authorization configurations that are commonplace in today's business settings, including when to use:
• Integrated Windows Authentication
• Forms Based Authentication using SQL Server
• ADFS as a Trusted Identity Provider
• Threat Management Gateway with Kerberos Constrained Delegation using client certs
After attending this session, attendees will have a better grasp of the configuration complexities involved with each scenario as well as the user experience impacts based on the path taken.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
Implementation of Vacate on Demand Algorithm in Various Spectrum Sensing Netw... (IJERA Editor)
In recent days, wireless communications have been increasing widely, and for this reason spectrum utilization has grown rapidly. For efficient use of the spectrum, the Vacate on Demand algorithm can be implemented in different networks. CR users need to sense the spectrum and vacate the channel upon detecting the presence of a primary user (PU), to protect PUs from harmful interference. To achieve these fundamental CR functions, CR users usually coordinate with each other over a common medium for control message exchange, ensuring the priority of PUs over CR users. This paper presents the Vacate on Demand (VD) algorithm, which enables dynamic spectrum access and ensures that the assigned channel is vacated upon PU activity, moving the CR user to another vacant channel so that spectrum is available to PUs as well as CR users. The basic idea is to use a ranking table of the available channels based on the PU activity detected on each channel. To improve spectrum efficiency, the Vacate on Demand algorithm can also be implemented in a MANET network.
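The ranking-table idea from the abstract can be sketched as follows: channels are ranked by observed PU activity, and when a PU appears on the CR user's channel, the user vacates it and hops to the least PU-active vacant channel. The activity data and ranking rule here are illustrative assumptions.

```python
# Sketch of vacate-on-demand channel selection via a PU-activity ranking table.

def best_channel(pu_activity, unavailable):
    """Pick the available channel with the lowest observed PU activity."""
    vacant = [ch for ch in pu_activity if ch not in unavailable]
    return min(vacant, key=pu_activity.get) if vacant else None

def vacate_on_demand(current, pu_activity, unavailable):
    """A PU appeared on `current`: the CR user releases it and hops elsewhere.
    `current` stays unavailable because the PU now holds it."""
    return best_channel(pu_activity, unavailable | {current})

pu_activity = {1: 0.7, 2: 0.2, 3: 0.4, 4: 0.9}   # fraction of time PU is active
print(vacate_on_demand(current=2, pu_activity=pu_activity, unavailable={2}))  # hops to 3
```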
To Study Effect of Various Parameters for Quality Improvement in Technical Ed... (IJERA Editor)
In the present research investigation, an effort has been made to study the effect of various important parameters of the technical education system on its quality. This is done by creating sub-modules for the important stakeholders of the technical education system and studying their interactions by constructing causal loop diagrams of the various modules. The main objective of this research study is to construct a system dynamics model based on the interactions among these sub-modules, which can serve as a base for optimal policy planning to achieve an optimum level of quality in the technical education system.
AN APPROACH FOR TEST CASE PRIORITIZATION BASED UPON VARYING REQUIREMENTS IJCSEA Journal
Software testing is a process continuously performed by the development team during the life cycle of the software with the motive to detect the faults as early as possible. Regressing testing is the most suitable technique for this in which we test number of test cases. As the number of test cases can be very large it is always preferable to prioritize test cases based upon certain criterions.In this paper prioritization strategy is proposed which prioritize test cases based on requirements analysis. By regressing testing if the requirements will vary in future, the software will be modified in such a manner that it will not affect the remaining parts of the software. The proposed system improves the testing process and its efficiency to achieve goals regarding quality, cost, and effort as well user satisfaction and the result of the proposed method evaluated with the help of performance evaluation metric.
PRODUCT QUALITY EVALUATION METHOD (PQEM): TO UNDERSTAND THE EVOLUTION OF QUAL...ijseajournal
Promoting quality within the context of agile software development, it is extremely important as well as
useful to improve not only the knowledge and decision-making of project managers, product owners, and
quality assurance leaders but also to support the communication between teams. In this context, quality
needs to be visible in a synthetic and intuitive way in order to facilitate the decision of accepting or
rejecting each iteration within the software life cycle. This article introduces a novel solution called
Product Quality Evaluation Method (PQEM) which can be used to evaluate a set of quality characteristics
for each iteration within a software product life cycle. PQEM is based on the Goal-Question-Metric
approach, the standard ISO/IEC 25010, and the extension made of testing coverage in order to obtain the
quality coverage of each quality characteristic. The outcome of PQEM is a unique multidimensional value,
that represents the quality level reached by each iteration of a product, as an aggregated measure. Even
though a value it is not the regular idea of measuring quality, we believe that it can be useful to use this
value to easily understand the quality level of each iteration. An illustrative example of the PQEM method
was carried out with two iterations from a web and mobile application, within the healthcare environment.
A single measure makes it possible to observe the evolution of the level of quality reached in the evolution
of the product through the iterations.
A FRAMEWORK FOR ASPECTUAL REQUIREMENTS VALIDATION: AN EXPERIMENTAL STUDYijseajournal
Requirements engineering is a discipline of software engineering that is concerned with the
identification and handling of user and system requirements. Aspect-Oriented Requirements
Engineering (AORE) extends the existing requirements engineering approaches to cope with the
issue of tangling and scattering resulted from crosscutting concerns. Crosscutting concerns are
considered as potential aspects and can lead to the phenomena “tyranny of the dominant
decomposition”. Requirements-level aspects are responsible for producing scattered and tangled
descriptions of requirements in the requirements document. Validation of requirements artefacts
is an essential task in software development. This task ensures that requirements are correct and
valid in terms of completeness and consistency, hence, reducing the development cost,
maintenance and establish an approximately correct estimate of effort and completion time of the
project. In this paper, we present a validation framework to validate the aspectual requirements
and the crosscutting relationship of concerns that are resulted from the requirements engineering
phase. The proposed framework comprises a high-level and low-level validation to implement on
software requirements specification (SRS). The high-level validation validates the concerns with
stakeholders, whereas the low-level validation validates the aspectual requirement by
requirements engineers and analysts using a checklist. The approach has been evaluated using
an experimental study on two AORE approaches: the viewpoint-based AORE with ArCaDe and
the lexical-analysis-based Theme/Doc approach. The results obtained
from the study demonstrate that the proposed framework is an effective validation model for
AORE artefacts.
Reengineering framework for open source software using decision tree approach (IJECEIAES)
Software engineering is a disciplined approach to software development. Once software is developed and delivered, it needs maintenance. Changes in software arise from new end-user requirements, the identification of bugs, or failure to achieve system objectives. It has been observed that successive maintenance reduces software quality and degrades the performance of the software system. Reengineering is an approach for retaining software quality and improving the maintainability of a software system. But the question arises: when should the software be reengineered? This paper proposes a framework for the software reengineering process using a decision tree approach, which helps decision makers decide whether to maintain or reengineer a software system.
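The maintain-or-reengineer decision described above can be pictured as threshold splits at the internal nodes of a learned decision tree. The sketch below is purely illustrative: the metric names, units, and threshold values are assumptions, not the ones learned in the paper.

```python
# Hypothetical decision function mimicking a small learned decision tree
# that classifies a system as "maintain" or "reengineer". The metrics and
# thresholds are illustrative assumptions, not the paper's actual model.

def reengineering_decision(defect_density, maintenance_cost_ratio, doc_quality):
    """Return 'reengineer' or 'maintain' using simple threshold splits."""
    if defect_density > 5.0:                  # defects per KLOC
        return "reengineer"
    if maintenance_cost_ratio > 0.4:          # maintenance cost / total cost
        return "reengineer" if doc_quality < 0.5 else "maintain"
    return "maintain"

print(reengineering_decision(7.2, 0.3, 0.8))  # high defect density
print(reengineering_decision(2.0, 0.5, 0.3))  # costly and poorly documented
print(reengineering_decision(1.0, 0.2, 0.9))  # healthy system
```

In practice such a tree would be induced from historical project data (e.g. with a standard decision-tree learner) rather than hand-coded.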
Maintaining the quality of the software is the major challenge in the process of software development.
Software inspections, which use methods like structured walkthroughs and formal code reviews, involve
careful examination of each and every aspect/stage of software development. In Agile software
development, refactoring helps to improve software quality. Refactoring is a technique for improving
the internal structure of software without changing its behaviour. After studying various ways to
improve software quality, our research proposes an object-oriented software metric tool called
"MetricAnalyzer". The tool has been tested on different codebases and has proven to be very useful.
Harnessing deep learning algorithms to predict software refactoring (TELKOMNIKA JOURNAL)
During software maintenance, software systems need to be modified by adding or changing source code. These changes are required to fix errors or to adopt new requirements raised by stakeholders or the marketplace. Identifying the targeted piece of code for refactoring purposes is considered a real challenge for software developers. The whole process of refactoring mainly relies on software developers' skills and intuition. In this paper, a deep learning algorithm is used to develop a refactoring prediction model that highlights the classes requiring refactoring. More specifically, the gated recurrent unit algorithm is used with proposed pre-processing steps for refactoring prediction at the class level. The effectiveness of the proposed model is evaluated using a very common dataset of 7 open source Java projects. The experiments are conducted before and after balancing the dataset to investigate the influence of data sampling on the performance of the prediction model. The experimental analysis reveals a promising result in the field of code refactoring prediction.
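The gated-recurrent-unit step at the heart of such a model can be sketched in plain NumPy. The gating equations below are the standard GRU formulation; the feature sizes, random weights, and the idea of feeding class-level metric features as a short sequence are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def gru_cell(x, h, W, U, b):
    """One GRU step: x is the input vector (e.g. metric features for one
    time step), h the previous hidden state. W, U, b hold the update (z),
    reset (r) and candidate (c) parameters."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(W["z"] @ x + U["z"] @ h + b["z"])        # update gate
    r = sigmoid(W["r"] @ x + U["r"] @ h + b["r"])        # reset gate
    c = np.tanh(W["c"] @ x + U["c"] @ (r * h) + b["c"])  # candidate state
    return (1.0 - z) * h + z * c                         # new hidden state

n_in, n_hid = 6, 4   # 6 metric features, 4 hidden units (illustrative sizes)
W = {k: rng.normal(size=(n_hid, n_in)) for k in "zrc"}
U = {k: rng.normal(size=(n_hid, n_hid)) for k in "zrc"}
b = {k: np.zeros(n_hid) for k in "zrc"}

h = np.zeros(n_hid)
for x in rng.normal(size=(3, n_in)):   # a 3-step feature sequence
    h = gru_cell(x, h, W, U, b)
print(h.shape)
```

A real prediction model would train these weights by backpropagation and feed the final hidden state into a classifier that flags refactoring candidates.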
Software quality is an important issue in the development of successful software application.
Many methods have been applied to improve the software quality. Refactoring is one of those
methods. But, the effect of refactoring in general on all the software quality attributes is
ambiguous.
The goal of this paper is to find out the effect of various refactoring methods on quality
attributes and to classify them based on their measurable effect on a particular software quality
attribute. The paper focuses on studying the Reusability, Complexity, Maintainability,
Testability, Adaptability, Understandability, Fault Proneness, Stability and Completeness
attributes of a software system. This, in turn, will assist the developer in determining whether
to apply a certain refactoring method to improve a desirable quality attribute.
‘O’ Model for Component-Based Software Development Process (ijceronline)
Technological advancement has made users increasingly dependent on information technology, and therefore on software. Software provides the platform for implementing information technology. Component Based Software Engineering (CBSE) has been adopted by the software community to counter the challenges posed by the fast-growing demand for heavy and complex software systems. One of the essential reasons for adopting CBSE is the fast development of complicated software systems within well-defined boundaries of time and budget. CBSE provides mechanisms for assembling software from already existing reusable components, i.e. autonomously developed pieces of software. The paper proposes a novel CBSE model, named the O model, designed with the available CBSE lifecycles in mind.
Computer literacy and competitive pressures among end users are increasing day by day, due to which
the need for End-User Programming in software packages is also increasing for rapid, flexible, and
user-driven information processing solutions. End-User Development outsources development effort to the
end user by enabling software developers to create information systems that can be adapted even by
technically inexperienced end users, and hence such systems are in great demand. If end users decide to
pay the price and add significant programmability to their system, there are additional costs to consider
before they can start to enjoy the payoff. It is important to obtain an accurate and early estimate of
software size for calculating the effort and cost of software systems incorporating EUD features. With
the evolution of object orientation, use cases emerged as a dominant method for structuring
requirements. Use cases were integrated into the Unified Modeling Language (UML) and the Unified
Process and became the standard for software engineering requirements modelling. The Use Case Point
(UCP) method estimates project size by assigning points to use cases in the same way that Function
Point Analysis (FPA) assigns points to functions. This paper discusses the concept of end-user
programming and an advancement of UCP that adds end-user development/programming as an
additional Effort Estimation Factor (EEF).
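The core UCP computation the abstract builds on can be sketched as follows. The weights and factor formulas are the standard ones from Karner's method; the counts, factor sums, and the 20 person-hours per UCP productivity figure are illustrative assumptions, and the paper's extra EEF for end-user programming would enter as one more multiplicative factor.

```python
# Minimal sketch of the standard Use Case Point computation (Karner's
# weights). All input values below are made up for illustration.

def use_case_points(uaw, uucw, tcf_sum, ecf_sum):
    tcf = 0.6 + 0.01 * tcf_sum          # Technical Complexity Factor
    ecf = 1.4 - 0.03 * ecf_sum          # Environmental Complexity Factor
    return (uaw + uucw) * tcf * ecf

# Unadjusted Actor Weight (UAW) and Use Case Weight (UUCW) from counts:
uaw  = 2 * 1 + 2 * 2 + 3 * 3            # 2 simple, 2 average, 3 complex actors
uucw = 4 * 5 + 3 * 10 + 2 * 15          # 4 simple, 3 average, 2 complex use cases

ucp = use_case_points(uaw, uucw, tcf_sum=30, ecf_sum=17)
effort_hours = ucp * 20                  # assumed 20 person-hours per UCP
print(round(ucp, 2), round(effort_hours, 1))
```

An additional EEF would simply scale `ucp` (or the effort) by one more factor derived from the end-user-programming questionnaire, under the same pattern as TCF and ECF.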
Software Refactoring Under Uncertainty: A Robust Multi-Objective Approach (Wiem Mkaouer)
Refactoring large systems involves several sources of uncertainty related to the severity levels of code smells to be corrected and the importance of the classes in which the smells are located. Due to the dynamic nature of software development, these values cannot be accurately determined in practice, leading to refactoring sequences that lack robustness. To address this problem, we introduced a multi-objective robust model, based on NSGA-II, for the software refactoring problem that tries to find the best trade-off between quality and robustness. We evaluated our approach using six open source systems and demonstrated that it is significantly better than state-of-the-art refactoring approaches in terms of robustness in 100% of experiments based on a variety of real-world scenarios. Our suggested refactoring solutions were found to be comparable in terms of quality to those suggested by existing approaches and to carry an acceptable robustness price. Our results also revealed an interesting feature about the trade-off between quality and robustness that demonstrates the practical value of taking robustness into account in software refactoring tasks.
Determination of Software Release Instant of Three-Tier Client Server Softwar... (Waqas Tariq)
The quality of any software system mainly depends on how much testing takes place, what kind of testing methodologies are used, how complex the software is, the amount of effort put in by software developers, and the type of testing environment, subject to cost and time constraints. The more time developers spend on testing, the more errors can be removed, leading to more reliable software, but testing cost will also increase. On the contrary, if testing time is too short, software cost could be reduced, provided the customers take the risk of buying unreliable software. However, this will increase the cost during the operational phase, since it is more expensive to fix an error during the operational phase than during the testing phase. Therefore it is essential to decide when to stop testing and release the software to customers based on cost and reliability assessment. In this paper we present a mechanism for deciding when to stop the testing process and release the software to end users, by developing a software cost model with a risk factor. Based on the proposed method we specifically address how to decide that testing should stop and the software be released, for a three-tier client server architecture, which facilitates software developers in ensuring on-time delivery of a software product that meets the criteria of achieving a predefined level of reliability and minimizing cost. A numerical example is cited to illustrate the experimental results, showing significant improvements over conventional statistical models based on NHPP.
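The cost trade-off behind such a release decision can be sketched with a classic NHPP growth curve. The Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) is a standard NHPP choice, not necessarily the paper's three-tier model, and every parameter value below is an assumption for illustration.

```python
import math

# Illustrative release-time sketch: faults fixed during testing cost c1,
# faults escaping to the field cost c2 (> c1), and testing itself costs
# c3 per unit time. All parameter values are assumptions.

a, b = 100.0, 0.05          # expected total faults, fault-detection rate
c1, c2, c3 = 1.0, 5.0, 0.5  # cost per fault (test / field), cost per time

def m(t):                    # expected faults detected by time t (Goel-Okumoto)
    return a * (1 - math.exp(-b * t))

def cost(t):                 # total expected cost if we release at time t
    return c1 * m(t) + c2 * (a - m(t)) + c3 * t

# Scan candidate release times and pick the cheapest one.
T_star = min(range(0, 301), key=cost)
print(T_star, round(cost(T_star), 2))
```

Releasing too early leaves expensive field faults; testing too long pays testing cost for diminishing fault removal, so the cost curve has an interior minimum at the optimal release instant.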
One of the core quality assurance features, combining fault prevention and fault detection, is often known as the testability approach. Many assessment techniques and quantification methods have evolved for software testability prediction, which identify testability weaknesses or factors and thereby help reduce test effort. This paper examines the measurement techniques that have been proposed for software testability assessment at various phases of the object-oriented software development life cycle. The aim is to find the metrics suite best suited to software quality improvement through software testability support. The ultimate objective is to establish the groundwork for finding ways to reduce testing effort by improving software testability and its assessment, using well-planned guidelines for object-oriented software development with the help of suitable metrics.
SPSNYC - Authentication, Authorization, and Identity – More than meets the eye… (Scott Hoag)
In today’s complex market place of corporate partnerships and relationships, sharing information is pertinent to ensuring that business operations are conducted in a secure computing environment with trusted entities being provided access to protected information.
In this session, Dan and Scott will discuss the basics of authentication and authorization in relation to the SharePoint platform. Further, we will discuss the technical underpinnings of how the SharePoint platform processes a user's identity, depending on the identity provider and authorization settings.
As a part of this session we will demonstrate different authentication and authorization configurations that are commonplace in today's business settings, including when to use:
• Integrated Windows Authentication
• Forms Based Authentication using SQL Server
• ADFS as a Trusted Identity Provider
• Threat Management Gateway with Kerberos Constrained Delegation using client certs
After attending this session, attendees will have a better grasp of the configuration complexities involved with each scenario as well as the user experience impacts based on the path taken.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
Implementation of Vacate on Demand Algorithm in Various Spectrum Sensing Netw... (IJERA Editor)
Nowadays wireless communications are increasing widely, and for this reason spectrum utilization is rising rapidly. For efficient usage of the spectrum we can implement the Vacate on Demand algorithm in different networks. CR users also need to sense the spectrum and vacate the channel upon detection of a PU's presence, to protect PUs from harmful interference. To achieve these fundamental CR functions, CR users usually coordinate with each other using a common medium for control message exchange, ensuring the priority of PUs over CR users. This paper presents the Vacate on Demand (VD) algorithm, which enables dynamic spectrum access and ensures that the assigned channel is vacated in case of PU activity, moving the CR user to some other vacant channel so that spectrum is available to PUs as well as CR users. The basic idea is to use a ranking table of the available channels based on the PU activity detected on each channel. To improve spectrum efficiency we can implement the Vacate on Demand algorithm in a MANET network.
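The channel ranking table at the heart of the VD idea can be sketched very simply: rank vacant channels by observed PU activity and, when a PU appears, hand the CR user the best-ranked alternative. The channel names and activity values below are made-up illustrations, not measurements from the paper.

```python
# Illustrative sketch of the Vacate-on-Demand ranking idea. Channel names
# and PU-activity probabilities are assumptions for the example.

pu_activity = {"ch1": 0.60, "ch2": 0.10, "ch3": 0.35, "ch4": 0.05}

def ranked_channels(activity, occupied=()):
    """Channels sorted by ascending PU activity, skipping occupied ones."""
    free = {c: a for c, a in activity.items() if c not in occupied}
    return sorted(free, key=free.get)

def vacate_on_demand(current, activity):
    """PU detected on `current`: vacate it and move to the least-active channel."""
    candidates = ranked_channels(activity, occupied={current})
    return candidates[0] if candidates else None

print(vacate_on_demand("ch4", pu_activity))   # CR user moves to "ch2"
```

A real implementation would update `pu_activity` continuously from spectrum sensing and exchange the ranking over the common control channel.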
To Study Effect of Various Parameters for Quality Improvement in Technical Ed... (IJERA Editor)
In the present research investigation an effort has been made to study the effect of various important parameters of the technical education system on its quality. This is done by creating sub-modules of the important stakeholders of the technical education system and studying their interactions by constructing causal loop diagrams of the various modules. The main objective of this research study is to construct a system dynamics model based on the interactions among these sub-modules, which can be taken as a base for optimal policy planning to achieve an optimum level of quality in the technical education system.
Reviewing the factors of the Renewable Energy systems for Improving the Energ... (IJERA Editor)
Electricity demand around the globe has increased alarmingly and is still rising at high rates. Therefore,
electricity supply from conventional resources is no longer sufficient, and generation from these
resources is causing pollution worldwide. The world is therefore moving towards alternative and
renewable energy resources, which include sun, wind, water, and air. This paper focuses on reviewing
the renewable energy sources used to improve energy efficiency, and presents how maximum power
generation capacity can be achieved using these sources. The main focus of this paper is on solar and
wind power, which are freely available all around the globe. The paper concludes that certain factors
should be considered while generating power from these sources, including the calculation of radiation
data, storage size and capacity calculation, and geographic dispersion of the plants.
CFD Analysis on the Effect of Injection Timing for Diesel Combustion and Emis... (IJERA Editor)
This paper describes the effect of injection timing on diesel combustion. Ansys Fluent, a computational fluid dynamics tool, is used to study the combustion of diesel with three different injection timings: the fuel is injected before TDC, at TDC and after TDC. Parameters such as temperature, pressure, velocity, density, soot and NOx emissions are compared. The species transport model is used for modelling the combustion, and the standard k-ε (2-equation) model is used for modelling the turbulence. The analysis is carried out considering only the compression and expansion strokes. The pressure reaches its maximum when the fuel is injected before TDC, while the temperature is highest when the fuel is injected at TDC. NOx emission is lowest when the fuel is injected at TDC, and soot formation is highest when the fuel is injected before TDC.
A Survey on Constellation Based Attribute Selection Method for High Dimension... (IJERA Editor)
Attribute selection is an important topic in data mining, because it is an effective way of reducing dimensionality, removing irrelevant data, removing redundant data, and increasing the accuracy of the data. It is the process of identifying a subset of the most useful attributes that produces results compatible with the original entire set of attributes. Cluster analysis, or clustering, is the task of grouping a set of objects in such a way that objects in the same group, called a cluster, are more similar in some sense to each other than to those in other groups (clusters). There are various approaches and techniques for attribute subset selection, namely the wrapper approach, the filter approach, the Relief algorithm, distributional clustering, etc. But each of these has disadvantages, such as an inability to handle large volumes of data, computational complexity, accuracy that is not guaranteed, difficulty of evaluation, and redundancy detection. To get the upper hand on some of these issues in attribute selection, this paper proposes a technique that aims to design an effective clustering-based attribute selection method for high dimensional data. Initially, attributes are divided into clusters using a graph-based clustering method such as minimum spanning tree (MST). In the second step, the most representative attribute, i.e. the one strongly related to the target classes, is selected from each cluster to form a subset of attributes. The purpose is to increase the level of accuracy, reduce dimensionality, shorten training time and improve generalization by reducing overfitting.
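The MST-based grouping step described above can be sketched as: build a complete graph over attributes weighted by (1 - |correlation|), take a minimum spanning tree, and cut its heaviest edge so that tightly correlated attributes fall into the same cluster. The correlation values below are made up for illustration, and Prim's algorithm stands in for whatever MST routine the surveyed method uses.

```python
# Illustrative MST-based attribute clustering. Correlations are assumed
# example data; distance between attributes is 1 - |correlation|.

corr = {("a", "b"): 0.9, ("a", "c"): 0.2, ("a", "d"): 0.1,
        ("b", "c"): 0.3, ("b", "d"): 0.2, ("c", "d"): 0.85}
nodes = ["a", "b", "c", "d"]

def dist(u, v):
    return 1 - abs(corr.get((u, v), corr.get((v, u), 0)))

def prim_mst(nodes):
    """Prim's algorithm: grow the tree one cheapest crossing edge at a time."""
    in_tree, edges = {nodes[0]}, []
    while len(in_tree) < len(nodes):
        u, v = min(((u, v) for u in in_tree for v in nodes if v not in in_tree),
                   key=lambda e: dist(*e))
        in_tree.add(v)
        edges.append((u, v))
    return edges

mst = prim_mst(nodes)                          # 3 edges span 4 attributes
mst.remove(max(mst, key=lambda e: dist(*e)))   # cut heaviest edge -> 2 clusters
print(mst)
```

Here the cut separates the highly correlated pairs {a, b} and {c, d}; one representative attribute per cluster would then be kept.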
Behavior Analysis Of Malicious Web Pages Through Client Honeypot For Detectio... (IJERA Editor)
Malware, also known as malicious software, spreads by exploiting client-side applications such as browsers, plug-ins, etc. Attackers implant malware code in users' computers through web pages, which are therefore known as malicious web pages. In this paper, we present the usefulness of a controlled environment, in the form of client honeypots, for detecting malicious web pages: malicious content in web pages is first collected, and then a detailed analysis is performed for validation and confirmation of the malicious web pages. The first phase is the collection of malicious infections through a high-interaction client honeypot; the second phase is the validation of the malicious infections embedded in web pages through behavior-based analysis. Malware that infects client-side applications and drops malware onto users' computers sometimes evades signature-based detection techniques; there is therefore a need to study the behavior of the complete malicious web page.
Research on Iris Region Localization Algorithms (IJERA Editor)
Iris recognition is a biometric technique that offers premium performance. Iris localization is critical to the success of an iris recognition system, since data that is falsely represented as iris pattern data will corrupt the biometric templates generated, resulting in poor recognition rates. So far, different algorithms for iris localization have been proposed. This paper explores four efficient methods for iris localization: three methods that localize the iris in circular form and one method that unwraps the iris onto a flat bed. Experimental results are reported to demonstrate the performance of every implemented algorithm. Conclusions based on the comparisons provide significant information for further research. The CASIA and UPOL iris image databases have been used for the implementation of iris localization. General Terms: Biometrics, Iris Recognition, Iris Localization
Software Based Traffic Separation at the Access Layer (IJERA Editor)
The access network is the subscriber part of the telecommunications network, i.e. the network connecting subscribers to Internet Service Providers (ISPs) [1]. In many countries, including Tanzania, the access network is still predominantly made up of copper-cable-based or other point-to-point wireless connections. This has kept the network largely passive, inflexible and relatively unreliable [2]. This traditional network has long been tailored to the services generally provided, i.e. voice, leased lines, Internet, corporate data and video conferencing, sometimes each provided by separate equipment and networks. This paper presents a study on the approaches used by ISPs in Tanzania to separate traffic in the access network. The paper also presents an effective way of traffic separation, whereby the multiple hardware devices currently used to separate traffic are replaced with a single device. The traffic separation technique is based on creating logical (software-based) links for each traffic type inside a single physical link, providing differentiated QoS support for each type of traffic according to its individual QoS requirements.
FEDSPUG - SharePoint 2013 - A Brief Capability Overview (Scott Hoag)
Hot off the press, SharePoint 2013 attained Release to Manufacture in October 2012 and is speeding ahead to general availability in early 2013. Interested in learning what's new and different? Then come and learn more about new capabilities in the product, such as Shredded Storage and Distributed Caching among others, as well as how the migration story changes for SharePoint 2013.
An Approach to Calculate Reusability in Source Code Using Metrics (IJERA Editor)
Reusability is one of the best ways to increase development productivity and the maintainability of an application. One must first search for a well-tested, reusable software component. Application software developed by one programmer can prove useful to others as a component. This shows that code specific to one application's requirements can also be reused in projects with similar requirements. The main aim of this paper is to propose a way of identifying reusable modules: a process that takes source code as input and helps to decide whether a particular software artefact should be reused or not.
A Methodology To Manage Victim Components Using Cbo Measure (ijseajournal)
Current practices in the software industry demand the development of highly productive software
within time and budget. The traditional approach of developing software from scratch requires a
considerable amount of effort. To overcome this drawback, a reuse-driven software development
approach is adopted. However, there is a dire need for realizing effective software reuse. This paper
presents several measures of reusability and a methodology for reconfiguring victim components. The
CBO measure helps in identifying the component to be reconfigured. The proposed strategy is simulated
using an HR portal domain-specific component system.
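The CBO (Coupling Between Objects) count that drives the victim-component choice can be sketched as follows: for each class, count the distinct other classes it uses or that use it. The class graph below is a made-up example, not the paper's HR portal system.

```python
# Hedged sketch of the CBO metric over an assumed class-dependency map:
# each entry lists the classes that class directly uses.

uses = {
    "Employee":   {"Department", "Payroll"},
    "Department": {"Employee"},
    "Payroll":    {"Employee", "TaxTable"},
    "Report":     {"Employee"},
    "TaxTable":   set(),
}

def cbo(cls):
    """Distinct classes coupled to `cls`, in either direction."""
    coupled = set(uses[cls])                                    # classes it uses
    coupled |= {c for c, deps in uses.items() if cls in deps}   # classes using it
    coupled.discard(cls)
    return len(coupled)

for c in uses:
    print(c, cbo(c))
# The class with the highest CBO (here "Employee") would be flagged as a
# candidate "victim component" for reconfiguration.
```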
A MODEL TO COMPARE THE DEGREE OF REFACTORING OPPORTUNITIES OF THREE PROJECTS ... (acijjournal)
Refactoring is applied to software artifacts so as to improve their internal structure while preserving
their external behavior. Refactoring is an uncertain process, and it is difficult to give units for its
measurement. The amount of refactoring that can be applied to source code depends upon the skills of
the developer. In this research, we have perceived refactoring as a quantity on an ordinal scale of
measurement. We have proposed a model for determining the degree of refactoring opportunity in
given source code. The model is applied to three projects collected from a company. UML diagrams
are drawn for each project. The values of the source-code metrics that are useful in determining code
quality are calculated for each UML diagram of the projects. Based on the nominal values of the
metrics, each relevant UML diagram is represented on an ordinal scale. A machine learning tool, Weka,
is used to analyze the dataset produced by the three projects, imported in the form of an arff file.
Developing reusable software components for distributed embedded systems (eSAT Publishing House)
IJRET : International Journal of Research in Engineering and Technology is an international peer reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academician, Field Engineers, Scholars and Students of related fields of Engineering and Technology.
A Novel Optimization towards Higher Reliability in Predictive Modelling towar... (IJECEIAES)
Although the area of software engineering has made remarkable progress in the last decade, there has been less attention to the concept of code reusability. Code reusability is a subset of software reusability, which is one of the signature topics in software engineering. We review the existing literature and find that no standard research approach toward code reusability has been introduced in the last decade. Hence, this paper introduces a predictive framework for optimizing the performance of code reusability. For this purpose, we introduce a case study of a near-real-time challenge and involve it in our modelling. We apply a neural network and the damped least-squares algorithm to perform optimization, with the sole target of computing and ensuring the highest possible reliability. The outcome of our model exhibits higher reliability and better computational response time.
A FRAMEWORK STUDIO FOR COMPONENT REUSABILITY (cscpconf)
The deployment of a software product requires a considerable amount of time and effort. In order
to increase the productivity of software products, reusability strategies have been proposed in the
literature. However, effective reuse is still a challenging issue. This paper presents a framework
studio for effective component reusability, which provides the selection of components from the framework studio and the generation of source code based on stakeholders' needs. The framework studio is implemented using Swing, integrated into the NetBeans IDE, which helps in faster generation of the source code.
An Empirical Study of the Improved SPLD Framework using Expert Opinion Technique (IJEACS)
Due to the growing need for high-performance and low-cost software applications and increasing competitiveness, the industry is under pressure to deliver products with low development cost, reduced delivery time and improved quality. To address these demands, researchers have proposed several development methodologies and frameworks. One of the latest methodologies is the software product line (SPL), which utilizes concepts like reusability and variability to deliver successful products with shorter time-to-market and minimal development and maintenance cost, while producing a high-quality product. This research paper is a validation of our proposed framework, the Improved Software Product Line (ISPL), using the Expert Opinion Technique. An extensive survey based on a set of questionnaires on various aspects and sub-processes of the ISPLD Framework was carried out. Analysis of the empirical data concludes that ISPL shows significant improvements on several aspects of contemporary SPL frameworks.
A New Model for Study of Quality Attributes to Components Based Development A... (Kiogyf)
A New Model for Study of Quality Attributes to Components Based Development Approach
Abstract:
Software development cost, time-to-release and product quality are important factors affecting the construction of software. Different tools and techniques have been suggested by researchers to deliver quality software systems at lower cost and with reduced delivery time. One such practice is the development of software using Component Based Software Development (CBSD) techniques. CBSD recommends building software systems from existing reusable components instead of writing them from scratch. The main objective of CBSD is to write once and reuse any number of times with little or no modification. Among the advantages a company may gain by adopting CBSD are shorter development time, which helps meet tight deadlines, increased productivity and a quality product. CBSD also supports reusability. The aim of this paper is to develop a new model of software product development and to describe the characteristics of selected attributes of CBSD models that are widely practiced in the software industry. We propose a complete model for component based software development for reuse, named the A-Model, covering both the component based software development and the component development phases. This model represents one good solution for component based development that reduces cost and time to delivery while preserving product quality.
Keywords: Component Based Approach, Quality Model, Quality Attributes, A-Model for CBD.
Cosmetic shop management system project report.pdf (Kamal Acharya)
Buying new cosmetic products is difficult. It can even be scary for those who have sensitive skin and are prone to skin trouble. The information needed to alleviate this problem is on the back of each product, but it's tough to interpret those ingredient lists unless you have a background in chemistry.
Instead of buying and hoping for the best, we can use data science to help us predict which products may be good fits for us. The system includes various functional programs to carry out the tasks mentioned above.
Data file handling has been effectively used in the program.
The automated cosmetic shop management system should deal with the automation of general workflow and administration process of the shop. The main processes of the system focus on customer's request where the system is able to search the most appropriate products and deliver it to the customers. It should help the employees to quickly identify the list of cosmetic product that have reached the minimum quantity and also keep a track of expired date for each cosmetic product. It should help the employees to find the rack number in which the product is placed.It is also Faster and more efficient way.
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...Amil Baba Dawood bangali
Contact with Dawood Bhai Just call on +92322-6382012 and we'll help you. We'll solve all your problems within 12 to 24 hours and with 101% guarantee and with astrology systematic. If you want to take any personal or professional advice then also you can call us on +92322-6382012 , ONLINE LOVE PROBLEM & Other all types of Daily Life Problem's.Then CALL or WHATSAPP us on +92322-6382012 and Get all these problems solutions here by Amil Baba DAWOOD BANGALI
#vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore#blackmagicformarriage #aamilbaba #kalajadu #kalailam #taweez #wazifaexpert #jadumantar #vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore #blackmagicforlove #blackmagicformarriage #aamilbaba #kalajadu #kalailam #taweez #wazifaexpert #jadumantar #vashikaranspecialist #astrologer #palmistry #amliyaat #taweez #manpasandshadi #horoscope #spiritual #lovelife #lovespell #marriagespell#aamilbabainpakistan #amilbabainkarachi #powerfullblackmagicspell #kalajadumantarspecialist #realamilbaba #AmilbabainPakistan #astrologerincanada #astrologerindubai #lovespellsmaster #kalajaduspecialist #lovespellsthatwork #aamilbabainlahore #Amilbabainuk #amilbabainspain #amilbabaindubai #Amilbabainnorway #amilbabainkrachi #amilbabainlahore #amilbabaingujranwalan #amilbabainislamabad
Overview of the fundamental roles in Hydropower generation and the components involved in wider Electrical Engineering.
This paper presents the design and construction of hydroelectric dams from the hydrologist’s survey of the valley before construction, all aspects and involved disciplines, fluid dynamics, structural engineering, generation and mains frequency regulation to the very transmission of power through the network in the United Kingdom.
Author: Robbie Edward Sayers
Collaborators and co editors: Charlie Sims and Connor Healey.
(C) 2024 Robbie E. Sayers
Saudi Arabia stands as a titan in the global energy landscape, renowned for its abundant oil and gas resources. It's the largest exporter of petroleum and holds some of the world's most significant reserves. Let's delve into the top 10 oil and gas projects shaping Saudi Arabia's energy future in 2024.
Forklift Classes Overview by Intella PartsIntella Parts
Discover the different forklift classes and their specific applications. Learn how to choose the right forklift for your needs to ensure safety, efficiency, and compliance in your operations.
For more technical information, visit our website https://intellaparts.com
Immunizing Image Classifiers Against Localized Adversary Attacksgerogepatton
This paper addresses the vulnerability of deep learning models, particularly convolutional neural networks
(CNN)s, to adversarial attacks and presents a proactive training technique designed to counter them. We
introduce a novel volumization algorithm, which transforms 2D images into 3D volumetric representations.
When combined with 3D convolution and deep curriculum learning optimization (CLO), itsignificantly improves
the immunity of models against localized universal attacks by up to 40%. We evaluate our proposed approach
using contemporary CNN architectures and the modified Canadian Institute for Advanced Research (CIFAR-10
and CIFAR-100) and ImageNet Large Scale Visual Recognition Challenge (ILSVRC12) datasets, showcasing
accuracy improvements over previous techniques. The results indicate that the combination of the volumetric
input and curriculum learning holds significant promise for mitigating adversarial attacks without necessitating
adversary training.
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdffxintegritypublishin
Advancements in technology unveil a myriad of electrical and electronic breakthroughs geared towards efficiently harnessing limited resources to meet human energy demands. The optimization of hybrid solar PV panels and pumped hydro energy supply systems plays a pivotal role in utilizing natural resources effectively. This initiative not only benefits humanity but also fosters environmental sustainability. The study investigated the design optimization of these hybrid systems, focusing on understanding solar radiation patterns, identifying geographical influences on solar radiation, formulating a mathematical model for system optimization, and determining the optimal configuration of PV panels and pumped hydro storage. Through a comparative analysis approach and eight weeks of data collection, the study addressed key research questions related to solar radiation patterns and optimal system design. The findings highlighted regions with heightened solar radiation levels, showcasing substantial potential for power generation and emphasizing the system's efficiency. Optimizing system design significantly boosted power generation, promoted renewable energy utilization, and enhanced energy storage capacity. The study underscored the benefits of optimizing hybrid solar PV panels and pumped hydro energy supply systems for sustainable energy usage. Optimizing the design of solar PV panels and pumped hydro energy supply systems as examined across diverse climatic conditions in a developing country, not only enhances power generation but also improves the integration of renewable energy sources and boosts energy storage capacities, particularly beneficial for less economically prosperous regions. Additionally, the study provides valuable insights for advancing energy research in economically viable areas. 
Recommendations included conducting site-specific assessments, utilizing advanced modeling tools, implementing regular maintenance protocols, and enhancing communication among system components.
Hybrid optimization of pumped hydro system and solar- Engr. Abdul-Azeez.pdf
A Survey of Software Reusability
Rohit Patidar, Int. Journal of Engineering Research and Applications (IJERA), www.ijera.com, ISSN: 2248-9622, Vol. 4, Issue 8 (Version 2), August 2014, pp. 96-101
A Survey of Software Reusability
Rohit Patidar, Prof. Virendra Singh
Computer Science, IIST Indore, Indore, India
Abstract: Reusability is one of the most effective ways to increase development productivity and the maintainability of an application. One must first search for well-tested, reusable software components. Application software developed by one programmer can prove useful to others as a component. This demonstrates that code written for one application's requirements can also be reused in other projects with similar requirements. The main aim of this paper is to propose a way to identify reusable modules: a process that takes source code as input and helps decide which particular software artefacts should or should not be reused. Keywords: software reuse, reusability, metrics, CK metrics, cyclomatic complexity
I. INTRODUCTION
Software reusability is the degree to which a software module or other work product can be used in more than one computer system or software program. Many believe software reusability provides the key to substantial benefits and savings in software development. The U.S. Department of Defense alone could save $300 million annually by increasing its level of reuse by as little as 1% [1]. Measurement may help us not only to learn how to build reusable components but also to identify reusable components among the wealth of existing programs. Existing programs contain the knowledge and experience gained from working in a particular application domain and meeting an organization's software needs. If we could extract this information efficiently, we would gain a valuable resource upon which to build future applications. Reusing components from an existing system saves a great deal of development time, reduces the cost of software development, and can also improve software performance. The objective of any organization is to deliver a good-quality product at low cost, and today everyone is interested in increasing productivity, reducing development cost, and providing better-quality software. There are various quality attributes by which software quality can be assessed, and reusability, i.e. reusing existing components, is one way to increase productivity, because a great deal of time and effort has already been expended on components that are well tested and well designed [15]. The problem is that modules are often not developed with reuse in mind, which leads to longer product development time and higher cost. Thus, one should examine which existing components or modules are most suitable for reuse, and try to reuse them.
II. REUSABILITY
Reusability measurement allows us to identify reusable modules in existing programs and guides how to build new ones. Departments and organizations that increase their level of software reuse reduce the cost and time taken to develop software: the U.S. Department of Defense could save $300 million by increasing software reuse by just 1%. Existing software programs contain the knowledge and experience of developers who are skilled in a particular application domain, so extracting information from existing programs that matches the requirements of a new software system is beneficial for the organization. There are many examples where reusability has helped:
The Missile Systems Division (MSD) increased productivity by 50% by applying the software reuse concept.
The U.S. Navy used reusable modules to reduce by 26% the labour required to develop and maintain the Restructured Naval Tactical Data Systems (RNTDS).
Magnavox, using reusable modules to develop the Force Fusion System Prototype (FFSP), reduced development time by 20% of the estimate for developing a new system.
Reuse in the software development life cycle is not limited to the coding stage; it applies from the requirements stage through to the final stage of development. Several kinds of software development artefacts can be reused:
Code
Requirement
Architecture/design documentation
Test plans
Specifications
Design
Manuals
Templates
Design decisions
III. PREVIOUS WORK
Measuring reusability helps developers control the current level of reuse and provides metrics for one of the important quality attributes. Conte [14], Boehm [12], Bailey [2], and Fenton [3] describe reusability measures based on comparisons between the size of the newly written code in a software product and the size of the reused code. Modification of the reused code is not considered by these researchers. Conte's reuse measure estimates coding effort: reusability reduces coding effort, and these reductions affect effort-estimation techniques. Boehm and Bailey use the size of reused code to adjust cost predictors in a standardized way, and Fenton develops a method that measures reuse based on the dependencies in an associated call graph. None of the measures above supports modification of the reused code. Selby [16] provides an approach that does: he classifies each module into a category based on the percentage of reused code that is modified to make the new module. These categories are:
A completely new module.
A reused module with 25 percent or more of its code changed.
A reused module with less than 25 percent of its code changed.
A module that is reused without change.
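Selby's categorization can be sketched as a small function (a hypothetical illustration; the category names and the 25% threshold follow the list above):

```python
def classify_reuse(percent_changed=None):
    """Classify a module by the percentage of its reused code that was
    modified; percent_changed=None marks a completely new module."""
    if percent_changed is None:
        return "complete new module"
    if percent_changed == 0:
        return "reused without change"
    if percent_changed >= 25:
        return "reused with >= 25% change"
    return "reused with < 25% change"
```

For example, `classify_reuse(10)` places a module with 10% of its reused code modified into the third category.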
All the measures above are based on only one attribute, program size (length); Selby adds another attribute, modification. [6] Reuse measure derivation: as its title indicates, this paper describes how to derive measures of software reuse in object-oriented systems [5]. The method used to derive the measures comes from measurement theory and is described as follows:
Identify the real-world attributes of software reusability to be considered; we must first qualitatively understand what we want to evaluate.
Determine precisely the attributes to be measured and the documents involved; the measurement must specify its indications and aims with high precision.
Develop formal models at a higher level of abstraction that describe the attributes; formal definitions allow these abstractions to be evaluated quantitatively without ambiguity.
Devise the relationships between the model's attributes and the quantitative measures; these relationships must be consistent with the specification of the attributes and their quantitative values.
A. Assessing Module Reusability:
This paper proposes a conceptual model for estimating the reusability of a module. Reusability [7] is determined as the product of the module's applicability and its functionality, i.e. the model defines reuse as a function of applicability and functionality.
B. Functionality:
The functionality of a module is determined by the number of situations in which the module can be used, based on its specification. A module that addresses important, common problems gets a high functionality score, while a module that covers a few specific, rare problems gets a low one.
C. Applicability:
Applicability measures the proportion of situations in which a module can actually be reused. If the module can be applied in only 50% of the situations where its specified features are needed, its applicability is 50%. The applicability of a module is 100% when, in any situation that calls for features provided by the module, the module can actually be used. There are many reasons why a module's applicability might be less than 100%, including:
Technical limitations, such as programming languages and platforms.
Incompatibilities in interface details.
Incompatibilities in external dependencies.
Architectural mismatches.
IV. DEFINITIONS OF REUSE TYPES
Bieman et al. [6] identified various types of reuse, defining three pairs of reuse types and three perspectives from which reuse can be viewed. These are explained as follows:
Public Reuse:
Fenton defines public reuse as "the proportion of a product which was constructed externally" [3]:
Public reuse = length(E) / length(P)
where E is the code developed externally and P is the new system, including E.
Private Reuse:
Fenton defines private reuse (or, perhaps more appropriately, internal reuse) as the degree to which modules within a product are reused within that same product [3]. Fenton uses a call graph, which represents the calling relationships between modules: each node represents a module, and an edge between two nodes indicates that one module calls the other. Fenton's formula for private reuse in a call graph is:
R(G) = e - n + 1
where e is the total number of edges in the graph and n is the number of nodes.
Leveraged Reuse:
Leveraged reuse means that modification of the reused entity is allowed.
Direct Reuse:
Direct reuse is reuse without an intermediate entity: one module directly calls another module.
Verbatim Reuse:
Verbatim reuse means that modification of the reused entity is not allowed.
Indirect Reuse:
Indirect reuse is reuse through an intermediate entity: when a first module calls a second module and the second module calls a third, the first module indirectly calls the third.
A. Three Perspectives on Reuse:
Bieman [6] provides three perspectives from which reuse can be viewed. These are described as follows:
Server Perspective: the perspective of the library component is known as the server perspective. Server reuse of a class characterizes how that class is used by other classes.
Client Perspective: the client perspective describes how one particular entity uses another entity, i.e. how the new system uses the existing system.
System Perspective: the combination of both the server perspective and the client perspective.
V. PROPERTIES OF SOFTWARE WHICH AFFECT REUSABILITY
Some software attributes affect reuse. The relationships between these attributes and reusability are explained as follows:
Complexity: when the complexity of a class is high, or the developer uses a complicated structure to build it, that class is difficult to understand and difficult to reuse.
Complexity of interface: a complicated interface makes reuse difficult.
Class size: when a class is large, it is difficult to understand and difficult to reuse.
Dependencies: dependency of a single module on various other modules may also make reuse more difficult.
Reusability depends on adaptability, portability, maintainability, reliability, and understandability [4]. We are dealing with Java programs, so portability is not an issue for us. Complexity is of two types, structural complexity and inheritance complexity. Because we work with static code, we do not consider reliability as a factor affecting reusability, since reliability is measured in terms of the average time between errors observed during execution of the program. Understandability depends on structural complexity, the documentation level of the program, and its size.
Fig. 1 Factors on which Reusability Depends
A. Understandability:
Understandability is the degree to which the meaning of a software system or module is apparent to a developer or user. Understandability depends on the following elements: documentation level, size, and complexity. When a module is well documented, its understandability is high; a module with more comment lines lets a new developer understand the code easily, since what each function does is described at its start. Understandability also depends on the size of the module: a large module is inherently difficult to understand. If the complexity of a module is high, the module is difficult to understand; we say a module is more complex when it contains more composite data structures and more decision statements. Complexity is of two types: the first is structural complexity, which is easily measured with the WMC metric, and the second is inheritance complexity, which is measured using the DIT and NOC metrics. When a program uses the inheritance concept, these metrics are used to measure its inheritance complexity [9].
Fig. 2 Factors and Metrics on which Understandability Depends
Lines of Code (LOC): this metric measures the size of a program by counting the number of lines in it. LOC counts all lines: source statements, comment lines, and blank lines.
Comment Percentage (CP): CP is computed as the number of comment lines divided by the lines of code. A high CP value increases maintainability and understandability.
CP = Comment Lines / LOC
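The two size metrics can be computed with a simple line scan. The sketch below assumes, for illustration only, that comments are Java-style `//` lines; block comments are ignored:

```python
def loc_and_cp(source):
    """Return (LOC, CP) for a source string: LOC counts all lines,
    CP is the number of comment lines divided by LOC."""
    lines = source.splitlines()
    loc = len(lines)
    comments = sum(1 for line in lines if line.strip().startswith("//"))
    return loc, comments / loc

sample = "// increment x\nint x = 0;\nx = x + 1;\n// done"
print(loc_and_cp(sample))  # (4, 0.5)
```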
Weighted Methods per Class (WMC): this metric is used to calculate the structural complexity of a program [13]. Method complexity is measured using cyclomatic complexity, and WMC is the sum of the complexities of all methods defined in the class. Suppose a class has methods (m1, m2, m3, ..., mn) with complexities (c1, c2, c3, ..., cn); then
WMC = c1 + c2 + c3 + ... + cn
Cyclomatic complexity has its foundation in graph theory and can be computed in one of three ways: as the number of regions in the flow graph; from the flow graph as
C(G) = E - N + 2
where N is the number of nodes in the graph and E is the number of edges; or as
C(G) = P + 1
where P is the number of predicate nodes in the graph. Statements where a decision is taken are called predicate nodes.
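Both complexity formulas, and WMC as their sum over a class's methods, can be sketched as follows. As a check, a single if-else has E = 4, N = 4, and P = 1, so C(G) = 2 by either formula:

```python
def cyclomatic_from_graph(edges, nodes):
    """C(G) = E - N + 2 for a connected control-flow graph."""
    return edges - nodes + 2

def cyclomatic_from_predicates(predicates):
    """C(G) = P + 1, where P is the number of predicate (decision) nodes."""
    return predicates + 1

def wmc(method_complexities):
    """Weighted Methods per Class: WMC = c1 + c2 + ... + cn."""
    return sum(method_complexities)

print(cyclomatic_from_graph(4, 4))    # 2 (simple if-else)
print(cyclomatic_from_predicates(1))  # 2 (one decision node)
print(wmc([2, 1, 3]))                 # 6
```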
Depth of Inheritance Tree (DIT):
This metric measures the inheritance complexity of a program and applies whenever the programmer uses inheritance. DIT is the maximum depth from the root node of the inheritance tree to a particular node, where each class is represented as a node. A node deeper in the tree inherits more methods from more ancestor classes, which makes the class more complex [13].
Number of Children (NOC):
NOC counts the immediate subclasses of a particular class in the class hierarchy. When a class has more children, it requires more testing, because the superclass may be misused [13].
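Both inheritance metrics can be read off a parent map (a hypothetical single-inheritance hierarchy; the class names are made up for illustration):

```python
# Each class maps to its direct superclass; None marks a root class.
parents = {"Shape": None, "Circle": "Shape", "Square": "Shape",
           "FilledCircle": "Circle"}

def dit(cls):
    """Depth of Inheritance Tree: steps from cls up to the root."""
    depth = 0
    while parents[cls] is not None:
        cls = parents[cls]
        depth += 1
    return depth

def noc(cls):
    """Number of Children: count of direct subclasses of cls."""
    return sum(1 for p in parents.values() if p == cls)

print(dit("FilledCircle"))  # 2
print(noc("Shape"))         # 2
```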
Public Interface Size:
Public interface size is the number of public methods provided by a class. It describes how much other classes can use that class's methods.
B. Maintainability
Maintainability is the degree to which a system or software module can be modified easily in order to fix bugs, add quality attributes, adjust to changes in the operating environment, or increase the efficiency of the system. Maintainability depends on the following factors: modularity, size, and complexity [8, 9]. We say a module is more complex if it contains more decision statements and more complex data structures; when the complexity of a module is high, the module is difficult to maintain. Modularity is measured using coupling and cohesion metrics, and maintainability depends heavily on modularity. Modularity is the idea of managing the complexity of a complex system by dividing it into various small modules that communicate through interfaces; by using modularity we obtain a clear and simple system. A highly cohesive module is easy to maintain, whereas a module with many local methods, or a large module, is difficult to maintain; if the coupling of a module with other modules is low, the module is easily modified and easy to maintain. Maintainability also depends on the cohesiveness of the module. If we write code for a module and are interested in measuring its quality, the best criterion is the maintainability quality factor. Maintainable code is more flexible; code is not maintainable when a module performs several unrelated functions and invokes many other modules [10]. A module is more maintainable if we can easily add new functionality and change existing functionality. The metrics used for calculating the maintainability of a program are shown in Figure 3.
Fig. 3 Factors and Metrics on which Maintainability Depends
C. Adaptability
Adaptability determines how easily software satisfies the requirements of new environments, new user needs, and new system constraints arising from the existing system. When the business environment or business requirements change suddenly, adaptability is an important asset for handling the situation; market conditions change frequently, so a software system should be adaptable enough to satisfy new requirements. This does not mean that any software built with object-oriented programming is automatically adaptable, but object-oriented concepts provide the ability to build adaptable software [8]. When the coupling of a module is low and its cohesion is high, the module can easily be adapted from the old environment to a new one; to make a module adaptable, we concentrate on its cohesion and coupling. The metrics used for calculating the adaptability of a program are shown in Figure 4.
Fig. 4 Factors and Metrics on which Adaptability Depends
Coupling: coupling, also called dependency, is the degree to which one module depends on, i.e. uses the functionality of, another module. Coupling is an important factor in determining the quality of a design or of software; the goal of a good designer or programmer is to achieve low coupling [11].
Cohesion: cohesion signifies that a class performs a set of closely related actions, while a lack of cohesion means that a class performs various unrelated tasks. A principle of object orientation is to reduce the coupling between modules and increase the cohesion within each module, which is also beneficial from an architectural point of view. Cohesiveness means each function in the class performs one thing; when this holds, a new designer can easily understand what the class does [11].
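Simple stand-ins for the two measures can be sketched as follows (these are illustrative definitions, not the specific coupling and cohesion metrics referenced in [11]): coupling as the number of distinct other modules a module uses, and cohesion as the fraction of method pairs in a class that share at least one attribute.

```python
from itertools import combinations

def coupling(module, uses):
    """Number of other modules that `module` depends on; `uses` maps
    each module to the set of modules it calls."""
    return len(uses.get(module, set()))

def cohesion(method_attrs):
    """Fraction of method pairs sharing at least one attribute
    (1.0 for a class with fewer than two methods)."""
    pairs = list(combinations(method_attrs.values(), 2))
    if not pairs:
        return 1.0
    return sum(1 for a, b in pairs if a & b) / len(pairs)

uses = {"Billing": {"Db", "Log"}, "Db": set()}
print(coupling("Billing", uses))  # 2
attrs = {"m1": {"x"}, "m2": {"x", "y"}, "m3": {"z"}}
print(cohesion(attrs))            # 0.333... (only m1 and m2 share x)
```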
VI. APPROACH FOR IDENTIFICATION OF REUSABLE MODULES
Extract the source code: in this phase we analyze the source code, extract the useful information, and store it in memory. This information is necessary for calculating all the metrics, and the metrics in turn are needed to evaluate the factors on which reusability depends.
Calculate the metrics: in this phase we calculate all the metrics described in the sections above, using the information gathered during extraction, and store the results in memory. All the metrics concern object-oriented systems.
Display: in this phase we calculate the reusability of the source code. We give a weight to each metric, determine the overall reusability of the source code, and display whether the source code is reusable or not.
Fig. 5 Steps followed to identify a reusable module
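The display phase's weighted combination can be sketched as follows (the factor names, weights, normalized scores, and acceptance threshold are all hypothetical; the paper does not fix concrete values):

```python
def reusability_score(scores, weights):
    """Weighted average of normalized metric scores in [0, 1]."""
    total = sum(weights.values())
    return sum(scores[k] * w for k, w in weights.items()) / total

weights = {"understandability": 0.4, "maintainability": 0.35,
           "adaptability": 0.25}
scores = {"understandability": 0.8, "maintainability": 0.6,
          "adaptability": 0.5}

score = reusability_score(scores, weights)
print(round(score, 3))  # 0.655
print(score >= 0.6)     # True: report the module as reusable
```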
VII. CONCLUSION
The purpose of this paper is to find an approach for calculating the reusability of object-oriented programs. Reusability is a quality attribute of prime importance in object-oriented software development, as it leads to increased developer productivity, reduced development cost, and reduced time to market. The work presented in this paper can be used effectively to calculate the reusability of any object-oriented software module.
REFERENCES
[1] Gary Anthes, "Software Reuse Plans Bring Paybacks", Computerworld, Vol. 27, No. 49, pp. 73-76.
[2] J.W. Bailey and V.R. Basili, "A meta-model for software development resource expenditures", Proc. Fifth Int. Conf. on Software Engineering, pp. 107-116, 1981.
[3] Norman Fenton, "Software Metrics: A Rigorous Approach", Chapman & Hall, London, 1991.
[4] Software Reusability, Vol. II: Applications and Experience, Addison-Wesley, 1989.
[5] http://www.indiawebdevelopers.com/articles/reusability.asp
[6] James M. Bieman, "Deriving Measures of Software Reuse in Object Oriented Systems", Springer-Verlag, 1992, pp. 79-82.
[7] Chris Luer, “Assessing Module Reusability”, First International Workshop on Assessment of Contemporary Modularization techniques (ACoM'07).
[8] Dandashi F., “A Method for Assessing the Reusability of Object-oriented Code Using a Validated Set of Automated Measurements”, ACM 2002 pp 997-1003.
[9] Young Lee and Kai H. Chang, “Reusability and Maintainability Metrics for object oriented software”, ACM 2002 pp 88 – 94.
[10] Jeffrey S. Poulin “Measuring Software Reusability”, IEEE 1994 pp 126- 138.
[11] http://en.wikipedia.org/wiki/Coupling_(computer science)
[12] B. W. Boehm. “Software Engineering Economics” .Prenntice Hall, Englewood Cliffs, NJ, 1981.
[13] Shyam R. Chidamber and Chris F. Kemerer, "A metrics suite for object oriented design", 1993.
[14] S.D. Conte, H.E. Dunsmore, and V.Y. Shen, "Software Engineering Metrics and Models", Benjamin/Cummings, Menlo Park, California, 1986.
[15] M. Burgin. H. K. Lee. N. Debnath, “Software Technological Roles, Usability, and Reusability, Dept. of Math”. California Univ., Los Angeles, 2004.
[16] Richard W. Selby, "Quantitative studies of software reuse", in Ted J. Biggerstaff and Alan J. Perlis, editors,