Requirements form the basis of every software product. When requirements are scattered across development teams, they are often imprecisely stated; as a result, software applications are released with bugs, missing functionality, or loosely implemented requirements. The literature offers only a limited number of related works developed as tools for software requirements inspection. This paper presents a methodology to verify that the system design fulfils all functional requirements. The proposed approach comprises three phases: requirements collection, facts collection, and a matching algorithm. The resulting feedback enables analysts and developers to make a decision about the initial application release while taking missing requirements or over-designed requirements into consideration.
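The abstract does not reproduce the matching algorithm itself; the following is a minimal sketch of the idea, assuming requirements and design facts are plain-text statements and using simple keyword overlap (the threshold, tokenization, and all names are illustrative assumptions, not the paper's method):

```python
# Hypothetical sketch: a requirement counts as covered when some design fact
# shares enough of its keywords; facts matching no requirement hint at
# over-designed functionality. Threshold and scoring are assumptions.

def tokens(text):
    return {w.lower().strip(".,;") for w in text.split() if len(w) > 3}

def match(requirements, facts, threshold=0.3):
    covered, missing = [], []
    for req in requirements:
        rt = tokens(req)
        best = max((len(rt & tokens(f)) / len(rt) for f in facts), default=0.0)
        (covered if best >= threshold else missing).append(req)
    over = [f for f in facts
            if all(len(tokens(f) & tokens(r)) / max(len(tokens(r)), 1) < threshold
                   for r in requirements)]
    return covered, missing, over

reqs = ["The system shall export monthly reports as PDF files"]
facts = ["ReportExporter class renders monthly reports to PDF"]
print(match(reqs, facts))  # (covered, missing, over-designed)
```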
AN IMPROVED REPOSITORY STRUCTURE TO IDENTIFY, SELECT AND INTEGRATE COMPONENTS... (ijseajournal)
An ultimate goal of software development is to build high-quality products. Customers of the software industry always demand high-quality products delivered quickly and cost-effectively. Component-based development (CBD) is the most suitable methodology for software companies to meet the demands of the target market. To adopt CBD, software development teams have to customize generic components available in the market, and it is very difficult for them to choose suitable components from the millions of third-party and commercial off-the-shelf (COTS) components. On the other hand, developing an in-house repository is tedious and time-consuming. In this paper, we propose an easy and understandable repository structure that provides helpful information about stored components, such as how to identify, select, retrieve, and integrate them. The proposed repository will also provide previous assessments by developers and end-users of the selected component. It will help software companies by reducing customization effort, improving the quality of developed software, and preventing the integration of unfamiliar components.
An interactive approach to requirements prioritization using quality factors (ijfcstjournal)
As the prevalence of software increases, so do the complexity and the number of requirements associated with a software project. This presents a dilemma for developers, who must clearly identify and prioritize the most important requirements in order to deliver the project within a given amount of resources and time. A number of prioritization methods have been proposed that provide consistent results, but they are very difficult and complex to apply in practical scenarios and lack a proper structure for analyzing the requirements. In this study, users can provide their requirements in two forms: text-based story form and use case form. Moreover, the existing prioritization techniques allow little or no interaction with users. This paper therefore attempts to make the prioritization process user-interactive by adding a second level of prioritization: after the developer has analyzed and ranked the requirements on the basis of quality attributes in the first level, the opinions of distinct users about the requirements priority sequence are collected. The developer then calculates the disagreement value associated with each user sequence in order to determine the final priority sequence.
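The abstract does not give the disagreement formula; one plausible sketch measures the disagreement between the developer's priority sequence and each user's sequence as the number of requirement pairs ranked in opposite order (a Kendall-tau-style count, which is an assumption here, not the paper's definition):

```python
from itertools import combinations

def disagreement(dev_order, user_order):
    # Count requirement pairs that the two sequences rank in opposite order.
    # Both arguments list requirement IDs, highest priority first.
    pos = {req: i for i, req in enumerate(user_order)}
    return sum(1 for a, b in combinations(dev_order, 2) if pos[a] > pos[b])

dev = ["R1", "R2", "R3", "R4"]
users = [["R1", "R3", "R2", "R4"], ["R2", "R1", "R3", "R4"]]
for u in users:
    print(u, disagreement(dev, u))  # one inversion each
# The final sequence could then be chosen to minimize total disagreement.
```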
FROM THE ART OF SOFTWARE TESTING TO TEST-AS-A-SERVICE IN CLOUD COMPUTING (ijseajournal)
Researchers consider that the first edition of the book "The Art of Software Testing" by Myers (1979)
initiated research in Software Testing. Since then, software testing has gone through evolutions that have
driven standards and tools. This evolution has kept pace with the growing complexity and variety of software deployment platforms. Migration to the cloud has brought benefits such as scalability, agility, and better return on investment. Cloud computing requires deeper involvement in software testing to ensure
that services work as expected. In addition to testing cloud applications, cloud computing has paved the
way for testing in the Test-as-a-Service model. This review aims to understand software testing in the
context of cloud computing. Based on the knowledge explained here, we sought to linearize the evolution of
software testing, characterizing fundamental points and allowing us to compose a synthesis of the body of
knowledge in software testing, expanded by the cloud computing paradigm.
Insights on Research Techniques towards Cost Estimation in Software Design (IJECEIAES)
Software cost estimation is one of the most challenging tasks in project management, required to ensure smooth development operations and target achievement. Various standard tools and techniques for cost estimation have evolved and are practiced in industry today. However, the overall picture of the effectiveness of such techniques has not been investigated to date. This paper begins by presenting taxonomies of conventional cost-estimation techniques and then investigates research trends in the problems they most frequently address. The paper also reviews the existing techniques in a well-structured manner, highlighting the problems addressed, the techniques used, the advantages associated with them, and the limitations explored in the literature. Finally, we briefly describe the open research issues identified, as an added contribution of this manuscript.
One of the core quality assurance features, combining fault prevention and fault detection, is often known as the testability approach. Many assessment techniques and quantification methods have evolved for software testability prediction, which identify testability weaknesses or factors in order to help reduce test effort. This paper examines the measurement techniques that have been proposed for software testability assessment at various phases of the object-oriented software development life cycle. The aim is to find the metrics suite best suited to improving software quality through testability support. The ultimate objective is to lay the groundwork for reducing testing effort by improving software testability and its assessment, using well-planned guidelines for object-oriented software development with the help of suitable metrics.
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA... (ijseajournal)
During the software development phase, software artifacts are not in consistent states: some class artifacts are fully developed, some are half developed, some are largely developed, some are only slightly developed, and some are not developed yet. At this stage, allowing too many software requirement changes may delay project delivery and increase the development budget; on the other hand, rejecting too many changes may increase customer dissatisfaction. Software change effort estimation is one of the most challenging and important activities that help software project managers accept or reject changes during the development phase. This paper extends our previous work on a software requirement change effort estimation model prototype tool for the software development phase. The tool's achievements are demonstrated through extensive experimental validation using several case studies. The experimental analysis shows an improvement in estimation accuracy over current change effort estimation models.
Contributors to Reduce Maintainability Cost at the Software Implementation Phase (Waqas Tariq)
Software maintenance is important and difficult to measure, and its cost is the highest of all the phases of software development. One of the most critical processes in software development is reducing software maintainability cost through source code quality at the design step; however, quality models and measures that can help assess the quality attributes of the software maintainability process are lacking. Software maintainability suffers from a number of challenges, such as poor source code understanding, low quality of software code, and weak adherence to programming standards during maintenance. This work describes model-based factors for assessing software maintenance and explains the steps followed to obtain and validate them. Such a method can be used to reduce software maintenance cost. The results will enhance the quality of the source code, increase software understandability, reduce maintenance time and cost, and give confidence in software reusability.
Requirement Engineering Challenges in Development of Software Applications an... (Waqas Tariq)
Requirement engineering acts as the foundation for any software and is one of its most important tasks. The entire software is supported by the four pillars of the requirement engineering processes. Functional and non-functional requirements work as bricks to support the software edifice; design, implementation, and testing then add stories to construct the entire software tower on top of this foundation. Thus, the base needs to be well built to support the rest of the tower, and requirement engineers face numerous challenges in developing successful software. This paper highlights requirement engineering challenges encountered in the development of software applications and in the selection of the right commercial off-the-shelf (COTS) components. Comprehending stakeholders' needs; incomplete and inconsistent process descriptions; verification and validation of requirements; classification and modeling of extensive data; and selection of a COTS product with minimum requirement modifications are the foremost challenges faced during requirement engineering. Moreover, the paper discusses and critically evaluates challenges highlighted by various researchers. It also presents a model that encapsulates seven major challenges that recur during the requirement engineering phase; these challenges are further categorized into problems. Furthermore, the model is linked with previous research to elaborate challenges that have not been specified earlier. Anticipating requirement engineering challenges can help requirement engineers protect the software tower from destruction.
DESQA: a Software Quality Assurance Framework (IJERA Editor)
In current software development life cycles in heterogeneous environments, the pitfall businesses face is that software defect tracking, measurement, and quality assurance do not start early enough in the development process. The cost of fixing a defect in a production environment is much higher than in the initial phases of the Software Development Life Cycle (SDLC), which is particularly true for Service-Oriented Architecture (SOA). The aim of this study is therefore to develop a new framework for defect tracking and detection and for quality estimation in the early stages, particularly the design stage, of the SDLC. Part of the objective of this work is to conceptualize, borrow, and customize from known frameworks, such as object-oriented programming, to build a solid framework that uses automated rule-based intelligent mechanisms to detect and classify defects in SOA software designs. The implementation demonstrates how the framework can predict the quality level of the designed software. The results show that a good level of quality estimation can be achieved based on the number of design attributes, the number of quality attributes, and the number of SOA design defects. The assessment shows that metrics provide guidelines indicating the progress a software system has made and the quality of its design; using these guidelines, we can develop more usable and maintainable software systems that fulfill the demand for efficient software applications. Another valuable finding of this study is that developers try to keep backwards compatibility when they introduce new functionality, and sometimes perform necessary breaking changes to the same newly introduced elements in future versions, giving their clients time to adapt their systems. This is a valuable practice because it gives developers more time to assess the quality of their software before releasing it. Further improvements in this research include the investigation of other design attributes and SOA design defects, which can be computed by extending the tests we performed.
STATE-OF-THE-ART IN EMPIRICAL VALIDATION OF SOFTWARE METRICS FOR FAULT PRONEN... (IJCSES Journal)
With the sharp rise in software dependability and failure cost, high quality has been in great demand. However, guaranteeing high quality in software systems that have grown in size and complexity, under the constraints imposed on their development, has become an increasingly difficult, time- and resource-consuming activity. Consequently, it becomes essential to deliver software with no serious faults. Object-oriented (OO) products, the de facto standard of software development, can contain faults that are hard to find, and the impacts of changes can be hard to pinpoint; the earlier faults are identified and fixed, the lower the costs and the higher the quality. Software metrics are used to assess product quality, and many OO metrics have been proposed and developed. Furthermore, many empirical studies have validated the relationship between metrics and class fault proneness (FP). The challenge is knowing which metrics are related to class FP and what activities are performed. This study therefore brings together the state-of-the-art in FP prediction using the CK and size metrics. We conducted a systematic literature review of relevant published empirical validation articles, and the results are analysed and presented. They indicate that 29 relevant empirical studies exist and that measures such as complexity, coupling, and size are strongly related to FP.
With interconnectivity between IT service providers and their customers and partners growing, fueled by the proliferation of IT services outsourcing, and with some providers gaining leading positions in today's marketplace, teams tasked with delivering integration projects face challenges in achieving the desired efficiencies in both cost and schedule. Such integrations are growing in both volume and complexity. Integrations between different autonomous systems, such as the workflow systems of providers and their customers, are an important element of this emerging paradigm. In this paper we present an efficient model for implementing such interfaces between autonomous workflow systems, with close attention given to the major phases of these projects: requirement gathering and analysis, configuration and coding, validation and verification, several levels of testing, and finally deployment. By deploying a comprehensive strategy and implementing it in a real corporate environment, a 10%-20% year-over-year reduction in cost and schedule was achieved for the past several years, primarily by improving testing techniques and detecting bugs earlier in the development life cycle. Some practical considerations are outlined, in addition to detailing the strategy for testing in the autonomous system integration domain.
A Complexity Based Regression Test Selection Strategy (CSEIJJournal)
Software is unequivocally the foremost and indispensable entity in this technologically driven world. Quality assurance, and in particular software testing, is therefore a crucial step in the software development cycle. This paper presents an effective test selection strategy that uses a Spectrum of Complexity Metrics (SCM). Our aim is to increase the efficiency of the testing process by significantly reducing the number of test cases without a significant drop in test effectiveness. The strategy makes use of a comprehensive taxonomy of complexity metrics based on the product level (class, method, statement) and its characteristics. We use a series of experiments based on three applications with a significant number of mutants to demonstrate the effectiveness of our selection strategy. For further evaluation, we compare our approach to boundary value analysis. The results show the capability of our approach to detect the mutants as well as the seeded errors.
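The SCM taxonomy is not detailed in this abstract; as a rough sketch of complexity-driven selection, assuming each test case is annotated with the methods it covers and each method carries a complexity score, tests can be picked greedily by the complexity they add (the greedy scheme and all names below are illustrative assumptions, not the paper's strategy):

```python
def select_tests(tests, complexity, budget):
    # Greedy pick: each round, take the test adding the most yet-uncovered
    # complexity (e.g. cyclomatic). An illustrative stand-in only.
    covered, selected = set(), []
    for _ in range(budget):
        gain = {t: sum(complexity[m] for m in ms - covered)
                for t, ms in tests.items() if t not in selected}
        if not gain:
            break
        best = max(gain, key=gain.get)
        if gain[best] == 0:
            break
        selected.append(best)
        covered |= tests[best]
    return selected

tests = {"t1": {"parse", "eval"}, "t2": {"eval"}, "t3": {"render"}}
complexity = {"parse": 8, "eval": 5, "render": 2}
print(select_tests(tests, complexity, budget=2))  # ['t1', 't3']
```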
A Review of Agile Software Effort Estimation Methods (Editor IJCATR)
Software cost estimation is an essential aspect of software project management, and the success or failure of a software project therefore depends on the accuracy of its effort, time, and cost estimates. Software cost estimation is a scientific activity that requires knowledge of a number of relevant attributes that determine which estimation method to use in a given situation. Over the years, various studies have evaluated software effort estimation methods; however, owing to the introduction of new software development methods, those reviews have not captured them. Agile software development is one of the recent popular methods not taken into account in previous cost estimation reviews. The main aim of this paper is to review existing software effort estimation methods exhaustively, exploring estimation methods suitable for new software development methods.
Bibliometric analysis highlighting the role of women in addressing climate ch... (IJECEIAES)
Fossil fuel consumption has increased quickly, contributing to climate change that is evident in unusual flooding, droughts, and global warming. Over the past ten years, women's involvement in society has grown dramatically, and they have succeeded in playing a noticeable role in reducing climate change. A bibliometric analysis of data from the last ten years has been carried out to examine the role of women in addressing climate change. The findings are discussed in relation to the sustainable development goals (SDGs), particularly SDG 7 and SDG 13. The results consider contributions made by women in various sectors while taking geographic dispersion into account. The bibliometric analysis delves into topics including women's leadership in environmental groups, their involvement in policymaking, their contributions to sustainable development projects, and the influence of gender diversity on attempts to mitigate climate change. The results highlight how women have influenced policies and actions related to climate change, point out areas of research deficiency, and offer recommendations on how to increase the role of women in addressing climate change and achieving sustainability. This initiative aims to highlight the significance of gender equality and encourage inclusivity in climate change decision-making processes.
Voltage and frequency control of microgrid in presence of micro-turbine inter... (IJECEIAES)
Active and reactive load changes have a significant impact on voltage and frequency. In this paper, in order to stabilize the microgrid (MG) against load variations in islanding mode, the active and reactive power of all distributed generators (DGs), including energy storage (battery), a diesel generator, and a micro-turbine, are controlled. The micro-turbine generator is connected to the MG through a three-phase to three-phase matrix converter, and the droop control method is applied to control the voltage and frequency of the MG. In addition, a method is introduced for voltage and frequency control of micro-turbines in the transition from grid-connected mode to islanding mode. A novel switching strategy of the matrix converter is used to convert the high-frequency output voltage of the micro-turbine to the grid-side frequency of the utility system. Moreover, with this switching strategy, low-order harmonics in the output current and voltage are not produced, and consequently the size of the output filter can be reduced. The suggested control strategy is load-independent and has no frequency conversion restrictions. The proposed approach for voltage and frequency regulation demonstrates exceptional performance and favorable response across various load alteration scenarios. The suggested strategy is examined in several scenarios in the MG test systems, and the simulation results are discussed.
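The droop method mentioned above follows the standard P-f and Q-V droop relations, f = f0 - kp*P and V = V0 - kq*Q; a minimal numeric sketch (the gains and set-points below are illustrative, not taken from the paper):

```python
# Conventional droop control: frequency droops with active power, voltage
# with reactive power. Gains and nominal values are illustrative only.
F_NOM, V_NOM = 50.0, 1.0   # nominal frequency (Hz) and voltage (p.u.)
KP, KQ = 0.01, 0.05        # droop gains (Hz per kW, p.u. per kvar)

def droop_setpoints(p_kw, q_kvar):
    return F_NOM - KP * p_kw, V_NOM - KQ * q_kvar

# A load step from (20 kW, 2 kvar) to (35 kW, 5 kvar) lowers both set-points,
# letting parallel DGs share the change without communication:
print(droop_setpoints(20, 2))  # (49.8, 0.9)
print(droop_setpoints(35, 5))  # (49.65, 0.75)
```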
Enhancing battery system identification: nonlinear autoregressive modeling fo... (IJECEIAES)
Precisely characterizing Li-ion batteries is essential for optimizing their
performance, enhancing safety, and prolonging their lifespan across various
applications, such as electric vehicles and renewable energy systems. This
article introduces an innovative nonlinear methodology for system
identification of a Li-ion battery, employing a nonlinear autoregressive with
exogenous inputs (NARX) model. The proposed approach integrates the
benefits of nonlinear modeling with the adaptability of the NARX structure,
facilitating a more comprehensive representation of the intricate
electrochemical processes within the battery. Experimental data collected
from a Li-ion battery operating under diverse scenarios are employed to
validate the effectiveness of the proposed methodology. The identified
NARX model exhibits superior accuracy in predicting the battery's behavior
compared to traditional linear models. This study underscores the
importance of accounting for nonlinearities in battery modeling, providing
insights into the intricate relationships between state-of-charge, voltage, and
current under dynamic conditions.
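A NARX model regresses the next output on lagged outputs and lagged exogenous inputs through a nonlinear map; a compact sketch with scikit-learn (the lag orders, synthetic data, and random-forest regressor are assumptions for illustration, not the paper's configuration):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def narx_dataset(u, y, nu=3, ny=3):
    # Build regressors [y(t-ny..t-1), u(t-nu..t-1)] with target y(t).
    start = max(nu, ny)
    X = [np.r_[y[t - ny:t], u[t - nu:t]] for t in range(start, len(y))]
    return np.array(X), y[start:]

# Synthetic stand-in: u as battery current, y as terminal voltage.
rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * y[t - 1] + 0.2 * np.tanh(u[t - 1])  # toy nonlinear dynamics

X, target = narx_dataset(u, y)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, target)
print("one-step fit R^2:", round(model.score(X, target), 3))
```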
Smart grid deployment: from a bibliometric analysis to a survey (IJECEIAES)
Smart grids are one of the last decades' innovations in electrical energy.
They bring relevant advantages compared to the traditional grid and
significant interest from the research community. Assessing the field's
evolution is essential to propose guidelines for facing new and future smart
grid challenges. In addition, knowing the main technologies involved in the
deployment of smart grids (SGs) is important to highlight possible
shortcomings that can be mitigated by developing new tools. This paper
contributes to the research trends mentioned above by focusing on two
objectives. First, a bibliometric analysis is presented to give an overview of
the current research level about smart grid deployment. Second, a survey of
the main technological approaches used for smart grid implementation and
their contributions are highlighted. To that effect, we searched the Web of
Science (WoS), and the Scopus databases. We obtained 5,663 documents
from WoS and 7,215 from Scopus on smart grid implementation or
deployment. With the extraction limitation in the Scopus database, 5,872 of
the 7,215 documents were extracted using a multi-step process. These two
datasets have been analyzed using a bibliometric tool called bibliometrix.
The main outputs are presented with some recommendations for future
research.
Use of analytical hierarchy process for selecting and prioritizing islanding ... (IJECEIAES)
One of the problems that are associated to power systems is islanding
condition, which must be rapidly and properly detected to prevent any
negative consequences on the system's protection, stability, and security.
This paper offers a thorough overview of several islanding detection
strategies, which are divided into two categories: classic approaches,
including local and remote approaches, and modern techniques, including
techniques based on signal processing and computational intelligence.
Additionally, each approach is compared and assessed based on several
factors, including implementation costs, non-detected zones, declining
power quality, and response times using the analytical hierarchy process
(AHP). The multi-criteria decision-making analysis yields overall weights of passive methods (24.7%), active methods (7.8%), hybrid methods (5.6%), remote methods (14.5%), signal processing-based methods (26.6%), and computational intelligence-based methods (20.8%) when all criteria are compared together. Thus, it can be seen from the total weights
that hybrid approaches are the least suitable to be chosen, while signal
processing-based methods are the most appropriate islanding detection
method to be selected and implemented in power system with respect to the
aforementioned factors. Using Expert Choice software, the proposed
hierarchy model is studied and examined.
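AHP derives criterion weights like those reported above from pairwise comparison matrices, typically via the principal eigenvector; a minimal sketch with a hypothetical 3x3 matrix (the paper's actual comparisons are not given in this abstract):

```python
import numpy as np

def ahp_weights(A):
    # Principal-eigenvector weights of a pairwise comparison matrix.
    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    w = np.abs(w)              # fix eigenvector sign ambiguity
    return w / w.sum()

# Hypothetical comparisons: cost vs. non-detected zone vs. response time.
A = np.array([[1.0, 3.0, 0.5],
              [1 / 3, 1.0, 0.25],
              [2.0, 4.0, 1.0]])
print(ahp_weights(A).round(3))  # criterion weights summing to 1
```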
Enhancing of single-stage grid-connected photovoltaic system using fuzzy logi... (IJECEIAES)
The power generated by photovoltaic (PV) systems is influenced by
environmental factors. This variability hampers the control and utilization of
solar cells' peak output. In this study, a single-stage grid-connected PV
system is designed to enhance power quality. Our approach employs fuzzy
logic in the direct power control (DPC) of a three-phase voltage source
inverter (VSI), enabling seamless integration of the PV connected to the
grid. Additionally, a fuzzy logic-based maximum power point tracking
(MPPT) controller is adopted, which outperforms traditional methods like
incremental conductance (INC) in enhancing solar cell efficiency and
minimizing the response time. Moreover, the inverter's real-time active and
reactive power is directly managed to achieve a unity power factor (UPF).
The system's performance is assessed through MATLAB/Simulink
implementation, showing marked improvement over conventional methods,
particularly in steady-state and varying weather conditions. For solar irradiances of 500 and 1,000 W/m², the results show that the proposed
method reduces the total harmonic distortion (THD) of the injected current
to the grid by approximately 46% and 38% compared to conventional
methods, respectively. Furthermore, we compare the simulation results with
IEEE standards to evaluate the system's grid compatibility.
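THD, the figure of merit quoted above, is the RMS of the harmonic content divided by the fundamental; a small FFT-based sketch for a sampled current (the waveform below is synthetic, with a deliberate 5% fifth harmonic):

```python
import numpy as np

def thd(signal, fs, f0=50.0, n_harmonics=9):
    # Total harmonic distortion from FFT magnitudes at multiples of f0.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    def amp(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    harmonics = [amp(k * f0) for k in range(2, 2 + n_harmonics)]
    return np.sqrt(sum(h ** 2 for h in harmonics)) / amp(f0)

fs = 10_000
t = np.arange(0, 0.2, 1 / fs)
i = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 250 * t)
print(f"THD = {thd(i, fs):.1%}")  # ~5.0%
```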
Enhancing photovoltaic system maximum power point tracking with fuzzy logic-b... (IJECEIAES)
Photovoltaic systems have emerged as a promising energy resource that
caters to the future needs of society, owing to their renewable, inexhaustible,
and cost-free nature. The power output of these systems relies on solar cell
radiation and temperature. In order to mitigate the dependence on
atmospheric conditions and enhance power tracking, a conventional
approach has been improved by integrating various methods. To optimize
the generation of electricity from solar systems, the maximum power point
tracking (MPPT) technique is employed. To overcome limitations such as
steady-state voltage oscillations and improve transient response, two
traditional MPPT methods, namely fuzzy logic controller (FLC) and perturb
and observe (P&O), have been modified. This research paper aims to
simulate and validate the step size of the proposed modified P&O and FLC
techniques within the MPPT algorithm using MATLAB/Simulink for
efficient power tracking in photovoltaic systems.
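The classical P&O step that the paper modifies perturbs the operating voltage and keeps the direction that increased power; a minimal fixed-step sketch (the adaptive step-size modification that is the paper's contribution is not reproduced here):

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
    # One classical P&O iteration: if the last perturbation raised power,
    # keep moving the same way; otherwise reverse. 'step' is fixed here;
    # the paper's modification adapts precisely this step size.
    if p >= p_prev:
        direction = 1 if v >= v_prev else -1
    else:
        direction = -1 if v >= v_prev else 1
    return v + direction * step

# Called every control period with fresh panel measurements:
print(perturb_and_observe(v=17.0, p=85.0, v_prev=16.5, p_prev=83.0))  # 17.5
```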
Adaptive synchronous sliding control for a robot manipulator based on neural ... (IJECEIAES)
Robot manipulators have become important equipment in production lines, medical fields, and transportation. Improving the quality of trajectory tracking for robot hands is always an attractive topic in the research community. This is a challenging problem because robot manipulators are complex nonlinear systems and are often subject to fluctuations in loads and external disturbances. This article proposes an adaptive synchronous sliding control scheme to improve trajectory tracking performance for a robot manipulator. The proposed controller ensures that the joint positions track the desired trajectory, synchronizes the errors, and significantly reduces chattering. First, the synchronous tracking errors and synchronous sliding surfaces are presented. Second, the synchronous tracking error dynamics are determined. Third, a robust adaptive control law is designed, the unknown components of the model are estimated online by a neural network, and the parameters of the switching elements are selected by fuzzy logic. The algorithm ensures that the tracking and approximation errors are ultimately uniformly bounded (UUB). Finally, the effectiveness of the constructed algorithm is demonstrated through simulation and experimental results, which show that the proposed controller is effective, with small synchronous tracking errors and a significantly reduced chattering phenomenon.
Remote field-programmable gate array laboratory for signal acquisition and de... (IJECEIAES)
A remote laboratory utilizing field-programmable gate array (FPGA) technologies enhances students' learning experience anywhere and anytime in embedded system design. Existing remote laboratories prioritize hardware access and visual feedback for observing board behavior after programming, neglecting comprehensive debugging tools to resolve errors that require internal signal acquisition. This paper proposes a novel remote embedded-system design approach targeting FPGA technologies that is fully interactive via a web-based platform. Our solution provides FPGA board access and debugging capabilities beyond the visual feedback provided by existing remote laboratories. We implemented a lab module that users can seamlessly incorporate into their FPGA designs. The module minimizes hardware resource utilization while enabling the acquisition of a large number of data samples during experiments by adaptively compressing the signal prior to data transmission. The results demonstrate an average compression ratio of 2.90 across three benchmark signals, indicating efficient signal acquisition and effective debugging and analysis. This method allows users to acquire more data samples than conventional methods. The proposed lab allows students to remotely test and debug their designs, bridging the gap between theory and practice in embedded system design.
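The adaptive compression scheme itself is not described in this abstract; as a rough illustration of why compressing acquired samples before transmission pays off, delta-encoding a slowly varying trace and deflating it already reaches ratios of the order reported (the signal below is synthetic, and this is not the paper's algorithm):

```python
import zlib
import numpy as np

# Synthetic 16-bit trace standing in for an acquired internal FPGA signal.
t = np.linspace(0, 1, 4096)
signal = (1000 * np.sin(2 * np.pi * 3 * t)).astype(np.int16)

raw = signal.tobytes()
deltas = np.diff(signal, prepend=signal[:1]).astype(np.int16)  # delta encode
compressed = zlib.compress(deltas.tobytes(), level=9)
print(f"compression ratio: {len(raw) / len(compressed):.2f}")
```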
Detecting and resolving feature envy through automated machine learning and m... (IJECEIAES)
Efficiently identifying and resolving code smells enhances software project quality. This paper presents a novel solution, utilizing automated machine learning (AutoML) techniques, to detect code smells and apply move method refactoring. By evaluating code metrics before and after refactoring, we assessed its impact on coupling, complexity, and cohesion. Key contributions of this research include a unique dataset for code smell classification and the development of models using AutoGluon for optimal performance. Furthermore, the study identifies the top 20 influential features in classifying feature envy, a well-known code smell, stemming from excessive reliance on external classes. We also explored how move method refactoring addresses feature envy, revealing reduced coupling and complexity, and improved cohesion, ultimately enhancing code quality. In summary, this research offers an empirical, data-driven approach, integrating AutoML and move method refactoring to optimize software project quality. Insights gained shed light on the benefits of refactoring on code quality and the significance of specific features in detecting feature envy. Future research can expand to explore additional refactoring techniques and a broader range of code metrics, advancing software engineering practices and standards.
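AutoGluon's tabular interface trains and ranks an ensemble from a labeled metrics table; a minimal sketch of the workflow the paper describes (the CSV file name and the 'is_feature_envy' label column are hypothetical):

```python
from autogluon.tabular import TabularDataset, TabularPredictor

# Hypothetical dataset: one row per method, code metrics as columns plus a
# binary 'is_feature_envy' label; the file name is illustrative only.
train = TabularDataset("code_smell_metrics.csv")

predictor = TabularPredictor(label="is_feature_envy").fit(train)
print(predictor.leaderboard())                       # models ranked by score
print(predictor.feature_importance(train).head(20))  # top influential features
```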
Smart monitoring technique for solar cell systems using internet of things ba... (IJECEIAES)
Rapidly and remotely monitoring and receiving solar cell system status parameters (solar irradiance, temperature, and humidity) is a critical issue in enhancing their efficiency. Hence, in the present article an improved smart prototype of an internet of things (IoT) technique based on an embedded system using the NodeMCU ESP8266 (ESP-12E) was implemented experimentally. Three different regions in Egypt (Luxor, Cairo, and El-Beheira) were chosen to study their solar irradiance profile, temperature, and humidity with the proposed IoT system. The monitored solar irradiance, temperature, and humidity data were visualized live in Ubidots through the hypertext transfer protocol (HTTP). The measured solar power radiation in Luxor, Cairo, and El-Beheira ranged between 216-1000, 245-958, and 187-692 W/m² respectively during the solar day. The accuracy and rapidity of the monitoring results obtained using the proposed IoT system make it a strong candidate for application in monitoring solar cell systems. On the other hand, the obtained solar power radiation results for the three regions strongly recommend Luxor and Cairo, rather than El-Beheira, as suitable places to build a solar cell system station.
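The article pushes readings to Ubidots over HTTP; a generic sketch of the same sensing-to-cloud path using MQTT instead (the broker address, topic, and read_sensors stub are assumptions, not the paper's code):

```python
import json
import time
import paho.mqtt.publish as publish

def read_sensors():
    # Stand-in for the actual irradiance/temperature/humidity drivers.
    return {"irradiance_w_m2": 712.0, "temp_c": 31.4, "humidity_pct": 42.0}

while True:
    publish.single("solar/station1/telemetry",
                   json.dumps(read_sensors()),
                   hostname="broker.example.com")  # hypothetical broker
    time.sleep(60)  # one sample per minute
```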
An efficient security framework for intrusion detection and prevention in int... (IJECEIAES)
Over the past few years, the internet of things (IoT) has advanced to connect billions of smart devices to improve quality of life. However, anomalies or malicious intrusions pose several security loopholes, leading to performance degradation and threats to data security in IoT operations. IoT security systems must therefore monitor and restrict unwanted events in the IoT network. Recently, various technical solutions based on machine learning (ML) models have been derived for identifying and restricting unwanted events in IoT. However, most ML-based approaches are prone to misclassification due to inappropriate feature selection. Additionally, most ML approaches applied to intrusion detection and prevention use supervised learning, which requires a large amount of labeled training data; such complex datasets are impossible to source in a large network like IoT. To address this problem, this study introduces an efficient learning mechanism to strengthen IoT security. The proposed algorithm incorporates supervised and unsupervised approaches to improve the learning models for intrusion detection and mitigation. Compared with related works, the experimental outcome shows that the model performs well on a benchmark dataset, accomplishing an improved detection accuracy of approximately 99.21%.
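The abstract names no concrete models; one plausible sketch of such a supervised-plus-unsupervised hybrid flags traffic when either a classifier trained on scarce labels or an outlier detector trained on plentiful unlabeled data fires (the models and synthetic features are assumptions):

```python
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier

rng = np.random.default_rng(1)
X_lab = rng.normal(size=(400, 6))          # scarce labeled flows
y_lab = rng.integers(0, 2, 400)            # 1 = known attack
X_unlab = rng.normal(size=(2000, 6))       # plentiful unlabeled flows

clf = RandomForestClassifier(random_state=0).fit(X_lab, y_lab)  # supervised
iso = IsolationForest(random_state=0).fit(X_unlab)              # unsupervised

def is_intrusion(x):
    # Flag if the classifier predicts attack OR the detector sees an outlier.
    x = x.reshape(1, -1)
    return bool(clf.predict(x)[0] == 1 or iso.predict(x)[0] == -1)

print(is_intrusion(rng.normal(size=6)))
```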
Developing a smart system for infant incubators using the internet of things ... (IJECEIAES)
This research develops an incubator system that integrates the internet of things and artificial intelligence to improve care for premature babies. The system workflow starts with sensors that collect data from the incubator. The data is then sent in real time to the IoT broker Eclipse Mosquitto using the message queue telemetry transport (MQTT) protocol version 5.0, after which it is stored in a database for analysis with the long short-term memory (LSTM) network method and displayed in a web application through an application programming interface (API) service. The experiment produced 2,880 rows of data stored in the database; the correlation coefficient between the target attribute and the other attributes ranges from 0.23 to 0.48. Several experiments were conducted to evaluate the model's predictions on the test data. The best results are obtained with a two-layer LSTM configuration, each layer with 60 neurons and a lookback setting of 6. This model produces an R² value of 0.934, a root mean square error (RMSE) of 0.015, and a mean absolute error (MAE) of 0.008. In addition, the R² value was evaluated for each attribute used as input, with values between 0.590 and 0.845.
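The best configuration reported is a two-layer LSTM with 60 units per layer and a lookback of 6; a compact Keras sketch matching that shape (the feature count, toy data, and training settings are assumptions):

```python
import numpy as np
from tensorflow import keras

LOOKBACK, N_FEATURES = 6, 4   # lookback from the paper; feature count assumed

model = keras.Sequential([
    keras.layers.Input(shape=(LOOKBACK, N_FEATURES)),
    keras.layers.LSTM(60, return_sequences=True),  # first 60-unit layer
    keras.layers.LSTM(60),                         # second 60-unit layer
    keras.layers.Dense(1),                         # predicted target value
])
model.compile(optimizer="adam", loss="mse")

# Toy stand-in for windowed incubator telemetry.
X = np.random.rand(256, LOOKBACK, N_FEATURES)
y = np.random.rand(256, 1)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```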
A review on internet of things-based stingless bee's honey production with im...IJECEIAES
Honey is produced exclusively by honeybees and stingless bees which both are well adapted to tropical and subtropical regions such as Malaysia. Stingless bees are known for producing small amounts of honey and are known for having a unique flavor profile. Problem identified that many stingless bees collapsed due to weather, temperature and environment. It is critical to understand the relationship between the production of stingless bee honey and environmental conditions to improve honey production. Thus, this paper presents a review on stingless bee's honey production and prediction modeling. About 54 previous research has been analyzed and compared in identifying the research gaps. A framework on modeling the prediction of stingless bee honey is derived. The result presents the comparison and analysis on the internet of things (IoT) monitoring systems, honey production estimation, convolution neural networks (CNNs), and automatic identification methods on bee species. It is identified based on image detection method the top best three efficiency presents CNN is at 98.67%, densely connected convolutional networks with YOLO v3 is 97.7%, and DenseNet201 convolutional networks 99.81%. This study is significant to assist the researcher in developing a model for predicting stingless honey produced by bee's output, which is important for a stable economy and food security.
A trust based secure access control using authentication mechanism for intero...IJECEIAES
The internet of things (IoT) is a revolutionary innovation in many aspects of our society including interactions, financial activity, and global security such as the military and battlefield internet. Due to the limited energy and processing capacity of network devices, security, energy consumption, compatibility, and device heterogeneity are the long-term IoT problems. As a result, energy and security are critical for data transmission across edge and IoT networks. Existing IoT interoperability techniques need more computation time, have unreliable authentication mechanisms that break easily, lose data easily, and have low confidentiality. In this paper, a key agreement protocol-based authentication mechanism for IoT devices is offered as a solution to this issue. This system makes use of information exchange, which must be secured to prevent access by unauthorized users. Using a compact contiki/cooja simulator, the performance and design of the suggested framework are validated. The simulation findings are evaluated based on detection of malicious nodes after 60 minutes of simulation. The suggested trust method, which is based on privacy access control, reduced packet loss ratio to 0.32%, consumed 0.39% power, and had the greatest average residual energy of 0.99 mJoules at 10 nodes.
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbersIJECEIAES
In real world applications, data are subject to ambiguity due to several factors; fuzzy sets and fuzzy numbers propose a great tool to model such ambiguity. In case of hesitation, the complement of a membership value in fuzzy numbers can be different from the non-membership value, in which case we can model using intuitionistic fuzzy numbers as they provide flexibility by defining both a membership and a non-membership functions. In this article, we consider the intuitionistic fuzzy linear programming problem with intuitionistic polygonal fuzzy numbers, which is a generalization of the previous polygonal fuzzy numbers found in the literature. We present a modification of the simplex method that can be used to solve any general intuitionistic fuzzy linear programming problem after approximating the problem by an intuitionistic polygonal fuzzy number with n edges. This method is given in a simple tableau formulation, and then applied on numerical examples for clarity.
The performance of artificial intelligence in prostate magnetic resonance im...IJECEIAES
Prostate cancer is the predominant form of cancer observed in men worldwide. The application of magnetic resonance imaging (MRI) as a guidance tool for conducting biopsies has been established as a reliable and well-established approach in the diagnosis of prostate cancer. The diagnostic performance of MRI-guided prostate cancer diagnosis exhibits significant heterogeneity due to the intricate and multi-step nature of the diagnostic pathway. The development of artificial intelligence (AI) models, specifically through the utilization of machine learning techniques such as deep learning, is assuming an increasingly significant role in the field of radiology. In the realm of prostate MRI, a considerable body of literature has been dedicated to the development of various AI algorithms. These algorithms have been specifically designed for tasks such as prostate segmentation, lesion identification, and classification. The overarching objective of these endeavors is to enhance diagnostic performance and foster greater agreement among different observers within MRI scans for the prostate. This review article aims to provide a concise overview of the application of AI in the field of radiology, with a specific focus on its utilization in prostate MRI.
Seizure stage detection of epileptic seizure using convolutional neural networksIJECEIAES
According to the World Health Organization (WHO), seventy million individuals worldwide suffer from epilepsy, a neurological disorder. While electroencephalography (EEG) is crucial for diagnosing epilepsy and monitoring the brain activity of epilepsy patients, it requires a specialist to examine all EEG recordings to find epileptic behavior. This procedure needs an experienced doctor, and a precise epilepsy diagnosis is crucial for appropriate treatment. To identify epileptic seizures, this study employed a convolutional neural network (CNN) based on raw scalp EEG signals to discriminate between preictal, ictal, postictal, and interictal segments. The possibility of these characteristics is explored by examining how well timedomain signals work in the detection of epileptic signals using intracranial Freiburg Hospital (FH), scalp Children's Hospital Boston-Massachusetts Institute of Technology (CHB-MIT) databases, and Temple University Hospital (TUH) EEG. To test the viability of this approach, two types of experiments were carried out. Firstly, binary class classification (preictal, ictal, postictal each versus interictal) and four-class classification (interictal versus preictal versus ictal versus postictal). The average accuracy for stage detection using CHB-MIT database was 84.4%, while the Freiburg database's time-domain signals had an accuracy of 79.7% and the highest accuracy of 94.02% for classification in the TUH EEG database when comparing interictal stage to preictal stage.
Analysis of driving style using self-organizing maps to analyze driver behaviorIJECEIAES
Modern life is strongly associated with the use of cars, but the increase in acceleration speeds and their maneuverability leads to a dangerous driving style for some drivers. In these conditions, the development of a method that allows you to track the behavior of the driver is relevant. The article provides an overview of existing methods and models for assessing the functioning of motor vehicles and driver behavior. Based on this, a combined algorithm for recognizing driving style is proposed. To do this, a set of input data was formed, including 20 descriptive features: About the environment, the driver's behavior and the characteristics of the functioning of the car, collected using OBD II. The generated data set is sent to the Kohonen network, where clustering is performed according to driving style and degree of danger. Getting the driving characteristics into a particular cluster allows you to switch to the private indicators of an individual driver and considering individual driving characteristics. The application of the method allows you to identify potentially dangerous driving styles that can prevent accidents.
Hyperspectral object classification using hybrid spectral-spatial fusion and ...IJECEIAES
Because of its spectral-spatial and temporal resolution of greater areas, hyperspectral imaging (HSI) has found widespread application in the field of object classification. The HSI is typically used to accurately determine an object's physical characteristics as well as to locate related objects with appropriate spectral fingerprints. As a result, the HSI has been extensively applied to object identification in several fields, including surveillance, agricultural monitoring, environmental research, and precision agriculture. However, because of their enormous size, objects require a lot of time to classify; for this reason, both spectral and spatial feature fusion have been completed. The existing classification strategy leads to increased misclassification, and the feature fusion method is unable to preserve semantic object inherent features; This study addresses the research difficulties by introducing a hybrid spectral-spatial fusion (HSSF) technique to minimize feature size while maintaining object intrinsic qualities; Lastly, a soft-margins kernel is proposed for multi-layer deep support vector machine (MLDSVM) to reduce misclassification. The standard Indian pines dataset is used for the experiment, and the outcome demonstrates that the HSSF-MLDSVM model performs substantially better in terms of accuracy and Kappa coefficient.
Quality defects in TMT Bars, Possible causes and Potential Solutions.PrashantGoswami42
Maintaining high-quality standards in the production of TMT bars is crucial for ensuring structural integrity in construction. Addressing common defects through careful monitoring, standardized processes, and advanced technology can significantly improve the quality of TMT bars. Continuous training and adherence to quality control measures will also play a pivotal role in minimizing these defects.
Overview of the fundamental roles in Hydropower generation and the components involved in wider Electrical Engineering.
This paper presents the design and construction of hydroelectric dams from the hydrologist’s survey of the valley before construction, all aspects and involved disciplines, fluid dynamics, structural engineering, generation and mains frequency regulation to the very transmission of power through the network in the United Kingdom.
Author: Robbie Edward Sayers
Collaborators and co editors: Charlie Sims and Connor Healey.
(C) 2024 Robbie E. Sayers
CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptxR&R Consult
CFD analysis is incredibly effective at solving mysteries and improving the performance of complex systems!
Here's a great example: At a large natural gas-fired power plant, where they use waste heat to generate steam and energy, they were puzzled that their boiler wasn't producing as much steam as expected.
R&R and Tetra Engineering Group Inc. were asked to solve the issue with reduced steam production.
An inspection had shown that a significant amount of hot flue gas was bypassing the boiler tubes, where the heat was supposed to be transferred.
R&R Consult conducted a CFD analysis, which revealed that 6.3% of the flue gas was bypassing the boiler tubes without transferring heat. The analysis also showed that the flue gas was instead being directed along the sides of the boiler and between the modules that were supposed to capture the heat. This was the cause of the reduced performance.
Based on our results, Tetra Engineering installed covering plates to reduce the bypass flow. This improved the boiler's performance and increased electricity production.
It is always satisfying when we can help solve complex challenges like this. Do your systems also need a check-up or optimization? Give us a call!
Work done in cooperation with James Malloy and David Moelling from Tetra Engineering.
More examples of our work https://www.r-r-consult.dk/en/cases-en/
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)MdTanvirMahtab2
This presentation is about the working procedure of Shahjalal Fertilizer Company Limited (SFCL). A Govt. owned Company of Bangladesh Chemical Industries Corporation under Ministry of Industries.
Courier management system project report.pdfKamal Acharya
It is now-a-days very important for the people to send or receive articles like imported furniture, electronic items, gifts, business goods and the like. People depend vastly on different transport systems which mostly use the manual way of receiving and delivering the articles. There is no way to track the articles till they are received and there is no way to let the customer know what happened in transit, once he booked some articles. In such a situation, we need a system which completely computerizes the cargo activities including time to time tracking of the articles sent. This need is fulfilled by Courier Management System software which is online software for the cargo management people that enables them to receive the goods from a source and send them to a required destination and track their status from time to time.
Explore the innovative world of trenchless pipe repair with our comprehensive guide, "The Benefits and Techniques of Trenchless Pipe Repair." This document delves into the modern methods of repairing underground pipes without the need for extensive excavation, highlighting the numerous advantages and the latest techniques used in the industry.
Learn about the cost savings, reduced environmental impact, and minimal disruption associated with trenchless technology. Discover detailed explanations of popular techniques such as pipe bursting, cured-in-place pipe (CIPP) lining, and directional drilling. Understand how these methods can be applied to various types of infrastructure, from residential plumbing to large-scale municipal systems.
Ideal for homeowners, contractors, engineers, and anyone interested in modern plumbing solutions, this guide provides valuable insights into why trenchless pipe repair is becoming the preferred choice for pipe rehabilitation. Stay informed about the latest advancements and best practices in the field.
Immunizing Image Classifiers Against Localized Adversary Attacksgerogepatton
This paper addresses the vulnerability of deep learning models, particularly convolutional neural networks
(CNN)s, to adversarial attacks and presents a proactive training technique designed to counter them. We
introduce a novel volumization algorithm, which transforms 2D images into 3D volumetric representations.
When combined with 3D convolution and deep curriculum learning optimization (CLO), itsignificantly improves
the immunity of models against localized universal attacks by up to 40%. We evaluate our proposed approach
using contemporary CNN architectures and the modified Canadian Institute for Advanced Research (CIFAR-10
and CIFAR-100) and ImageNet Large Scale Visual Recognition Challenge (ILSVRC12) datasets, showcasing
accuracy improvements over previous techniques. The results indicate that the combination of the volumetric
input and curriculum learning holds significant promise for mitigating adversarial attacks without necessitating
adversary training.
Event Management System Vb Net Project Report.pdfKamal Acharya
In present era, the scopes of information technology growing with a very fast .We do not see any are untouched from this industry. The scope of information technology has become wider includes: Business and industry. Household Business, Communication, Education, Entertainment, Science, Medicine, Engineering, Distance Learning, Weather Forecasting. Carrier Searching and so on.
My project named “Event Management System” is software that store and maintained all events coordinated in college. It also helpful to print related reports. My project will help to record the events coordinated by faculties with their Name, Event subject, date & details in an efficient & effective ways.
In my system we have to make a system by which a user can record all events coordinated by a particular faculty. In our proposed system some more featured are added which differs it from the existing system such as security.
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Dr.Costas Sachpazis
Terzaghi's soil bearing capacity theory, developed by Karl Terzaghi, is a fundamental principle in geotechnical engineering used to determine the bearing capacity of shallow foundations. This theory provides a method to calculate the ultimate bearing capacity of soil, which is the maximum load per unit area that the soil can support without undergoing shear failure. The Calculation HTML Code included.
A novel defect detection method for software requirements inspections
International Journal of Electrical and Computer Engineering (IJECE)
Vol. 13, No. 5, October 2023, pp. 5865-5873
ISSN: 2088-8708, DOI: 10.11591/ijece.v13i5.pp5865-5873
Journal homepage: http://ijece.iaescore.com

Bilal Alqudah¹, Laiali Almazaydeh², Reyad Alsalameen²
¹Faculty of Engineering, Al-Hussein Bin Talal University, Ma’an, Jordan
²Faculty of Information Technology, Al-Hussein Bin Talal University, Ma’an, Jordan
Article history: Received Jan 21, 2023; Revised May 3, 2023; Accepted May 3, 2023

ABSTRACT
Requirements form the basis for all software products, yet they are often imprecisely stated when scattered between development teams. As a result, software applications are released with bugs, missing functionalities, or loosely implemented requirements. In the literature, only a limited number of related works have been developed as tools for software requirements inspections. This paper presents a methodology to verify that the system design fulfills all functional requirements. The proposed approach contains three phases: requirements collection, facts collection, and a matching algorithm. The feedback it provides enables analysts and developers to make a decision about the initial application release while taking into consideration missing or over-designed requirements.
Keywords: Requirement engineering; Requirement verification; Software design; Software development life cycle; Software engineering

This is an open access article under the CC BY-SA license.
Corresponding Author:
Laiali Almazaydeh
Department of Software Engineering, Faculty of Information Technology, Al-Hussein Bin Talal University
Ma’an-71111, Jordan
Email: laiali.almazaydeh@ahu.edu.jo
1. INTRODUCTION
Software engineering is defined as the rigorous application of a standardized, structured, and thorough approach to the software development process [1]. The process encompasses the entire
range of activities, from initial customer inception to software production and maintenance. The engineering
approach is the activity of envisioning and realizing valuable new functions with sufficient and justifiable
confidence that the resulting software will have all the critical quality attributes that are necessary for the
software to be a success. Therefore, since the end goal of software engineering is a streamlined and reliable software product, the software should be engineered correctly at the intersection of requirements, architecture, and project management, together with all the other essential concepts that go into the software engineering mix [2].
The intended software product is developed using a structured sequence of stages called the software development life cycle (SDLC) [3]. The first stage in the SDLC is requirement engineering, since the requirements form the basis for all software products. Requirement engineering consists of a set of steps handled in an iterative process. The first step is elicitation, the collection of requirements from stakeholders and other sources. The second is requirement analysis, which involves the study and deeper understanding of the collected requirements. The third step is specification of requirements, in which the collected requirements are suitably represented, organized, and saved so that they can be shared. Fourth, once the requirements have been specified, they are validated to ensure that they are complete, consistent, not redundant, and so on. Finally, the fifth step is requirements management, which accounts for changes to the requirements during the lifetime of the project [4]–[6].
The eventual artifact that comes out of the requirements engineering process is the software requirements specification (SRS) document [7]. Typically, the SRS document ends up containing requirements that can be classified along two different axes. One axis is user versus system requirements: user requirements are written in natural language, while system requirements are written more from a developer’s perspective. The other axis differentiates functional and non-functional requirements: functional requirements indicate the services the system provides from the perspective of its functionality, while non-functional requirements describe particular behaviors of, or constraints on, the functions of the system [8]–[10].
However, the requirements can range from a high-level abstract description of the system services to a precise, mathematically formulated specification. The reason behind this wide range in the requirements definition is that it can serve multiple purposes; the requirements themselves can be used as the basis for a request for proposals (RFP), and thus as the basis for a bid or contract. Therefore, in principle, the requirements have two important characteristics: completeness, to avoid ambiguity, and consistency, to avoid any conflicts or contradictions in the description of the system facilities [11]–[14].
In practice, requirements are often imprecisely stated and open to interpretation, so the client and the developer each look at them from their own perspective. The ambiguity and imprecision with which requirements are laid out can create significant problems later [15], [16]. In addition, the need for rapid production and lower costs forces some companies to release applications with bugs, missing functionalities, or loosely implemented requirements. Furthermore, traditional SDLC methodologies cannot review the implementation as one unit for large systems, and most current algorithms focus on providing feedback on the analysis and implementation phases in stages.
In this paper, we propose an automated methodology that focuses on the implementation of functional requirements in the final product, regardless of software size, eliminating the need for a large number of reviewers or quality assurance staff (QAs). The proposed methodology is quantitative; however, no specific acceptance ratio is fixed in advance for all systems. It can be used for many rounds of inspection with no additional cost. The provided feedback enables the analyst and developer to make a decision about the initial application release while taking into consideration missing or over-designed requirements. Below we describe the relevant literature and several alternative defect detection methods which motivated our study, followed by our research methodology, test cases, results, and conclusion.
2. RELATED WORKS
Until now, however, only a limited number of related works have been developed as tools for software requirements inspections. One key related study is [17], where a controlled experiment was applied to assess different defect detection methods for software requirements inspections: ad hoc, checklist, and scenario-based methods. The experimental results showed that the defect detection rate was higher when using the scenario-based method, in which each reviewer focuses on a particular class of defects, than with either the ad hoc or the checklist method.
The work in [18] classified design errors into inconsistencies, inefficiencies, ambiguities, and inflexibilities, so that reviewers can examine these errors according to their skills and knowledge. The purpose of this classification is to ensure that the reviewers find as many errors as possible. The work in [19] defined software metrics for quality factors and discussed several software quality assurance models and quality factor measurement methods. One of these software quality factors is the completeness and correctness of requirements, for which the software quality metric is the requirement specification. Other work in [20] followed a divide-and-conquer policy, decomposing the inspection into discrete steps so that one inspection step can be carried out without detailed knowledge of the others. The work in [21] considered correctness, which indicates the ability of a system to perform according to its defined specification, as one of the software quality assurance factors. Meyer [22] also defined further software quality factors and classified them into technical groups. One of these groups is product-based factors, which define the “properties of the resulting software, for example correctness, efficiency” [22]–[25]. Moreover, Meyer derived these quality characteristics from McCall’s quality taxonomy model.
Based on the works mentioned above, existing inspection methods are only partly effective, as inspectors may not have an adequate understanding of the inspection process and tend to take shortcuts. Therefore, further research is still needed to find more practical and effective ways of performing inspections. In this regard, our contribution is a new automated approach that, on the one hand, quantifies the ratio of implemented requirements and over-designed functionalities and identifies the acceptance ratio that affects the initial product release. On the other hand, it lowers the cost of requirements review, since the evaluation process can be re-run many times with no extra cost or time.
3. PROPOSED METHOD
Information generated about any system can be classified into information produced in the analysis phase and information produced as a result of development. Each stage of system development has many details and sub-tasks, and in those stages a lot of material becomes available through requirements elicitation in the form of text documents, images, and scanned documents. That information and those details might get ignored or forgotten when scattered between development teams.
After the system is built, the requirements become facts of the system. Some facts are hidden in the form of functionalities, for example: “reports have to be sorted by employee name”. To verify that the system design fulfills all functional requirements, the system is verified against the gathered requirements. However, some requirements can be hidden, as explained above. To overcome that problem, we propose a method in which requirements about the system are automatically gathered from the analysts’ material and from the developed system in pre-processing steps. These steps are summarized in sections 3.1 (requirements collection), 3.2 (facts collection), and 3.3 (matching algorithm).
The requirements collection stage involves forward information extraction: collecting information from analysis documents, text data, and images. Facts collection, in contrast, is reverse collection; this process starts from the final product side, from code, scripts, and the graphical user interface (GUI). The algorithm runs the available resources (text data and image data) through optical character recognition (OCR) to extract text. In the matching phase, the collected information is joined into sets representing the requirements for each screen in the system by identifying keywords in the collected text.
The last step is processing facts and requirements with the matching algorithm. The matching algorithm is responsible for producing two sets of results: one is the set of matching requirements and facts; the other is the set of requirements found in the documentation but not in the designed system. Both sets are also represented numerically. The following sections provide details of all steps.
3.1. Requirements collection
Figure 1 shows the gathering of requirements from the analysis phase. In this stage, information is collected by, firstly, parsing the repository of text files generated through the analysis phase and requirements elicitation, and secondly, converting all images, pictures, and scanned documents to text through OCR. The text extracted from all sources is clustered in a map where the important words are classified in a special table.
The documents are then classified based on functionalities, and a matching table is created for requirement-document pairs (r, d), as shown in Table 1. The goal of the table is to show how many documents are related to each specified requirement. Another goal is to identify documents that do not relate to any requirement; such documents were either 1) analyzed in a wrong way, so some requirements have been ignored, or 2) unrelated to any functionality, so that functionality has been left without sufficient analysis. Document significance metric: for each row in the table, the sum of ones represents how significant the document is to the system. This number is assigned to the document as its document weight (dw), as shown in (1).
$dw_i = \sum_{req=0}^{n} rd[req]$ (1)
Table 2 shows the document classification matrix. For instance, doc n-1 has no importance to the system, or the document was ignored by mistake; that document needs to be revised and fixed to fit its correct place with respect to the system. Doc n, on the opposite end, talks about almost every requirement in the system except two. That document should be revised as well, because it is either an executive summary with no details about the system and its development, in which case it must be removed from the analysis, or it is not a summary but a document showing the interaction between system components. In both cases, zero-weight documents and very high-significance documents must be revised or removed from the verification we are conducting.
Requirement significance metric: the sum of each column in the requirements document (RD) table represents how many documents talk about that requirement. This metric is presented as the requirement weight (rw) and calculated as shown in (2).
$rw_i = \sum_{doc=0}^{n} rd[doc]$ (2)
No requirement should ever have rw = 0; such a value means that the requirements elicitation process missed the requirement, or that some documents are missing. In that case, the requirement is marked as a missing requirement and reported back. As shown in Table 3, requirement Req n-1 is missing from the analysis phase, or the documents analyzing it are missing.
Figure 1. Gathering requirements from analysis phase
Table 1. Requirements document (RD) table
Doc \ Req   Requirement1 [req1]   Requirement2 [req2]   …   Requirement n [req n]
Doc 1       1                     0                     …   1
Doc 2       0                     1                     …   1
…           …                     …                     …   …
Doc n-1     0                     0                     …   0
Doc n       1                     1                     …   1
Table 2. Document weight table
Document   dw
Doc 1      2
Doc 2      2
…          …
Doc n-1    0
Doc n      n-2

Table 3. Requirements weights matrix
Requirement   rw
Req 1         2
Req 2         2
…             …
Req n-1       0
Req n         3
3.2. Facts collection
From the other side of the system, the developed and running application, facts about the system are collected and classified. Each function, procedure, or script in the code, together with its corresponding interface, is grouped into one cluster named after that feature. What is new is that a text file with what we call golden keys (gk) is created for each cluster. The gk set is used as a keyword set describing what the collected set of facts represents. The reason is that some requirements (e.g., the font is bold or italic, the color is red with white borders) cannot be extracted easily from the design. To work around this problem, we created the golden-key set as shown in Figure 2. The gk set should preferably match, by name, the requirements specified in the RD table.
The collection now produces a filtered and clustered list of facts for each functionality. For example, assume an e-commerce system that produces an on-screen report of customers sorted by last name. The facts regarding that report (report ID, report date, issued by whom, directed to whom, and the first, middle, and last name columns) are all clustered under one title called the report_by_name cluster. A cluster of facts is generated for each screen or window of the analyzed system. We refer to the window as a feature and to the items of that window, its text, and its fields as facts. To connect the terminologies: a set of requirements and specifications in the analysis phase is called a requirement, and a requirement has sub-fields; after the system is built and the code for that requirement is written, we call it a feature, and each feature has a set of facts. Table 4 shows an example of feature-facts extracted from a system for managing employees. Each feature $f$ in the system has a set of facts $\{x\}$; feature $i$ is represented as $F(i) = (i, \{x_1, x_2, x_3, \ldots, x_n\})$.
Figure 2. Facts collection and clustering with golden-keys
Table 4. Example of feature-facts
Feature               f    Name (x)
Create_New_Employee   F1   Full name
                           Address
                           …
                           DOB
3.3. Matching algorithm
This is the quantitative component that compares the collected user requirements against the collected GUI facts. The algorithm is built on the assumption that the information extracted from images (representing forms and paperwork) is provided in duplicate-free lists. To guarantee that no duplication occurs, facts are stored in hash sets, which allow only one copy of each fact; the extracted information is saved in lists. The other assumption is that the system interface and database have been established, and that the evaluation algorithm has access to the system and can run the same algorithms previously used to extract information from the documentation and paperwork.
In the proposed algorithm 1, lines 5 and 6 get the results of data mining for the images and text files, and line 7 adds them to a hash set, where duplication is eliminated automatically by the semantics of a set; this keeps one copy of each feature extracted from the images or the text. In line 8, the while statement takes one feature from the user interface (UI) design and looks for it in the hash set prepared in line 7. If the feature exists in the hash set (line 9), the UI feature has a match in the documentation and images. Each matched feature is removed from the UI features so the algorithm does not check it twice, and the algorithm then moves to the next UI feature (line 11). Whenever a match is found between the feature set from the documentation and images and the UI, (m) increases by 1 to record the match.
In line 15, the (v) factor increases by the amount of information left in the hash set with no match in the UI design. The hash set is then cleared, because all that can be extracted from that information has been extracted: the number of matches and the number of misses. After clearing the set, the algorithm checks the specified search depth factor. The depth factor, specified by the algorithm user, indicates how many documents should be mined if the acceptance ratio is not reached; this condition stops the algorithm from running indefinitely over large amounts of data when the ratio tested in line 22 is never satisfied. The loop also stops once the amount of unmatched information from the images and documents, relative to the amount of information found in the UI, drops below the specified acceptance ratio. Finally, the algorithm adds {m, v, currentDepth, i, j} to an array of results and returns it to the main function as the match result.
The following parameters are used in algorithm 1:
– images[x]: a list of images of size x.
– text[y]: a list of text files (documentation) of size y.
– acceptRatio: the estimated accuracy or matching level after which the system can be considered to match the requirements.
– depth: how long the algorithm keeps running and asking for more data mining.
– m: the matching level between the developed UI and the requirements.
– v: the divergence between what is in the documentation and the UI (which is the result of the analysis).
Algorithm 1 makes the following assumptions:
– ImgMine(Image): any selected data mining algorithm that extracts information from images and returns a set of features (we focus on the attributes related to text).
– TxtMine(Text): any selected data mining algorithm that extracts information from text files and documentation, with ranking.
Algorithm 1. Matching algorithm
1   results[] match(images[x], text[y], acceptRatio, depth) {
2     m, v: will be returned in results
3     currentDepth: int
4     do {
5       ImgMine(images[i]) → imgResults<>
6       TxtMine(text[y]) → txtResults<>
7       HashSet.add(imgResults); HashSet.add(txtResults)
8       while (UI.hasNext()) {
9         if (HashSet.contains(UI.getFeature(j))) {
10          m++
11          HashSet.delete(UI.getFeature(j))
12          j++
13        }
14      }
15      v += HashSet.size()
16      HashSet.clear()
17      currentDepth++
18      if (currentDepth > depth) break
19      i++
20      y++
21    }
22    while (((v - m) / m) > acceptRatio)
23    results.add(m, v, currentDepth, i, j)
24    return results
25  }
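For readers who want something executable, the following Python sketch mirrors Algorithm 1 under stated assumptions: the unspecified ImgMine and TxtMine miners are passed in as plain callables returning feature strings, the UI is modeled as a list of feature names rather than a live interface, and all identifiers are ours, not the authors' implementation. Modeling the mined results as a Python set reproduces the duplicate elimination the algorithm relies on.

# Python sketch of Algorithm 1; img_mine/txt_mine stand in for the
# unspecified ImgMine/TxtMine miners, the UI is a list of feature names.
from typing import Callable, Iterable

def match(images: list, texts: list, ui_features: list,
          img_mine: Callable[[object], Iterable[str]],
          txt_mine: Callable[[object], Iterable[str]],
          accept_ratio: float, depth: int) -> dict:
    m = v = 0                       # matches / divergent (unmatched) items
    current_depth, i = 0, 0
    pending_ui = list(ui_features)  # matched features are removed (line 11)
    while True:
        mined = set()               # hash set: duplicates collapse (line 7)
        if i < len(images):
            mined.update(img_mine(images[i]))
        if i < len(texts):
            mined.update(txt_mine(texts[i]))
        still_pending = []          # lines 8-14: scan remaining UI features
        for feat in pending_ui:
            if feat in mined:
                m += 1
                mined.discard(feat)       # one mined item matches only once
            else:
                still_pending.append(feat)
        pending_ui = still_pending
        v += len(mined)             # line 15: leftovers count as divergence
        current_depth += 1          # the mined set is then discarded (cleared)
        i += 1
        if current_depth > depth:
            break                   # line 18: depth limit reached
        if m and (v - m) / m <= accept_ratio:
            break                   # line 22: acceptance ratio satisfied
    return {"m": m, "v": v, "depth": current_depth, "unmatched_ui": pending_ui}

# Toy usage with stub miners in place of real OCR/text mining:
docs = [{"full name", "address", "dob"}]
imgs = [{"full name", "save"}]
ui = ["full name", "address", "dob", "print"]
print(match(imgs, docs, ui, img_mine=set, txt_mine=set,
            accept_ratio=0.2, depth=3))
# {'m': 3, 'v': 1, 'depth': 1, 'unmatched_ui': ['print']}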
4. TEST CASES AND RESULTS
To show in principle how the algorithm performs, it was first applied to simple text editing software, where both features and requirements are limited. A software developer was asked to write the analysis for the assumed text editor as specified by the testers, and came up with five text documents that explain the work of the text editor. As shown in Figure 3, the test case system has 28 unique requirements extracted from the selected system, and the extracted analysis documents were found to contain 29 paragraphs, so some mismatch between analysis and system is expected. The goal is to highlight this mismatch using the proposed algorithm.
The first step is building the requirements-documents table to identify the significance of each document to the requirements and vice versa. After building the tables, one of the documents was found to contain functional requirements with no match to any functionality; those requirements cover the coloring and fonts used. The algorithm was then tested on a clinic system, where the available information consists of the system interfaces and the analysis files. Because of the non-disclosure agreement (NDA) policy of the system owners and developers, the system was modified and only part of it was used to demonstrate the proposed algorithm.
As shown in Table 5, the documents covering the requirements for the basic operations are documents 1, 2, 3, and 4, where document 1 covers twice as many requirements as the next document in line. Some requirements may be misrepresented or poorly documented, such as UTF8 and close document; in such cases those requirements need to be revised and the evaluation process run again. As shown in Figure 4, the algorithm then found a match.
Figure 5 shows the results of analyzing a test system built for a clinic. As the figure shows, the tested system has two screens, each with a normalized 30 paragraphs of analysis and description. The facts collected from the two GUIs number 26 and 20, respectively. For the first screen, the matching algorithm found 5 facts with no match in the analysis files and 9 keywords with no match in the running system. For the second screen, the algorithm found 15 keywords in the analysis with no match in the running system and 5 facts or features in the running system with no mention in the analysis files. These test cases show how the proposed algorithm is able to identify the mismatch between the analysis and the built system. The results provide useful feedback to QAs, analysts, and developers, minimizing the rounds of code review and lowering the cost of system development.
Figure 3. The initial set of requirements and facts sizes
Table 5. Document, requirement relevance
Requirement description   Req. code   Doc1   Doc2   Doc3   Doc4   Doc5   rw
text files                req1        1      0      0      0      0      1
plain text                req2        1      0      1      0      0      2
unformatted               req3        1      0      0      0      0      1
format                    req4        1      1      1      1      0      4
UTF8                      req5        0      0      0      0      0      0
new file                  req6        1      0      0      0      0      1
open document             req7        0      0      1      0      0      1
close document            req8        0      0      0      0      0      0
add tables                req9        1      0      0      1      0      2
save as                   req10       1      1      0      0      0      2
save                      req11       1      1      1      0      0      3
dw                        –           8      3      4      2      0      –
Figure 4. Result of matching requirements from GUI and analysis files
Figure 5. Clinical system test case
5. CONCLUSION
This paper focused on validating that the design fulfills the user requirements as provided by the customer; in particular, we focused on the presence of the required information in the design. As future work, the approach can be improved by adding more sophisticated algorithms that match words and meanings, for example matching a “gender” selection or drop-down box with the words (male/female) as substitutes. This would improve the results but add more overhead in cost (time). Another improvement is to integrate (user/designer) feedback into the algorithm to reduce the error ratio, by stating whether a requirement has been fulfilled when the information is present but not classified or matched by the algorithm.
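As a minimal sketch of the word-and-meaning matching suggested here, assuming a hand-maintained synonym table (the table and function names are ours, not part of the published method):

# Sketch of the proposed word/meaning matching extension; the synonym
# table is a hand-written illustration following the gender example.
SYNONYMS = {
    "gender": {"male", "female"},   # the drop-box example from the text
}

def semantically_matches(term: str, ui_terms: set) -> bool:
    """Match on the literal term or on any registered substitute word."""
    term = term.lower()
    return term in ui_terms or bool(SYNONYMS.get(term, set()) & ui_terms)

print(semantically_matches("Gender", {"male", "female"}))   # True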
REFERENCES
[1] R. L. Glass, I. Vessey, and V. Ramesh, “Research in software engineering: an analysis of the literature,” Information and Software
Technology, vol. 44, no. 8, pp. 491–506, Jun. 2002, doi: 10.1016/S0950-5849(02)00049-6.
[2] R. S. Pressman and B. Maxi, Software engineering: a practitioner’s approach. McGraw Hill, 2005.
[3] B. Boehm, “A view of 20th
and 21st
century software engineering,” in Proceedings of the 28th
international conference on Software
engineering, May 2006, pp. 12–29, doi: 10.1145/1134285.1134288.
[4] T. Rehman, M. N. A. Khan, and N. Riaz, “Analysis of requirement engineering processes, tools/techniques and methodologies,”
International Journal of Information Technology and Computer Science, vol. 5, no. 3, pp. 40–48, Feb. 2013, doi:
10.5815/ijitcs.2013.03.05.
[5] D. Pandey, U. Suman, and A. K. Ramani, “An effective requirement engineering process model for software development and
requirements management,” in 2010 International Conference on Advances in Recent Technologies in Communication and
Computing, Oct. 2010, pp. 287–291, doi: 10.1109/ARTCom.2010.24.
[6] D. Mishra, A. Mishra, and A. Yazici, “Successful requirement elicitation by combining requirement engineering techniques,” in
2008 First International Conference on the Applications of Digital Information and Web Technologies (ICADIWT), Aug. 2008,
pp. 258–263, doi: 10.1109/ICADIWT.2008.4664355.
[7] S. W. Ali, Q. A. Ahmed, and I. Shafi, “Process to enhance the quality of software requirement specification document,” in 2018
International Conference on Engineering and Emerging Technologies (ICEET), Feb. 2018, pp. 1–7, doi:
10.1109/ICEET1.2018.8338619.
[8] B.-G. Lee, M.-S. Hwang, Y.-B. Lee, H.-J. Lee, J.-M. Baik, and C.-K. Lee, “Design and development of a standard guidance for
software requirement specification,” Journal of KIISE: Software and applications, vol. 36, no. 7, pp. 531–538, Apr. 2009, doi:
10.1145/3167132.3167268.
[9] V. Pekar, M. Felderer, and R. Breu, “Improvement methods for software requirement specifications: A mapping study,” in 2014
9th International Conference on the Quality of Information and Communications Technology, Sep. 2014, pp. 242–245, doi:
10.1109/QUATIC.2014.40.
[10] J. Medeiros, A. Vasconcelos, C. Silva, and M. Goulão, “Quality of software requirements specification in agile projects: A
cross-case analysis of six companies,” Journal of Systems and Software, vol. 142, pp. 171–194, Aug. 2018, doi:
10.1016/j.jss.2018.04.064.
[11] M. Ochodek and S. Kopczyńska, “Perceived importance of agile requirements engineering practices-A survey,” Journal of Systems
and Software, vol. 143, pp. 29–43, Sep. 2018, doi: 10.1016/j.jss.2018.05.012.
[12] D. Firesmith, “Prioritizing requirements,” The Journal of Object Technology, vol. 3, no. 8, 2004, doi: 10.5381/jot.2004.3.8.c4.
[13] X. Lai, M. Xie, K.-C. Tan, and B. Yang, “Ranking of customer requirements in a competitive environment,” Computers and
Industrial Engineering, vol. 54, no. 2, pp. 202–214, Mar. 2008, doi: 10.1016/j.cie.2007.06.042.
[14] N. Bencomo, J. Whittle, P. Sawyer, A. Finkelstein, and E. Letier, “Requirements reflection,” in Proceedings of the 32nd
ACM/IEEE International Conference on Software Engineering-Volume 2, May 2010, pp. 199–202,
doi: 10.1145/1810295.1810329.
[15] M. Bano, “Addressing the challenges of requirements ambiguity: a review of empirical literature,” in 2015 IEEE Fifth
International Workshop on Empirical Requirements Engineering (EmpiRE), Aug. 2015, pp. 21–24,
doi: 10.1109/EmpiRE.2015.7431303.
[16] D. M. Berry and E. Kamsties, “Ambiguity in requirements specification,” in Perspectives on Software Requirements, Boston, MA:
Springer US, 2004, pp. 7–44, doi: 10.1007/978-1-4615-0465-8_2.
[17] A. A. Porter and L. G. Votta, “An experiment to assess different defect detection methods for software requirements inspections,” in Proceedings of the 16th International Conference on Software Engineering, 1994, pp. 103–112, doi: 10.1109/ICSE.1994.296770.
[18] D. Parnas and D. M. Weiss, “Active design reviews: Principles and practices,” Journal of Systems and Software, vol. 7, no. 4,
pp. 259–265, Dec. 1987, doi: 10.1016/0164-1212(87)90025-2.
[19] M.-C. Lee, “Software quality factors and software quality metrics to enhance software quality assurance,” British Journal of Applied
Science & Technology, vol. 4, no. 21, pp. 3069–3095, Jan. 2014, doi: 10.9734/BJAST/2014/10548.
[20] D. L. Parnas and M. Lawford, “The role of inspection in software quality assurance,” IEEE Transactions on Software Engineering,
vol. 29, no. 8, pp. 674–676, Aug. 2003, doi: 10.1109/TSE.2003.1223642.
[21] E. Mnkandla and B. Dwolatzky, “Defining agile software quality assurance,” in 2006 International Conference on Software
Engineering Advances (ICSEA’06), Oct. 2006, pp. 36–36, doi: 10.1109/ICSEA.2006.261292.
[22] B. Meyer, “Applying ‘design by contract’,” Computer, vol. 25, no. 10, pp. 40–51, Oct. 1992, doi: 10.1109/2.161279.
[23] S. Balsamo, A. Di Marco, P. Inverardi, and M. Simeoni, “Model-based performance prediction in software development: a survey,”
IEEE Transactions on Software Engineering, vol. 30, no. 5, pp. 295–310, May 2004, doi: 10.1109/TSE.2004.9.
[24] N. Melville, K. Kraemer, and V. Gurbaxani, “Review: information technology and organizational performance: an integrative model of IT business value,” MIS Quarterly, vol. 28, no. 2, pp. 283–322, 2004, doi: 10.2307/25148636.
[25] S. Yadav, “Analysis and assessment of existing software quality models to predict the reliability of component-based software,”
International Journal of Emerging Trends in Engineering Research, vol. 8, no. 6, pp. 2824–2840, Jun. 2020, doi:
10.30534/ijeter/2020/96862020.
BIOGRAPHIES OF AUTHORS
Bilal Alqudah received his doctorate degree in Computer Security and Privacy Protection from the Bobby B. Lyle College of Engineering, Southern Methodist University, USA, in 2015. He is currently an assistant professor of Computer Security and Privacy Protection at the College of Engineering at Al-Hussein Bin Talal University, Jordan. Dr. Alqudah has held many local and international training seminars and conferences in his field of specialization. His research focuses on computer security and privacy, electronic medical records, and access control, among other areas of interest. He can be contacted at email: alqudah@ahu.edu.jo.
Laiali Almazaydeh received her doctorate degree in Computer Science and Engineering from the University of Bridgeport, USA, in 2013, specializing in human-computer interaction. She is currently a full professor and the dean of the Faculty of Information Technology, Al-Hussein Bin Talal University, Jordan. Laiali has published more than sixty research papers in various international journals and conference proceedings. Her research interests include human-computer interaction, pattern recognition, and computer security. She received best paper awards at three conferences: ASEE 2012, ASEE 2013, and ICUMT 2016. Recently she was awarded two postdoc scholarships, from the European Union Commission and the Jordanian-American Fulbright Commission. She can be contacted at email: laiali.almazaydeh@ahu.edu.jo.
Reyad Alsalameen received his Ph.D. in Software Engineering from the University of Salford, UK, in 2016. He is currently an assistant professor and the vice dean of the Faculty of Information Technology, Al-Hussein Bin Talal University, Jordan. His current research interests include fault-tolerant systems, software operation and maintenance, e-learning, and artificial intelligence (AI) applications. He can be contacted at email: reyad.m.salameen@ahu.edu.jo.