Model-driven development is an effective method owing to benefits such as code transformation, increased productivity, and a reduced likelihood of human error. Meanwhile, agile software development increases software flexibility and customer satisfaction through its iterative approach. Can these two development approaches be combined to develop web applications efficiently? What are the challenges, and what are the benefits, of such a combination? In this paper we answer these two crucial questions: combining model-driven development and agile software development yields not only fast development and ease of user-interface design but also efficient job tracking. We also define an agile model-based approach for web applications, and an implementation study of this approach supports the answers we give to these two questions.
The majority of agile projects today involve interactive user-interface design, which is only possible by following user-centered design (UCD) within the agile software model. However, the integration of UCD is not well defined in current agile models. Both agile models and UCD are iterative in nature, but agile models focus on coding and developing the software, whereas UCD focuses on its user interface. Similarly, both include testing, though the agile model relies on automatically tested code while UCD relies on an expert or a user testing the interface. In this paper, a new agile usability model is presented and tested in companies. Key results from these projects clearly show that the proposed agile model incorporates usability evaluation methods, improves the working relationship between usability experts and agile software experts, and allows agile developers to incorporate UCD results into subsequent iterations.
A software process model is a standardised format for planning, organising, and running a new software development project. The need to complete and deliver software projects faster requires using a suitable model.
Several kinds of models, which have evolved over the years, are in use today. In this paper we survey the following main types of model, describing their characteristic features: the waterfall model, V-model, component assembly model, chaos model, incremental model, prototyping model, spiral model, rapid application development (RAD) model, agile model, Rational Unified Process (RUP), ICONIX process, and software ecosystem (SECO) model.
We conclude the study by listing the strengths and weaknesses of each of these models.
A NOVEL METHOD FOR REDUCING TESTING TIME IN SCRUM AGILE PROCESS (ijseajournal)
Recently, software development in industry has been moving towards agile because of the advantages the agile development process provides. Its main advantages are delivering high-quality software in shorter intervals and embracing change. Testing is a vital activity for delivering a high-quality software product, and it often accounts for more project effort and time than any other development activity. Testing strategies for conventional process models are well established, but they are not directly applicable to agile testing without modification. In this paper, a novel method for agile testing in the Scrum software development environment is proposed and presented. The sprint and testing activities that form the context for the proposed method are described, and the method is applied to two case studies. The results indicate that testing time can be reduced considerably by applying the proposed method.
‘O’ Model for Component-Based Software Development Process (ijceronline)
Technological advancement has made users increasingly dependent on information technology, and hence on software, which provides the platform for implementing information technology. Component-Based Software Engineering (CBSE) has been adopted by the software community to counter the challenges posed by the fast-growing demand for large and complex software systems. One of the essential reasons for adopting CBSE is the fast development of complicated software systems within well-defined bounds of time and budget: CBSE provides the means to assemble systems from already existing, reusable components developed as autonomous pieces of software. This paper proposes a novel CBSE model, named the O model, informed by existing CBSE lifecycles.
A Novel Method for Quantitative Assessment of Software Quality (CSCJournals)
This paper presents a quantitative quality model intended to be practised formally at each phase throughout the software development life cycle. The proposed quality model emphasises that various stakeholders need to be consulted for quality requirements. Quality goals are set through various measurements and metrics, and the software under development is evaluated against the expected values of a set of metrics. The use of the proposed quantitative model is illustrated through a simple case study. Reusability, a quality attribute left unaddressed by ISO 9126, is also discussed.
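The paper's exact metrics are not reproduced here, but the core idea of evaluating software against stakeholder-set target values can be sketched in plain Python. The metric names and numbers below are illustrative assumptions, not values from the paper:

```python
def evaluate_against_targets(measured, targets):
    """Compare measured metric values against stakeholder-set targets.
    Returns the metrics that miss their target (higher is assumed better)."""
    return {name: (measured[name], goal)
            for name, goal in targets.items()
            if measured.get(name, 0.0) < goal}

# Hypothetical quality goals and measurements for one development phase.
targets  = {"test_coverage": 0.80, "defect_removal_efficiency": 0.90, "reuse_ratio": 0.30}
measured = {"test_coverage": 0.85, "defect_removal_efficiency": 0.70, "reuse_ratio": 0.35}
print(evaluate_against_targets(measured, targets))
# → {'defect_removal_efficiency': (0.7, 0.9)}
```

Running such a check at the end of each phase gives the quantitative, per-phase gate the model calls for.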
THE UNIFIED APPROACH FOR ORGANIZATIONAL NETWORK VULNERABILITY ASSESSMENT (ijseajournal)
Today's business network infrastructure changes quickly, with new servers, services, connections, and ports added often, at times daily, and with an uncontrolled inflow of laptops, storage media, and wireless networks. With the growing number of vulnerabilities and exploits, coupled with the continual evolution of IT infrastructure, organisations now require more frequent vulnerability assessments. This paper proposes a new approach to network vulnerability assessment, the Unified Process for Network Vulnerability Assessment (hereafter unified NVA), derived from the Unified Software Development Process (Unified Process), a popular iterative and incremental software development process framework.
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISON (ijseajournal)
Performance, responsiveness, and scalability are make-or-break qualities for software, and nearly everyone runs into performance problems at one time or another. This paper discusses performance issues faced during the Pre-Examination Process Automation System (PEPAS), implemented in Java technology, along with the challenges faced during the project life cycle and the mitigation actions performed. It compares three Java technologies and shows, through statistical analysis of the application's response times, how improvements were made. The paper concludes with an analysis of the results.
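The statistical comparison of response times across implementation variants can be sketched with Python's standard library. The sample data below are invented for illustration and do not come from the PEPAS project:

```python
from statistics import mean, quantiles

def summarise(name, samples_ms):
    """Report mean and 95th-percentile response time for one implementation."""
    p95 = quantiles(samples_ms, n=100)[94]  # 95th percentile (exclusive method)
    print(f"{name}: mean={mean(samples_ms):.1f} ms  p95={p95:.1f} ms")
    return mean(samples_ms), p95

# Hypothetical response-time samples (ms) for three Java technology variants.
variant_a = [120, 130, 125, 140, 500]   # occasional slow outlier
variant_b = [100, 105, 102, 110, 108]   # consistently fast
variant_c = [90, 200, 95, 210, 100]     # bimodal: fast path plus cache misses
for name, samples in [("A", variant_a), ("B", variant_b), ("C", variant_c)]:
    summarise(name, samples)
```

Looking at a tail percentile alongside the mean is what exposes variants like A and C, whose averages hide rare but severe slow requests.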
The software development field is becoming more productive day by day thanks to the agile model, which is now a major focus of research because of its ability to handle change efficiently through iterative and incremental practices. Despite its popularity, agile still has shortcomings, one of which is the neglect of usability engineering across its phases, even though usability is an important aspect of understanding software. Usability has deep roots in software quality and is a core construct of human-computer interaction (HCI). Developing interactive and usable systems calls for a model that can integrate HCI with agile. To address this issue, we propose a model that combines user-centered design (the main focus of HCI) with agile by assembling practices from both fields, yielding usable products. It extends software life and enhances user satisfaction by giving users running software with good usability.
A comparative studies of software quality model for the software product eval... (imdurgesh)
Software products are increasing rapidly and are used in almost all activities of human life. Consequently, measuring and evaluating the quality of a software product has become a critical task for many companies. Several models have been proposed to help diverse types of users with quality issues, and the development of techniques for building software has influenced the creation of models to assess quality. Since 2000, software construction has come to depend on generated or manufactured components, giving rise to new challenges in assessing quality. These components introduce new concerns such as configurability, reusability, availability, better quality, and lower cost. Accordingly, the models are classified into basic models, developed up to 2000, and component-based models called tailored quality models. The purpose of this article is to describe the main models with their strengths and to point out some deficiencies. We conclude that, in the present age, communication aspects play an important role in software quality.
Software Development Life Cycle: Traditional and Agile - A Comparative Study (ijsrd.com)
In the field of software development, the software development life cycle is the most important component, and a number of development methodologies are used in the software industry today. This paper focuses on the modern SDLC, covering both traditional and agile methods, and explains the advantages and shortcomings of each. Along with this, it suggests some improvements that could help advance current agile development.
Selenium - A Trending Automation Testing Tool (ijtsrd)
Selenium is an important testing tool for software quality assurance. The number of websites is increasing rapidly, and it has become essential to test them against various quality factors to make sure they meet the expected quality goals. Several companies spend a great deal of money on testing tools, while Selenium is available completely free for performance testing. The open-source tool is well known for its broad capabilities and reach, and Selenium stands out from the crowd in this respect. Anyone can visit the Selenium website, download the latest version, and use it. It is not only open source but also highly modifiable: testers can make changes based on their needs and requirements. Manav Kundra, "Selenium - A Trending Automation Testing Tool", International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN 2456-6470, Volume 4, Issue 4, June 2020. URL: https://www.ijtsrd.com/papers/ijtsrd31202.pdf
A Software System Development Life Cycle Model for Improved Students Communic... (IJCSES Journal)
Software engineering provides methodologies, concepts, and practices used for analysing, designing, building, and maintaining information systems in the software industry. A Software Development Life Cycle (SDLC) model is an approach used in the industry for developing projects of various sizes: small-scale, medium-scale, and large-scale. A software project of any size is developed through the coordination of a development team, so it is important for the project manager to assign resources intelligently to the different phases of the project. This study proposes a model for the spiral development process, supported by a simulator (Simphony.NET), which helps the project manager determine how to increase the productivity of a software firm with minimum resources (expert team members). The model increases the utilisation of the development process by keeping all team members busy at all times, which helps decrease idle and wasted time.
MODELING, IMPLEMENTATION AND PERFORMANCE ANALYSIS OF MOBILITY LOAD BALANCING ... (IJCNCJournal)
We propose in this paper a simulation implementation of Self-Organizing Network (SON) optimisation for mobility load balancing (MLB) in LTE systems using ns-3 [1]. The implementation covers two MLB algorithms that dynamically adjust handover (HO) parameters based on Reference Signal Received Power (RSRP) measurements. The adjustments take into account the loads of both an overloaded cell and those of its neighbouring cells that have enough available resources to achieve load balancing. Numerical investigations of selected key performance indicators (KPIs), comparing the proposed MLB algorithms with another HO algorithm (already implemented in ns-3) based on the A3 event [2], highlight the significant MLB gains in terms of global network throughput, packet loss rate, and number of successful HOs, without incurring significant overhead.
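The A3 event that the baseline HO algorithm relies on can be sketched in a simplified form: the neighbour cell must become a configured offset better than the serving cell, with hysteresis to avoid ping-pong handovers. This sketch omits the cell-specific and frequency-specific offsets of the full 3GPP entering condition, and the default values are assumptions:

```python
def a3_event_triggered(rsrp_neighbour_dbm, rsrp_serving_dbm,
                       offset_db=3.0, hysteresis_db=1.0):
    """Simplified 3GPP A3 entering condition: neighbour RSRP, reduced by the
    hysteresis, must exceed serving RSRP plus the A3 offset."""
    return rsrp_neighbour_dbm - hysteresis_db > rsrp_serving_dbm + offset_db

# Neighbour 5 dB stronger than serving: 5 - 1 > 3, so handover is triggered.
print(a3_event_triggered(-90.0, -95.0))   # → True
# Neighbour only 3 dB stronger: 3 - 1 > 3 is false, so no handover.
print(a3_event_triggered(-92.0, -95.0))   # → False
```

An MLB algorithm of the kind the paper evaluates works by adjusting these parameters at run time, e.g. shrinking the effective offset towards a lightly loaded neighbour so that an overloaded cell sheds users earlier.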
EVALUATION OF A NEW INCREMENTAL CLASSIFICATION TREE ALGORITHM FOR MINING HIGH... (mlaij)
A new model for online machine learning over high-speed data streams is proposed, to minimise the severe restrictions associated with existing machine-learning algorithms. Most existing models have three principal steps: in the first, the system creates a model incrementally; in the second, the time taken by examples to complete a prescribed procedure at their arrival rate is computed; in the third and final step, the memory required for the computation is predicted in advance. To overcome these restrictions we propose a new data-stream classification algorithm in which the data are partitioned into a stream of trees and new data sets are merged into the existing tree. This algorithm, called the incremental classification tree algorithm, proves to be an excellent solution for processing large data streams. In this paper, we present experimental results for the new algorithm and show that it eliminates the problems of the existing method.
PREDICTING STUDENT ACADEMIC PERFORMANCE IN BLENDED LEARNING USING ARTIFICIAL ... (ijaia)
With the spread of online education, the importance of actively supporting students involved in online learning has grown. Applying artificial intelligence in education allows instructors to analyse data extracted from university servers, identify patterns of student behaviour, and develop interventions for struggling students. This study used student data stored in a Moodle server to predict student success in a course based on four learning activities: communication via email, collaborative content creation with a wiki, content interaction measured by files viewed, and self-evaluation through online quizzes. A model based on a multi-layer perceptron neural network was then trained to predict student performance in a blended-learning course. The model predicted student performance with a correct classification rate (CCR) of 98.3%.
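The CCR the paper reports is simply the fraction of students whose outcome the model predicts correctly; a minimal sketch, with invented pass/fail labels rather than the study's data:

```python
def correct_classification_rate(predicted, actual):
    """CCR: fraction of students whose pass/fail outcome is predicted correctly."""
    if len(predicted) != len(actual):
        raise ValueError("prediction and ground-truth lists must align")
    hits = sum(p == a for p, a in zip(predicted, actual))
    return hits / len(actual)

# Hypothetical outcomes for ten students (1 = passed, 0 = failed).
actual    = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
predicted = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]  # one misclassification
print(correct_classification_rate(predicted, actual))  # → 0.9
```

For imbalanced cohorts (far more passes than failures), CCR alone can look flattering, which is worth bearing in mind when reading a headline figure such as 98.3%.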
A FRAMEWORK FOR INTEGRATING USABILITY PRACTICES INTO SMALL-SIZED SOFTWARE DEV... (ijseajournal)
Usability now appears to be a highly important attribute of software quality; it is a critical factor that every software development organisation needs to consider when developing software, in order to improve customer satisfaction and competitiveness in the market. There is a lack of a reference model or framework for small software development organisations indicating which usability practices should be implemented and where in the system development life cycle they should be considered. We offer developers who aim to integrate usability practices into their development life cycle a framework that characterises 10 selected user-centered design (UCD) methods against five relevant criteria, based on ISO factors that affect the selection of methods (ISO/TR 16982). The methods included in the framework were selected to meet these organisations' needs: we chose basic methods that are recommended, cost-effective, simple to plan and apply, and easy for developers to learn, and which can be applied when time, resources, skills, and expertise are limited, favouring methods that are generally applicable across a wide range of development environments. The selected methods are organised in the framework according to the stages of the development process where they might be applied. The only requirement on the existing development life cycle is that it be based on an iterative approach.
FLEXIBLE VIRTUAL ROUTING FUNCTION DEPLOYMENT IN NFV-BASED NETWORK WITH MINIMU... (IJCNCJournal)
In a conventional network, most network devices, such as routers, are dedicated appliances with little variation in capacity. In recent years the concept of Network Functions Virtualisation (NFV) has come into use: the intention is to implement a variety of network functions in software on general-purpose servers, which allows the network operator to choose the capabilities and locations of network functions without physical constraints.
This paper focuses on the deployment of NFV-based routing functions, one of the critical classes of virtual network functions, and presents an algorithm for allocating virtual routing functions that minimises total network cost. In addition, it presents a useful allocation policy for virtual routing functions based on an evaluation with a ladder-shaped network model; the policy takes into consideration the ratio of the cost of a routing function to that of a circuit, as well as the traffic distribution in the network. Furthermore, the paper shows that there are cases where NFV-based routing functions make it possible to reduce total network cost dramatically, in comparison to a conventional network in which it is not economically viable to distribute small-capacity routing functions.
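The cost trade-off the paper studies, function cost versus circuit cost, can be sketched with a toy exhaustive search. The linear topology, distance-proportional circuit cost, and all numbers below are illustrative assumptions, not the paper's ladder model or algorithm:

```python
from itertools import combinations

def total_cost(nodes, placement, function_cost, circuit_cost_per_hop):
    """Network cost = cost of deployed routing functions plus the circuit cost
    of steering each node's traffic to its nearest routing function."""
    routing = function_cost * len(placement)
    circuits = sum(min(abs(n - p) for p in placement) * circuit_cost_per_hop
                   for n in nodes)
    return routing + circuits

def best_placement(nodes, n_functions, function_cost, circuit_cost_per_hop):
    """Exhaustively pick the placement of n routing functions minimising cost."""
    return min(combinations(nodes, n_functions),
               key=lambda p: total_cost(nodes, p, function_cost,
                                        circuit_cost_per_hop))

nodes = [0, 1, 2, 3, 4, 5]  # six sites on a line, one hop apart
print(best_placement(nodes, 2, function_cost=2.0, circuit_cost_per_hop=1.0))
# → (1, 4)
```

Raising `function_cost` relative to `circuit_cost_per_hop` pushes the optimum towards fewer, more central functions, which is exactly the ratio the paper's allocation policy keys on.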
A HYBRID APPROACH COMBINING RULE-BASED AND ANOMALY-BASED DETECTION AGAINST DD... (IJNSA Journal)
We have designed a hybrid approach that combines rule-based and anomaly-based detection against DDoS attacks. In this approach, the rule-based detector applies an established set of rules, while the anomaly-based detector uses a one-way ANOVA test to detect possible attacks. We adopt TFN2K (Tribe Flood Network 2K) as the attack traffic generator and monitor victim system resources consumed by the attack traffic, such as throughput, memory utilisation, and CPU utilisation. The target users of the proposed scheme are data-centre administrators. We analysed the types of attack traffic and used the results to develop a defence scheme. The experiments demonstrate that the proposed scheme can effectively detect attack traffic.
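The one-way ANOVA test at the heart of the anomaly detector compares variation between groups of resource samples with variation within them. A minimal F-statistic computation in plain Python, with invented CPU-utilisation samples (the threshold and data are assumptions, not the paper's):

```python
def one_way_anova_f(groups):
    """One-way ANOVA F-statistic across sample groups: a large F means the
    group means differ by more than within-group noise explains, which the
    anomaly detector treats as a possible attack signal."""
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-group mean square.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ms_between = ss_between / (len(groups) - 1)
    # Within-group mean square.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_within = ss_within / (n_total - len(groups))
    return ms_between / ms_within

# Hypothetical CPU-utilisation samples: two normal windows, one under attack.
normal_1 = [1.0, 2.0, 3.0]
normal_2 = [2.0, 3.0, 4.0]
attacked = [10.0, 11.0, 12.0]
f = one_way_anova_f([normal_1, normal_2, attacked])
print(round(f, 1))  # → 73.0
```

In practice the computed F is compared against a critical value from the F-distribution for the chosen significance level; the attacked window above would far exceed any reasonable threshold.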
IMAGE BASED RECOGNITION - RECENT CHALLENGES AND SOLUTIONS ILLUSTRATED ON APPL... (mlaij)
In this paper, problems and solutions for the automatic recognition of miscellaneous materials, especially
bulk materials are discussed. The fact that many materials, especially natural materials, have a strong
phenotypic variability resulting in high intra-class and low inter-class variability of the calculated features
poses a complex recognition problem. The recognition of components of a wheat sample or the
classification of mineral aggregates serves as an example to demonstrate different aspects in segmentation,
feature extraction, classifier design and complexity assessment. We present a technique for the
segmentation of highly overlapping and touching objects into single object images, a proposal for feature
selection and classifier parameter optimization, as well as a method to visualise the complexity of a
high-dimensional recognition problem in a three-dimensional space. Every step of the pattern recognition
process needs to be optimized carefully with special attention to the risk of overfitting. Modern processors
and the application of field-programmable gate arrays as well as the outsourcing of processing steps to the
graphic processing unit speed up the calculation and make real-time computation possible also for highly
complex recognition problems such as the quality assurance of bulk materials.
This paper presents a review and performs a comparative evaluation of a few well-known machine learning
algorithms in terms of their suitability and code performance on a given data set of any size. In this paper,
we describe our Machine Learning ToolBox, which we have built using the Python programming language. The
algorithms in the toolbox consist of supervised classification algorithms such as Naïve Bayes,
Decision Trees, SVM, K-nearest Neighbors, and Neural Networks (Backpropagation). The algorithms are
tested on the iris and diabetes datasets and are compared on the basis of their accuracy under different
conditions. Using our tool, one can apply any of the implemented ML algorithms to a dataset of
any size. The main goal of building the toolbox is to provide users with a platform to test their datasets on
different machine learning algorithms and use the accuracy results to determine which algorithm fits the
data best. The toolbox allows users to choose a dataset of their choice, in either structured or
unstructured form, and then select the features they want to use for training. We have
given our concluding remarks on the performance of the implemented algorithms based on experimental
analysis.
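The comparison the toolbox performs boils down to training a classifier and reporting held-out accuracy. As a minimal stand-in for one of the listed algorithms, here is a from-scratch k-nearest-neighbours classifier on toy two-class data; the data and labels are illustrative assumptions, not the authors' iris split.

```python
# Sketch: the accuracy-comparison loop the toolbox runs, shown with a tiny
# from-scratch k-nearest-neighbours classifier on toy data.

def knn_predict(train, labels, point, k=3):
    """Classify `point` by majority vote among its k nearest training points."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, point)), lbl)
        for row, lbl in zip(train, labels)
    )
    votes = [lbl for _, lbl in dists[:k]]
    return max(set(votes), key=votes.count)

# Two linearly separable toy classes standing in for e.g. two iris species.
train = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (5.0, 5.0), (5.2, 4.9), (4.8, 5.1)]
labels = ["setosa", "setosa", "setosa", "virginica", "virginica", "virginica"]
test = [((1.1, 1.0), "setosa"), ((5.1, 5.0), "virginica")]

correct = sum(knn_predict(train, labels, x) == y for x, y in test)
accuracy = correct / len(test)
```

Swapping in another classifier and comparing the resulting `accuracy` values is the essence of the toolbox's workflow.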
GAME THEORY BASED INTERFERENCE CONTROL AND POWER CONTROL FOR D2D COMMUNICATIO...IJCNCJournal
With the current development of mobile communication services, people need personal communication with
high speed, excellent service, high quality, and low latency; however, limited spectrum resources have become
the most important factor hampering the improvement of cellular systems. As large amounts of data traffic
cause greater local consumption of spectrum resources, future networks are required to have appropriate
techniques to better support such forms of communication. D2D (Device-to-Device) communication
technology in a cellular network makes full use of the underlying spectrum resources, reduces the load on the
base station, and minimizes the transmit power of the terminals and the base stations, thereby enhancing the
overall throughput of the network. The interference caused by D2D UE (User Equipment) multiplexing of
resources and spectrum, and by the sharing of resources between adjacent cells, has become a
major factor affecting the coexistence of cellular subscribers and D2D users. When D2D communication
multiplexes the uplink resources, the base stations are easily disturbed; when the downlink resources
are multiplexed, the downlink users are susceptible to interference. In order to build a highly efficient
mobile network, we can meet the QoS requirements by controlling the power to suppress the interference
between the base station and a terminal user.
OPTIMIZATION OF NEURAL NETWORK ARCHITECTURE FOR BIOMECHANIC CLASSIFICATION TA...ijaia
Electromyogram signals (EMGs) contain valuable information that can be used in man-machine interfacing between human users and myoelectric prosthetic devices. However, EMG signals are
complicated and prove difficult to analyze due to physiological noise and other issues. Computational
intelligence and machine learning techniques, such as artificial neural networks (ANNs), serve as powerful
tools for analyzing EMG signals and creating optimal myoelectric control schemes for prostheses. This
research examines the performance of four different neural network architectures (feedforward, recurrent,
counterpropagation, and self-organizing map) that were tasked with classifying walking speed when given
EMG inputs from 14 different leg muscles. Experiments conducted on the data set suggest that
self-organizing map neural networks are capable of classifying walking speed with greater than 99% accuracy.
AN APPROACH FOR SCRIPT IDENTIFICATION IN PRINTED TRILINGUAL DOCUMENTS USING T...ijaia
In this work, we review the outcome of texture features for script classification. Rectangular White Space
analysis algorithm is used to analyze and identify heterogeneous layouts of document images. The texture
features, namely the color texture moments, Local binary pattern (LBP) and responses of Gabor, LM-filter,
S-filter, and R-filter are extracted, and combinations of these are considered in the classification. In this work,
a probabilistic neural network and a Nearest Neighbor classifier are used for classification. To corroborate the
adequacy of the proposed strategy, an experiment was conducted on our own data set. To study the effect on
classification accuracy, we vary the database sizes, and the results show that the combination of multiple
features vastly improves the performance.
COLOR IMAGE ENCRYPTION BASED ON MULTIPLE CHAOTIC SYSTEMS - IJNSA Journal
This paper proposes a novel color image encryption scheme based on multiple chaotic systems. The
ergodicity property of chaotic systems is utilized to perform the permutation process, and a substitution
operation is applied to achieve the diffusion effect. In the permutation stage, the 3D color plain-image matrix
is converted to a 2D image matrix; then two generalized Arnold maps are employed to generate hybrid
chaotic sequences that depend on the plain-image's content. The generated chaotic sequences are
then applied to perform the permutation process. The encryption's key streams depend not only on the
cipher keys but also on the plain-image, and the scheme can therefore resist chosen-plaintext as well as
known-plaintext attacks. In the diffusion stage, four pseudo-random gray value sequences are generated by
another generalized Arnold map. The gray value sequences are applied to perform the diffusion process by
bit-XORing with the permuted image row-by-row or column-by-column to improve the encryption
rate. Security and performance analyses have been performed, including key space analysis, histogram
analysis, correlation analysis, information entropy analysis, key sensitivity analysis, differential analysis,
etc. The experimental results show that the proposed image encryption scheme is highly secure thanks to its
large key space and efficient permutation-substitution operation, and it is therefore suitable for practical
image and video encryption.
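The permutation stage relies on a generalized Arnold map, which is a bijection on the pixel grid. Below is a minimal sketch of that position scrambling; the parameters `a`, `b` and the tiny 3x3 "image" are illustrative assumptions, not the paper's key schedule.

```python
# Sketch: pixel-position scrambling with a generalized Arnold map. The map
# (x, y) -> (x + a*y, b*x + (a*b + 1)*y) mod n has determinant 1, so it
# permutes the n*n pixel grid and is invertible for decryption.

def arnold_scramble(image, a=1, b=1, rounds=1):
    n = len(image)
    for _ in range(rounds):
        out = [[0] * n for _ in range(n)]
        for y in range(n):
            for x in range(n):
                nx = (x + a * y) % n
                ny = (b * x + (a * b + 1) * y) % n
                out[ny][nx] = image[y][x]
        image = out
    return image

img = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
scrambled = arnold_scramble(img, rounds=2)
# Scrambling preserves the pixel multiset; only positions change.
```

In the paper's scheme the map parameters are derived from the plain-image's content, which is what ties the key stream to the plaintext.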
ENERGY CONSUMPTION IMPROVEMENT OF TRADITIONAL CLUSTERING METHOD IN WIRELESS S...IJCNCJournal
In traditional clustering routing protocols for wireless sensor networks, the LEACH protocol (Low Energy
Adaptive Clustering Hierarchy) is considered to have many outstanding advantages in implementing a
hierarchy of low-energy adaptive clusters to collect and deliver data to the base station. The main
objectives of LEACH are to prolong the lifetime of the network and to reduce the energy consumed by each
node, using data aggregation to reduce the number of messages sent in the network. However, in the case
of a large network, the distances from the nodes to the base station differ widely, so the energy consumed
on becoming the cluster head also differs widely. Yet LEACH does not base the choice of cluster head on the
remaining energy, but on the number of times a node became the cluster head in previous rounds. This
makes the nodes far away from the base station lose power sooner.
In this paper, we present a new routing protocol based on the LEACH protocol that improves the operating
time of the sensor network by considering energy and distance when selecting the cluster head (CH):
nodes with high energy that are near the base station (BS) have a greater probability of becoming the
cluster head than those farther away and with lower energy.
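The classic LEACH election rule referred to above can be sketched in a few lines. The energy-and-distance weighting shown alongside it is an illustrative assumption in the spirit of the abstract, not the authors' exact formula.

```python
# Sketch: the standard LEACH cluster-head election threshold, plus an assumed
# (illustrative, not the paper's) weighting by residual energy and BS distance.

def leach_threshold(p, r, was_ch_recently):
    """Classic LEACH: probability threshold T(n) for round r with CH ratio p."""
    if was_ch_recently:          # node served as CH in the last 1/p rounds
        return 0.0
    return p / (1 - p * (r % int(1 / p)))

def weighted_threshold(p, r, was_ch_recently, e_residual, e_max, d, d_max):
    """Scale T(n) so high-energy nodes near the base station are favoured."""
    t = leach_threshold(p, r, was_ch_recently)
    return t * (e_residual / e_max) * (1 - d / d_max)

# With p = 0.1, a node that has not recently been CH, in round 3:
t = leach_threshold(0.1, 3, False)                       # 0.1 / (1 - 0.3)
tw = weighted_threshold(0.1, 3, False, 0.8, 1.0, 20.0, 100.0)
```

Each node draws a uniform random number per round and becomes CH when the draw falls below its threshold, so scaling the threshold directly scales the election probability.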
PERFORMANCE EVALUATION OF MOBILE WIMAX IEEE 802.16E FOR HARD HANDOVER - IJCNCJournal
Seamless handover in wireless networks is to guarantee both service continuity and service quality. In
WiMAX, providing scalability and quality of service for multimedia services during handover is a main
challenge because of high latency and packet loss. In this paper, we created four scenarios using Qualnet
5.2 Network Simulator to analyze the hard handover functionality of WiMAX under different conditions.
The scenarios considered are: flag mobility with 5- and 10-second UCD and DCD interval values, a random
mobility scenario, and a DEM scenario using 6 WiMAX cells. This study is performed over the real
urban area of JNU; we used the JNU map for scenarios 1, 2, and 3, while for scenario 4 the JNU
terrain data was used. Further, each BS of the 6 WiMAX cells is connected to four nodes. All nodes in each
scenario are fixed except Node 1, which moves and performs handovers between the different BSs
while sending and receiving real-time traffic. The flag mobility model is used in scenarios 1, 2, and 4 to model
the movement of Node 1, while the random mobility model is used in scenario 3. A 5-second time interval is
used for scenarios 1, 3, and 4, and a 10-second interval for scenario 2, to study the effect of
management message load on handover. Further, statistical measures of WiMAX handover performance,
in terms of the number of handovers performed, throughput, end-to-end delay, jitter, and packets
dropped, are observed and evaluated.
Big data is a prominent term which characterizes the growth and availability of data in all three
formats: structured, unstructured, and semi-structured. Structured data is located in fixed fields of a record
or file and is found in relational databases and spreadsheets, whereas unstructured data
includes text and multimedia content. The primary objective of the big data concept is to describe
extreme-volume data sets, both structured and unstructured. It is further defined by three "V"
dimensions, namely Volume, Velocity, and Variety, with two more "V"s, Value and Veracity, also added.
Volume denotes the size of the data; Velocity refers to the speed of data processing; Variety describes
the types of data; Value derives the business value; and Veracity describes the quality and
understandability of the data. Nowadays, big data has become a unique and preferred
research area in the field of computer science. Many open research problems exist in big data,
and good solutions have been proposed by researchers, even though there is still a need to develop
many new techniques and algorithms for big data analysis in order to obtain optimal solutions. In this paper,
a detailed study of big data, covering its basic concepts, history, applications, techniques, research issues,
and tools, is presented.
ENHANCE RFID SECURITY AGAINST BRUTE FORCE ATTACK BASED ON PASSWORD STRENGTH A...IJNSA Journal
RFID systems are among the important techniques used in modern technologies; these
systems rely heavily on default and random passwords. Due to the increasing use of RFID in various
industries, security and privacy issues should be addressed carefully, as there is no efficient way to achieve
security in this technology. Some active tags are low cost, and basic tags cannot use standard cryptographic
operations, since the use of such techniques increases the cost of these cards. This paper sheds light on the
weaknesses of RFID systems and identifies the threats and countermeasures of possible attacks. For
this paper, an algorithm was designed to ensure and measure the strength of the passwords used in the
authentication process between tag and reader, to enhance the security of their communication and defend
against brute-force attacks. Our algorithm is designed with modern techniques based on entropy, password
length, cardinality, Markov models, and fuzzy logic.
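The entropy, length, and cardinality ingredients of such a strength score can be sketched briefly. This is a simplified illustration only; it omits the Markov-model and fuzzy-logic components the abstract mentions, and the character-pool scoring is a common convention rather than the paper's algorithm.

```python
# Sketch: estimate brute-force resistance from password length and the
# cardinality of the character pools it draws from (simplified scoring).
import math
import string

def pool_cardinality(password):
    """Sum the sizes of the character classes the password draws from."""
    pools = [
        (string.ascii_lowercase, 26),
        (string.ascii_uppercase, 26),
        (string.digits, 10),
        (string.punctuation, 32),
    ]
    return sum(size for chars, size in pools if any(c in chars for c in password))

def entropy_bits(password):
    """Brute-force search space of cardinality**length, expressed in bits."""
    return len(password) * math.log2(pool_cardinality(password))

weak = entropy_bits("12345678")       # digits only: 8 * log2(10) bits
strong = entropy_bits("Tr0ub&dor!")   # four classes: 10 * log2(94) bits
```

A tag-reader authentication scheme could reject any password whose estimated bits fall below a chosen floor before accepting it into the handshake.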
MULTI MODEL DATA MINING APPROACH FOR HEART FAILURE PREDICTION - IJDKP
Developing predictive modelling solutions for risk estimation is extremely challenging in health-care
informatics. Risk estimation involves the integration of heterogeneous clinical sources with different
representations from different health-care providers, making the task increasingly complex. Such sources are
typically voluminous and diverse, and they change significantly over time. Therefore, distributed and parallel
computing tools, collectively termed big data tools, are needed to synthesize this data and assist the physician
in making the right clinical decisions. In this work we propose a multi-model predictive architecture, a novel
approach for combining the predictive ability of multiple models for better prediction accuracy. We
demonstrate the effectiveness and efficiency of the proposed work on data from the Framingham Heart Study.
Results show that the proposed multi-model predictive architecture is able to provide better accuracy than
the best-model approach. By modelling the error of the predictive models, we are able to choose a subset of
models which yields accurate results. More information was modelled into the system by multi-level mining,
which has resulted in enhanced predictive accuracy.
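The multi-model idea of combining several predictors into one risk score can be sketched as a weighted average. The toy stand-in models and the patient tuple below are illustrative assumptions, not the paper's Framingham-trained models.

```python
# Sketch: combine several risk models into one ensemble score so that the
# averaged prediction can outperform any single "best" model.

def ensemble_risk(models, patient, weights=None):
    """Weighted average of each model's predicted risk in [0, 1]."""
    weights = weights or [1.0] * len(models)
    total = sum(weights)
    return sum(w * m(patient) for w, m in zip(weights, models)) / total

# Toy risk models over (age, systolic_bp, cholesterol) tuples.
model_a = lambda p: min(1.0, p[0] / 100)                    # age-driven
model_b = lambda p: min(1.0, max(0.0, (p[1] - 90) / 100))   # blood pressure
model_c = lambda p: min(1.0, p[2] / 300)                    # cholesterol

patient = (60, 150, 240)
risk = ensemble_risk([model_a, model_b, model_c], patient)
high_risk = risk > 0.5
```

The `weights` parameter is where error modelling enters: models with lower validation error can be given larger weights, or weight zero to drop them from the subset.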
ASSESSING THE PERFORMANCE AND ENERGY USAGE OF MULTI-CPUS, MULTI-CORE AND MANY...ijdpsjournal
This paper studies the performance and energy consumption of several multi-core, multi-CPU and
many-core hardware platforms and software stacks for parallel programming. It uses the Multimedia Multiscale
Parser (MMP), a computationally demanding image encoder application, which was ported to several
hardware and software parallel environments as a benchmark. Hardware-wise, the study assesses
NVIDIA's Jetson TK1 development board, the Raspberry Pi 2, and a dual Intel Xeon E5-2620/v2 server, as
well as NVIDIA's discrete GPUs GTX 680, Titan Black Edition and GTX 750 Ti. The assessed parallel
programming paradigms are OpenMP, Pthreads and CUDA, and a single-thread sequential version, all
running in a Linux environment. While the CUDA-based implementation delivered the fastest execution, the
Jetson TK1 proved to be the most energy efficient platform, regardless of the used parallel software stack.
Although it has the lowest power demand, the Raspberry Pi 2 energy efficiency is hindered by its lengthy
execution times, effectively consuming more energy than the Jetson TK1. Surprisingly, OpenMP delivered
twice the performance of the Pthreads-based implementation, proving the maturity of the tools and
libraries supporting OpenMP.
USING ONTOLOGY BASED SEMANTIC ASSOCIATION RULE MINING IN LOCATION BASED SERVICES - IJDKP
Recently, GPS and mobile devices have allowed the collection of a huge amount of mobility data. Researchers
from different communities have developed models and techniques for mobility analysis, but they have mainly
focused on the geometric properties of trajectories and do not consider the semantic facet of moving objects.
The techniques are good at extracting patterns, but the patterns are hard to interpret in a specific application domain.
This paper proposes a methodology to understand mobility data and semantically interpret trajectory
patterns. The process considers four different behavior types such as semantic, semantic and space,
semantic and time, and semantic and space-time. Finally, a system prototype was developed to evaluate the
behavior models in different aspects using one of the location based services. The results showed that
applying the semantic association rules could significantly reduce the number of available services and
customize the services based on the rules.
MR – RANDOM FOREST ALGORITHM FOR DISTRIBUTED ACTION RULES DISCOVERY - IJDKP
Action rules, which are the modified versions of classification rules, are one of the modern approaches for
discovering knowledge in databases. Action rules allow us to discover actionable knowledge from large
datasets. Classification rules are tailored to predict an object's class, whereas action rules extracted from
an information system produce knowledge in the form of suggestions of how an object can change from one
class to another, more desirable class. Over the years, computer storage has become larger and the
internet has become faster. Hence digital data is widely spread around the world, and it is growing
in size in such a way that collecting and analyzing it requires more time and space than a single computer
can handle. Producing action rules from massive distributed data requires a distributed action rule
processing algorithm that can process the datasets in all systems in one or more clusters simultaneously
and combine them efficiently to induce a single set of action rules. There has been little research on action
rule discovery in distributed environments, which presents a challenge. In this paper, we propose a new
algorithm called MR – Random Forest Algorithm to extract the action rules in a distributed processing
environment.
Comparing Various SDLC Models On The Basis Of Available Methodology - IJMER
There are various SDLC models widely accepted and employed for developing software.
SDLC models give a theoretical guideline for the development of software. Employing the proper
SDLC allows managers to regulate the whole development strategy of the software. Each SDLC has its
advantages and disadvantages, making it suitable for use only under specific conditions and constraints
and for specific types of software. We need to understand which SDLC would generate the most successful
result when employed for software development. For this we need some method to compare SDLC
models, and various methods that allow such comparison have been suggested. Comparing SDLC
models is a complex task, as there is no mathematical theorem or physical device available for it. The essence
of this paper is to analyse some methodologies that could result in a successful comparison of the SDLC
models. For this we have studied various available tools, techniques, and methodologies and have tried
to extract the most simple, easy, and highly understandable method for comparing SDLC models.
APPLYING CONTINUOUS INTEGRATION FOR INCREASING THE MAINTENANCE QUALITY AND EF...ijseajournal
For project resource management and time control, a software system needs to be decomposed into subsystems, functional modules, and basis components; finally, all tested components have to be integrated into the complete system. By applying the IID (Iterative and Incremental Development) mechanism, the agile development model has become a practical method to reduce the software project failure rate. Continuous integration (CI) is an IID implementation concept which can effectively reduce software development risk. Web apps, with their high rate of change, are well suited to the agile development model as a development and maintenance methodology. The paper surveys in depth the CI operating environment and its advantages. Introducing the CI concept can compensate for the moving-target problems that impact Web apps. To this end, the paper proposes a Continuous
Integration based Web Applications Maintenance Procedure (CIWAMP) to assist system integration. Based on CI characteristics, CIWAMP enables Web apps to be deployed quickly, increases stakeholder communication frequency, improves staff morale, and effectively improves Web app maintenance
quality and efficiency.
A sustainable procedural method of software design process improvementsnooriasukmaningtyas
In practice, the software design process is an intermediate phase for enhancing and improving the design of different types of software products; it helps developers convert the specified requirements into prototypes that implement the design in reality. The objective of this paper is to provide software developers, designers, and software engineers who work in small companies with a standards-based process improvement using a procedural method, including detailed steps for designing small software systems in their companies. The method used in this paper includes: 1) analysis of four design processes commonly used in industry, such as CMMI, the conventional software process in ISO 19759, and generic and engineering design processes; 2) mapping between those four design processes; 3) collecting the dispersed design concepts proposed by those four processes; 4) proposing a sustainable procedural method of software design process improvement; and 5) illustrating the applicability of the proposed approach using a template-based implementation. The primary result of this study is a guideline procedure with detailed steps for software design process improvement, to help and guide developers in small companies to analyze and design small-scale software with limited cost and duration. In conclusion, this paper proposes a method to improve the design process for different kinds of software systems using a template-based implementation, to reduce the cost, effort, and time needed in the implementation phase in small companies. A template-based implementation helps systems and software engineers use the template easily in their small companies, because most of the time those engineers are responsible for analyzing, designing, implementing, and testing their software systems during the whole software life cycle.
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...ijseajournal
In the software development phase, software artifacts are not in consistent states: some class artifacts are fully developed, some are half developed, some are largely developed, some are only slightly developed, and some are not developed yet. At this stage, allowing too many software requirement changes may delay project delivery and increase the development budget of the software; on the other hand, rejecting too many changes may increase customer dissatisfaction. Software change effort estimation is one of the most challenging and important activities that helps software project managers accept or reject changes during the software development phase. This paper extends our previous work on developing a software requirement change effort estimation model prototype tool for the software development phase. The significant achievements of the tool are demonstrated through an extensive experimental validation using several case studies. The experimental analysis shows improvement in estimation accuracy over current change effort estimation models.
Automatic model transformation on multi-platform system development with mode...CSITiaesprime
Several difficulties commonly arise during the software development process. Among them are the lengthy technical process of developing a system, the limited number and technical capabilities of human resources, the possibility of bugs and errors during the testing and implementation phases, dynamic and frequently changing user requirements, and the need for a system that supports multiple platforms. Rapid application development (RAD) is the software development life cycle (SDLC) that emphasizes the production of a prototype in a short amount of time (30-90 days). This study discovered that incorporating a model-driven architecture (MDA) approach into the RAD method can accelerate the model design and prototyping stages. The goal is to accelerate the SDLC process. It took roughly five weeks to construct the system by applying all of the RAD stages; this time frame does not include iteration and the cutover procedure. During the prototype test, there were no errors in the create, read, update, and delete (CRUD) procedures. It was demonstrated that automatic transformation in MDA can shorten the RAD phases for designing the model and developing an early prototype, reduce code errors in standard processes like CRUD, and construct a system that supports multiple platforms.
XP concentrates on the development rather than the managerial aspects of software projects.
XP was designed so that organizations would be free to adopt all or part of the methodology.
XP projects start with a release planning phase, followed by several iterations, each of which concludes with user acceptance testing.
When the product has enough features to satisfy its users, the team terminates the iterations and releases the software.
For years, software projects have mostly exceeded their budgets, been delivered late, and failed to meet customer expectations. In the past, many traditional development models, such as the waterfall, spiral, iterative, and prototyping methods, were used to build software systems. In recent years, agile models have been widely used in developing software products. The major reasons are simplicity, the ability to incorporate requirement changes at any time, a lightweight approach, and delivery of a working product early and in a short duration. Whatever development model is used, it remains a challenge for software engineers to accurately estimate the size, effort, and time required to develop a software system. This survey focuses on the existing estimation models used in traditional as well as agile software development.
The performance of an algorithm can be improved using a parallel computing programming approach. In this study, the performance of the bubble sort algorithm was measured on various computer specifications. Experimental results have shown that parallel computing programming can reduce execution time by 61%-65% compared to serial computing programming.
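Bubble sort parallelizes naturally through its odd-even transposition variant, in which every compare-swap within a phase is independent and can run concurrently. The study's actual implementation is not specified here; the following is a serial sketch that exposes the parallel-friendly structure.

```python
# Sketch: odd-even transposition sort, the data-parallel cousin of bubble
# sort. Within each phase, all compared pairs are disjoint, so a parallel
# version can hand each pair to a separate worker or core.

def odd_even_sort(a):
    a = list(a)
    n = len(a)
    for phase in range(n):
        # Even phases compare pairs (0,1), (2,3), ...; odd phases (1,2), (3,4), ...
        start = phase % 2
        for i in range(start, n - 1, 2):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

result = odd_even_sort([5, 3, 8, 1, 9, 2])  # [1, 2, 3, 5, 8, 9]
```

n phases always suffice, and because a phase's swaps touch disjoint pairs, the speedups reported in the study come from distributing those pairs across cores.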
A PROPOSED HYBRID AGILE FRAMEWORK MODEL FOR MOBILE APPLICATIONS DEVELOPMENT ijseajournal
With the increase in mobile application systems and high competition between companies, the number of
mobile application projects has grown.
Mobile software development is a group of processes for creating software for mobile devices with limited
resources, such as small screens and low power. The development of mobile applications is a big challenge
because of rapidly changing business requirements and the technical constraints of mobile systems, so
developers face a dynamic environment and changing mobile application requirements. Mobile application
teams should therefore adopt software development methods that respond efficiently to these challenges.
However, at the moment there is limited knowledge about the suitability of different software practices for
the development of mobile applications. According to many researchers, agile methodologies were found to
be the most suitable for mobile development projects, as such projects have short timescales, require
flexibility, and must reduce waste and time to market.
Finally, in this research we look for a suitable process model that conforms to the requirements of
mobile applications; we investigate agile development methods to find a way to make the
development of mobile applications easy and compatible with mobile device features.
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
20240609 QFM020 Irresponsible AI Reading List May 2024
MODEL DRIVEN WEB APPLICATION DEVELOPMENT WITH AGILE PRACTICES
International Journal of Software Engineering & Applications (IJSEA), Vol.7, No.5, September 2016
DOI: 10.5121/ijsea.2016.7501
MODEL DRIVEN WEB APPLICATION DEVELOPMENT WITH AGILE PRACTICES
Gürkan Alpaslan and Oya Kalıpsız
Department of Computer Engineering, Yıldız Technical University, Istanbul, Turkey
ABSTRACT
Model driven development is an effective method due to benefits such as code transformation, increased
productivity and a reduced chance of human error. Meanwhile, agile software development increases
software flexibility and customer satisfaction through its iterative approach. Can these two development
approaches be combined to develop web applications efficiently? What are the challenges and benefits of
such a combination? In this paper, we answer these two crucial questions: combining model driven
development with agile software development yields not only fast development and ease of user interface
design but also efficient job tracking. We also define an agile model based approach for web applications
and carry out an implementation study that supports our answers to these questions.
KEYWORDS
Model driven development, Web application development, Agile methodology
1. INTRODUCTION
Model driven development (MDD) is a method that proposes producing source code from models
[1], [2]. Models are abstracted representations of the system elements [3]. The created models are
transformed into source code by MDD tools with automated code generation [4], [5]. This is a very
beneficial attribute, since it reduces the human factor in software coding; in other words, it leaves the
coding to computers. Developers thus focus only on creating the system models properly [6].
Models are utilized for web applications in different methodologies such as WebML [7], [8], UWE
[9], [10] and OOHDM [11]. These methodologies are mostly based on the Unified Modeling
Language. Another method is mockup driven development, which is based on prototyping the
web application [12]. Mockups are dynamic user interface prototypes created with mockup
development tools [13], [14]. These tools transform the mockups into executable web pages built
with Hyper Text Markup Language (HTML), Cascading Style Sheets (CSS), JavaScript and other
web development technologies [15]. The critical advantages of using tools rather than
hand-coding are access to the latest technologies, a low error rate, and speed.
On the other hand, agile practices [16], [17] aim to deliver executable software quickly. Agile
development methods do not emphasize documentation and are iterative in structure; software is
developed in pieces. This structure provides a more flexible skeleton and responds better to
feedback during the life cycle [18].
In the literature, there are life cycle diagrams that combine agile practices with model driven
development [19]. One of the earliest studies is the agile model driven development
(AMDD) high level life cycle [20], [21]. It proposes a life cycle consisting of two main
phases: an inception phase and a development phase. The inception phase covers the general modelling
of the whole system; iterations are implemented in the development phase.
Another life cycle is the Hybrid MDD development method [22], an evolution of the
AMDD high level life cycle. It also consists of two main phases, but it defines
three development teams working in parallel: the model development team, the agile development
team and the business analyst team, which are described in Section 3.
There is also the MDD-SLAP method [23], developed by Motorola for its own agile
projects, and the Sage MDD method [24], developed for multi-agent systems. The methods are
evaluated by their main contribution and the target platform they were developed for (Table 1).
Table 1. Main AMDD life cycles in the literature

Method                     | Main Contribution                                                                  | Target Platform
AMDD High Level Life Cycle | First life cycle combining agile practices with model driven engineering           | General projects
Sage MDD                   | Based on the incremental and iterative integration of different models             | Multi-agent systems
Hybrid MDD                 | Describes the parallel working teams on the AMDD High Level Life Cycle             | Small or medium size general projects
MDD-SLAP                   | Identifies the relation between agile principles and model-driven practices; implemented on the Scrum method | Telecommunication systems
Cost estimation for web projects has different properties than for other projects [29].
Web projects generally have small teams, and estimating such projects by counting lines of
code is not reliable. Instead, the main factors, namely personnel, product, platform and
project factors, are utilized for better estimation. The estimation result gives the cost
tendency of the method (Section 4.2).
The main contributions of this work are: (1) to provide a model-driven approach with agile
support customized for web applications; (2) to propose a life cycle implemented in real
projects for developers; and (3) to provide a cost evaluation of the approach.
The paper is structured as follows: in Section 2, we describe our proposed approach in detail; in
Section 3, we detail the implementation work of the approach. Section 4 presents the
discussion and results of the work, and finally, in Section 5, we draw conclusions and
present our future work in this field.
2. PROPOSED APPROACH
The approach is based on the Hybrid MDD method [22], customized here for web applications.
The dynamic prototyping method for web applications called mockup driven development [12]
is utilized in the life cycle. We defined which parts are created automatically from
mockups and which parts are created by handcrafted coding. Afterwards, these parts
are integrated properly.
For web applications, the system architecture can be defined as two programming parts: client-side
and server-side [25]. Client-side code is not executed on the server; server-side code is
executed on the server and its output is sent to clients. In our approach, we aim to produce all
client-side code via model transformation: the client-side part is developed from models with
automated code generation, while the server-side part is developed with handcrafted code. Finally,
these parts are integrated and the final software emerges.
2.1. The Steps of the Approach
The approach progresses iteratively and incrementally. Three teams work in parallel
throughout the life cycle: the model driven development team, the agile development
team and the business analyst team. The model development team is responsible for
constructing the model infrastructure, creating the web models with their attributes and functions, and
for the automated code generation. The agile development team is responsible for creating the test
environment and for handcrafted code. The business analyst team is responsible for interacting with the
customer and creating the system requirements. These teams work in close cooperation, following
agile principles. The steps of the method are illustrated (Fig. 1).
Figure 1. The main steps of the approach
The approach starts with the business analyst team identifying the system scope. The business
analyst team describes the requirements in cooperation with the customer. After the requirement
analysis, the whole system is divided into iterations ordered by priority. The development phase
starts with the highest priority iteration. From this point, all three teams work simultaneously.
The agile development team creates the test units; the MDD team selects and sets up a suitable model
development tool; and the business analyst team interacts with the customer and coordinates
all team members. After the iteration skeleton is completed, the agile team develops handcrafted
code such as database creation and database connection classes, while the model-driven development
team creates the models and assigns their attributes and functions. In this process, the aim is to
create all client-side code by model generation alone; only server-side code is written by
handcrafted coding. Afterwards, the handcrafted code and the code generated from the models are
integrated. This process repeats until all iterations are completed.
2.2. Implementation of the Testing Phase
In the approach, the test-driven development (TDD) [26] method is utilized. TDD proposes
producing the test units before the source code is written. In our approach, every iteration begins
with the creation of test units. After the integration step of each iteration, the iteration test is
run. When all iterations are completed, the integrated system is tested for compatibility
and integrity.
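As a minimal illustration of this test-first flow (a hypothetical sketch, not code from the case study projects; the pricing rule and function name are our own invention), a unit test for a cinema ticket pricing rule would be written before the implementation exists, and the implementation is then filled in until the tests pass:

```python
import unittest


def ticket_price(age):
    """Hypothetical pricing rule for a cinema ticket system:
    children under 12 and seniors 65 and over pay a reduced fare."""
    if age < 12 or age >= 65:
        return 8.0
    return 12.0


class TicketPriceTest(unittest.TestCase):
    """In TDD these tests are written first; ticket_price is
    implemented afterwards until all assertions pass."""

    def test_child_discount(self):
        self.assertEqual(ticket_price(10), 8.0)

    def test_adult_full_fare(self):
        self.assertEqual(ticket_price(30), 12.0)

    def test_senior_discount(self):
        self.assertEqual(ticket_price(70), 8.0)
```

Run with `python -m unittest` once the implementation is in place; in a TDD cycle the first run, before `ticket_price` exists, fails by design.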
2.3. Customer Role on the Approach
Customer interaction with the developers is a significant part of the approach. In every part of the
process, the iteration artifacts have to be presented to the customer and the feedback has to be
evaluated immediately. For most parts of the process, pair development is proposed; for
instance, model design is proposed to be implemented together with a customer representative.
2.4. Agile Practices Utilization
Agile practices are applied throughout the structure of the approach, generally via agile
modelling [27], a method for modelling in line with agile principles. The utilization of the
agile practices in our approach is summarized (Table 2).
Table 2. The utilization of the agile practices in the approach

Agile Practice                                    | Utilization in the Approach
Iterative and incremental development             | The approach progresses iteratively and incrementally
Working software over comprehensive documentation | The models are used as documentation, instead of creating external documentation
Rapid feedback                                    | All stakeholders work together with high interaction and respond to feedback immediately
Continuous integration                            | Models are created part by part and regularly integrated
Test-driven development                           | All test units are developed before the source code throughout the process
Pair programming                                  | Mockups and requirements are created with a customer representative
3. CASE STUDY
3.1. Overview
The approach was implemented in the Software Quality Research Lab at our university. For this
purpose, two teams were formed with different project subjects.
Team 1 developed a cinema ticket system, while team 2 developed a library
registration system. The Axure tool [28] was utilized for model development. The team
members have sufficient experience with both the programming languages and basic software
engineering methods.
3.2. The Progress of the Implementation
Firstly, our approach has introduced to the teams. Every week, with the meetings, the progress
and coherency to the approach are checked. They have completed their projects in 18 weeks. The
progress of the work is illustrated (Fig. 2).
Figure 2. The progress of the case study projects
The projects were completed in three iterations. In general, project progress can be divided into two
parts: a planning part and the iteration parts, which repeat the same
process in every iteration.
3.3. Testing Process of the Implementation Study
For the testing process, black box test technic is utilized. Black box testing is a method, proposed
to test the working system structure rather that the code structures. The reason of the preferred
this method is due to the biggest part of the codes has been created by automated code generation.
Generated code is too complex to normalize and analyze for structural testing;
besides, generated code is less prone to errors.
For both team projects, the testing process was implemented in pieces. First, general aspects were
established for the whole project to map the process. Then testing units were
created for every iteration as it started. As each iteration artifact was
constructed, the iteration test was run. After three iterations, the whole system was tested.
3.4. Challenges during the Implementation Study
One of the challenges the teams had to solve is integrating the handcrafted code with the code
generated automatically from the models. The teams utilized two methods to
solve this problem: (1) adding the handcrafted code inside the generated code with special
tags, and (2) creating a dedicated class for handcrafted code and associating it with the generated
code.

Another challenge is the complexity of the generated code. Transformation tools create many
lines of code, which developers must analyze in order to make changes. To decrease this
complexity, tags are added to important model elements, so changes can be tracked more
easily.
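The first integration method can be sketched as a simple tag-substitution step. This is a hypothetical illustration: the comment-tag format, snippet names, and the sample generated page are our own assumptions, not the teams' actual tooling output.

```python
import re

# Generated client-side code carries special tags marking the points
# where handcrafted snippets must be spliced in (assumed tag format).
GENERATED_PAGE = """<html>
<body>
<!-- @handcrafted:ticket_list -->
<div class="footer">Generated by the mockup tool</div>
</body>
</html>"""

# Handcrafted snippets written by the agile development team,
# keyed by the tag name used in the generated code.
HANDCRAFTED = {
    "ticket_list": '<ul id="tickets"><!-- filled by server-side code --></ul>',
}

TAG_PATTERN = re.compile(r"<!-- @handcrafted:(\w+) -->")


def integrate(generated: str, snippets: dict) -> str:
    """Replace every handcrafted tag in the generated code with the
    matching snippet; unknown tags are left in place for review."""
    def substitute(match):
        name = match.group(1)
        return snippets.get(name, match.group(0))
    return TAG_PATTERN.sub(substitute, generated)


page = integrate(GENERATED_PAGE, HANDCRAFTED)
```

Because the tags live in HTML comments, the generated page stays renderable before integration, and re-running the transformation tool only requires re-applying the substitution step rather than re-editing the generated code by hand.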
4. RESULTS AND DISCUSSION
4.1. Evaluation in Terms of Software Architecture
The proposed approach begins with the inception part, where the system is analyzed and the
architecture is established. In the development phase, analysis, design, coding and testing are
carried out specifically for each iteration, and this process repeats for every iteration. When all
iterations are completed, the whole system is tested and all the iteration artifacts are integrated.
The main software development processes (analysis, design, coding and testing) occur in different
orders throughout the approach. Their total time efforts were approximately estimated; the
proportions of the processes are illustrated (Fig. 3).
Figure 3. The proportion of the main software development processes on approach
Since the proposed approach is based on an agile architectural skeleton, the analysis part is
proportionally smaller than the others. The coding part includes both handcrafted coding and coding
from models. These proportions are given for the three-iteration implementation study.
In addition, because the three teams (business analyst, agile development and model-driven
development) work simultaneously, production time is reduced: an estimated 40% of the
workload was carried out simultaneously in the implementation study.
4.2. Cost Estimation of the Approach
The COCOMO II model [30] is utilized for cost estimation of the proposed approach. COCOMO II
includes four main categories: personnel factors, product factors, platform factors and
project factors. These categories contain 17 cost drivers in total, each with its own scale range. The
ranges are the constraint values described in Table 3. The nominal value is defined as a rate of 1.00,
and the other values are defined relative to nominal. A value greater than 1.00 means that state
increases the cost effort; a value less than 1.00 affects the cost load positively.
Every cost driver in COCOMO II is evaluated for our approach (Table 3). The effort adjustment
factor (EAF) is the value used to evaluate the cost; it is calculated by multiplying all
rated values:

EAF = EM1 × EM2 × ... × EM17 (1)

Using the ratings in Table 3, the cost estimation value is determined by the product of all
rates, so

EAF = 1.11561 (2)
This value is on the higher side of the nominal point, but the deviation is small. This shows
that the approach tends toward the nominal line and has an acceptable cost effort. Other COCOMO II
criteria, such as lines of code, are ignored for this approach because of the structure of the
approach.
Table 3. Cost effort estimation according to the COCOMO II model

Cost Driver                              | Very Low | Low  | Nominal | High | Very High | Our Rates
Required software reliability            | .82      | .92  | 1.00    | 1.10 | 1.26      | 1.10
Database size                            | -        | .90  | 1.00    | 1.14 | 1.28      | .90
Product complexity                       | .73      | .87  | 1.00    | 1.17 | 1.34      | 1.00
Required reusability                     | -        | .95  | 1.00    | 1.07 | 1.15      | 1.07
Documentation match to life cycle needs  | .81      | .91  | 1.00    | 1.11 | 1.23      | 1.11
Execution time constraint                | -        | -    | 1.00    | 1.11 | 1.29      | 1.00
Main storage constraint                  | -        | -    | 1.00    | 1.05 | 1.17      | 1.00
Platform volatility                      | -        | .89  | 1.00    | 1.15 | 1.30      | .87
Analyst capability                       | 1.42     | 1.19 | 1.00    | .85  | .71       | 1.00
Programmer capability                    | 1.34     | 1.15 | 1.00    | .88  | .76       | 1.15
Applications experience                  | 1.22     | 1.10 | 1.00    | .88  | .81       | 1.10
Platform experience                      | 1.19     | 1.09 | 1.00    | .91  | .85       | 1.19
Language and tool experience             | 1.20     | 1.09 | 1.00    | .91  | .84       | 1.20
Personnel continuity                     | 1.29     | 1.12 | 1.00    | .90  | .81       | .90
Use of software tools                    | 1.17     | 1.09 | 1.00    | .90  | .78       | .78
Multisite development                    | 1.22     | 1.09 | 1.00    | .93  | .86       | .86
Required development schedule            | 1.43     | 1.14 | 1.00    | 1.00 | 1.00      | 1.00
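The EAF calculation can be reproduced from the "Our Rates" column of Table 3. The following is a sketch; the dictionary keys are our own short labels for the 17 cost drivers, not COCOMO II's official abbreviations.

```python
# The 17 COCOMO II cost-driver ratings from the "Our Rates" column of Table 3.
RATES = {
    "required_software_reliability": 1.10,
    "database_size": 0.90,
    "product_complexity": 1.00,
    "required_reusability": 1.07,
    "documentation_match": 1.11,
    "execution_time_constraint": 1.00,
    "main_storage_constraint": 1.00,
    "platform_volatility": 0.87,
    "analyst_capability": 1.00,
    "programmer_capability": 1.15,
    "applications_experience": 1.10,
    "platform_experience": 1.19,
    "language_tool_experience": 1.20,
    "personnel_continuity": 0.90,
    "use_of_software_tools": 0.78,
    "multisite_development": 0.86,
    "required_development_schedule": 1.00,
}


def effort_adjustment_factor(rates):
    """EAF is the product of all 17 cost-driver ratings."""
    eaf = 1.0
    for value in rates.values():
        eaf *= value
    return eaf


eaf = effort_adjustment_factor(RATES)
print(round(eaf, 5))  # 1.11562, matching the paper's 1.11561 up to rounding
```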
4.3. Evaluation in Terms of Security and Maintenance
Other strong points of MDD are its security and maintenance benefits. MDD helps to reduce the
human factor and is therefore less error prone: it reduces human-based errors and lets the
development tools control the process. In addition, with MDD the system is more flexible with
respect to change, so maintenance is easier as well: changes are implemented on the models and
MDD reflects the models in the real system.
4.4. The Strong Aspects and Weak Aspects of the Approach
We asked the teams to evaluate the strong and weak aspects of the approach. Team 1 identified
fast development and ease of user interface design as strong aspects of the approach, and the
difficulty of analyzing the generated code as a weak aspect.

Team 2 identified regular job tracking as a strong aspect, and the complexity after model
transformation as a weak aspect.
4.5. Research Questions
RQ1: What are the methods to integrate the model transformed code with handcrafted code?
After the mockups are transformed and the agile team completes the handcrafted code, the teams
use two integration methods. One is to add code into the generated code using special tags; the
other is to implement the handcrafted code in dedicated classes and associate them with the
common structure.
RQ2: What sizes of web projects are suitable for the approach? Are there any constraints?
The approach is built on the Hybrid MDD structure, which is suggested for small or
medium size projects. The version customized for web applications does not impose a
constraint, thanks to the readable structure of mockups; however, big data causes more
complexity in analysis.
RQ3: What would be the main differences for projects implemented in industry?
The main difference is the structure of the development team. In industry, teams would be more
experienced and better organized in their areas, which decreases the cost estimation.
Another difference is product size: industry projects are more complex and must be more
flexible because of the customer factor. Customers' requests change more often in
industry projects than in university research projects.
RQ4: Are there any challenges in updating previously created mockups?
Mockups are visual and readable structures, and current mockup development tools provide
adequate features for updating them. In addition, model tagging [10] is proposed to decrease
complexity and increase intelligibility. Model tagging decreases the complexity of model
transformation by putting a tag on every element in the model, which makes it easier to track
the code after transformation.
RQ5: What are the main contributions of the approach?
In this research, a life cycle, based mainly on the integration of separately produced parts and a
prototyping method with agile support, is defined specifically for web applications. The research
provides web developers with an approach and a document that can be utilized from the start of
a project to software release.
5. CONCLUSIONS
This paper aims to provide a life cycle for web application development. The proposed
approach is based on the integration of client-side and server-side code. To accelerate the
process, parallel working teams are proposed; for this purpose, the life cycle of the Hybrid MDD
method is utilized as a skeleton, and the mockup driven web prototyping method has been
implemented on the life cycle. In addition, agile practices are utilized for faster and more
flexible development, a better analysis process, rapid feedback and more. A case study was
carried out by two different teams and feedback was obtained from the participants.
Throughout the implementation study, the participants' most significant challenge was
integrating the client-side code with the server-side code. The mockup driven transformation
tools created many more code units than expected: for a basic web page, they produced hundreds
of lines of HTML, CSS and JavaScript. The reason is the transformation logic of the tools,
which describes every minimal element in a separate tag. In fact, the boilerplate HTML and CSS
lines made the meaning of the code hard to analyze. In this situation, our suggestion is to focus
on the models instead of the code. Developing from models was a problem for participants
accustomed to developing from code; hence, this approach requires regular tracking in
order to be implemented properly.
Furthermore, the participants evaluated the approach positively with respect to the simplicity
of visual design: the user interface design was created significantly faster with dedicated
mockup tools. In terms of cost factors, the approach is close to the nominal range: the effort
adjustment factor is calculated as 1.11561 for the case studies, which places the approach in the
acceptable range. Since this study was implemented by intermediate-level development teams, we
can conclude that by improving just the personnel factors, the effort can reach the
nominal value.
For future work, we plan to implement the approach on large projects. In this study, the
general perspective of the approach is illustrated and its applicability is demonstrated for small
or medium size projects. Moreover, reverse engineering for web application models is an
interesting research area.
ACKNOWLEDGEMENTS
The authors would like to thank Software Quality Research Lab members for their support of this
research. We are also grateful to all the practitioners for their participation and feedback.
REFERENCES
[1] Schmidt, D.C., 2006. Model-Driven Engineering, IEEE Computer, 39(2):25-31.
[2] Stahl, T., Volter, M., 2006. Model-Driven Software Development: Technology, Engineering,
Management, John Wiley & Sons, Chichester, England.
[3] Seidewitz, E., 2003. "What Models Mean", IEEE Software, 20(5):26-32.
[4] Czarnecki, K., Helsen, S., 2006. Feature-based survey of model transformation approaches, IBM
Systems Journal, 45(3): 621–645.
[5] Selic, B., 2003. The Pragmatics of Model-driven Development, IEEE Software, 20(5): pp. 19-25.
[6] Hailpern, B., Tarr, P., 2006. Model-driven development: The Good, the Bad, and the Ugly, IBM
Systems Journal, 45(3): 451-461.
[7] Ceri, S., Fraternali, P., Matera, M., 2002. Conceptual modeling of data-intensive Web applications,
IEEE Internet Computing 6 (4) pp. 20-30.
[8] Kaim, W.E., Studer, P., Muller, P.A., 2003. Model Driven Architecture for Agile Web Information
System Engineering, In: 9th International Conference on Object-Oriented Information Systems,
Geneva, Switzerland, pp 299-303.
[9] Koch, N., Knapp, A., Zhang, G., Baumeister, H., 2008. UML-Based Web Engineering, Web
Engineering: modeling and Implementing Web Applications, Human-Computer Interaction Series,
pp. 157-191.
[10] Lee, W., Park, S., Lee, K., Lee, C., Lee, B., Jung, W., Kim, T., Kim, H., Wu, C., 2005. Agile
Development of Web Application by Supporting Process Execution and Extended UML Model, In:
Proceedings of the 12th Asia-Pacic Software Engineering Conference, Taipei, Taiwan, pp. 193-200.
[11] Rossi, G., Schwabe, D., 2008. modeling and Implementing Web Applications with OOHDM, Web
Engineering: modeling and Implementing Web Applications, Human-Computer Interaction Series,
pp. 109-155.
[12] Benson, E., 2013. Mockup Driven Web Development, In: 22nd International World Wide Web
Conference, 13-17 May 2013, Rio de Janeiro, Brazil, pp. 337-342.
[13] Rivero, J.M., Rossi, G., Grigera J., Luna, E.R., Navarro, A., 2011. From interface mockups to web
application models, In: Proceedings of the 12th international conference on Web information system
engineering, Sydney, Australia, pp. 257-264.
[14] Rivero, J.M., Grigera, J., Rossi, G., Robles Luna, E., Koch, N., 2011, Improving Agility in Model-
Driven Web Engineering, In: CAiSE 2011 Forum Proceedings, London, England, pp. 163-170.
[15] Rivero, J.M., Grigera, J., Rossi, G., Robles Luna, E., Montero, F., Gaedke, M., 2014. Mockup-Driven
Development: Providing agile support for Model-Driven Web Engineering, Information and Software
Technology, 56 (6), pp. 670-687.
[16] Cockburn, A., 2002. Agile Software Development. Addison-Wesley, Boston, MA, USA.
[17] Larman, C., 2004. Agile and Iterative Development: A Manager's Guide. Addison-Wesley, Boston,
USA.
[18] Dybå, T., Dingsøyr, T., 2009. What Do We Know about Agile Software Development?, IEEE
Software, 26 (5), pp. 6-9.
[19] Matinnejad R., 2011. Agile Model Driven Development: An Intelligent Compromise, In: 9th
International Conference Software Engineering Research, Management and Applications, Baltimore,
MD, USA, pp. 197-202.
[20] Ambler, S.W., 2004. The Object Primer 3rd Edition: Agile Model Driven Development with UML
2.0, Cambridge University Press, New York, NY, USA.
[21] Ambler, S.W., 2007. Agile Model Driven Development, XOOTIC Magazine, 12 (1) pp. 13-21.
[22] Guta G., Schreiner W., Draheim, D., 2009. A Lightweight MDSD Process Applied in Small Projects,
In: Proceedings of the 35th Euromicro Conference on Software Engineering and Advanced
Applications, Patras, Greece, pp. 255-258.
[23] Zhang Y., Patel S., 2011. Agile Model-Driven Development in Practice. IEEE Software, 28 (2), pp.
84-91.
[24] Kirby J., 2006. Model Driven Agile Development of Reactive Multi Agent Systems, In: Proceedings
of the 30th Annual International Computer Software and Applications Conference, Chicago, IL, USA,
pp. 297-302.
[25] Gabarro, S., 2007. Web Application Design and Implementation: Apache 2, PHP5, MySQL,
JavaScript, and Linux/UNIX, John Wiley & Sons, Hoboken, New Jersey, USA.
[26] Astels, D., 2003. Test Driven Development: A Practical Guide, First Edition, Prentice Hall, New
Jersey, United States of America.
[27] Ambler, S., 2002. Agile Modeling: Effective Practices for eXtreme Programming and the Unified
Process, John Wiley & Sons, New York, NY, USA.
[28] Interactive Wireframe Software & Mockup Tool, http://www.axure.com, Last visit 09.2015.
[29] Andres, J. D., Fernandez-Lanvin, D., Lorca, P. (2015). Cost estimation in software engineering
projects with web components development. Dyna, 82(192), 266-275.
[30] Boehm, B., Clark, B., Horowitz, E.,Westland, C., Madachy, R., & Selby, R. (1995). Cost models for
future software life cycle processes: COCOMO 2.0. Annals of software engineering, 1(1), 57-94.
Authors
Gürkan Alpaslan received the B.S. degree in computer engineering from Istanbul University in 2012 and the
M.S. degree in computer engineering from Yıldız Technical University in 2015. He is currently a Ph.D.
student and works as a research assistant at the Computer Engineering Department of Yıldız Technical
University. His main research interests are software engineering and database systems.
Oya Kalıpsız received her M.S. degree in system analysis from Istanbul Technical University in 1984. She
received her Ph.D. degree from Istanbul University in 1989 with a study on Hospital Information Systems.
She is currently working as a member of the Software Quality Research Lab in the Computer Engineering
Department of Yildiz Technical University. Her main research interests are software engineering, database
systems, data mining, system analysis, and management information systems.