This paper presents findings from an evaluation study carried out to review the UAE national ID card software. The paper consults the relevant literature to explain many of the concepts and frameworks discussed herein. The findings of the evaluation work, which was primarily based on the ISO 9126 standard for system quality measurement, highlight many practical areas that, if taken into account, are argued to increase the chances of success of similar system implementation projects.
Contributors to Reduce Maintainability Cost at the Software Implementation Phase (Waqas Tariq)
Software maintenance is important and difficult to measure. Maintenance is the costliest of all the phases of software development. One of the most critical processes in software development is the reduction of software maintainability cost based on the quality of source code produced during the design step; however, there is a lack of quality models and measures that can help assess the quality attributes of the software maintainability process. Software maintainability suffers from a number of challenges, such as a lack of source code understanding, poor quality of software code, and weak adherence to programming standards during maintenance. This work describes model-based factors to assess software maintenance and explains the steps followed to obtain and validate them. Such a method can be used to reduce software maintenance cost. The research results will enhance the quality of the source code: they will increase software understandability, reduce maintenance time and cost, and give confidence for software reusability.
Survey Based Review of Elicitation Problems (IJERA Editor)
Any software development process is the combination of multiple development activities, and each activity has a vital role in the software development cycle. Requirements Engineering is the main and most basic branch of Software Engineering; it has many phases, but the initial phase is Requirements Elicitation, in which requirements are gathered for system development.
This paper provides a literature review of the requirements engineering processes performed in traditional and modern development processes and analyses the problems in the requirements elicitation phase. This problem analysis is based on a survey which was conducted at a university. A questionnaire posing questions regarding the problems in requirements elicitation was given to final-year computer science graduate students who were working on their final-year project as a requirement for their degree. The theoretical analysis of the questionnaire further clarifies the problems. This problem analysis will help to identify the main problems faced by prospective software developers.
DESQA: a Software Quality Assurance Framework (IJERA Editor)
In current software development lifecycles in heterogeneous environments, the pitfall businesses face is that software defect tracking, measurement, and quality assurance do not start early enough in the development process. In fact, the cost of fixing a defect in a production environment is much higher than in the initial phases of the Software Development Life Cycle (SDLC), which is particularly true for Service Oriented Architecture (SOA). Thus the aim of this study is to develop a new framework for defect tracking, detection, and quality estimation in the early stages, particularly the design stage, of the SDLC. Part of the objectives of this work is to conceptualize, borrow, and customize from known frameworks, such as object-oriented programming, to build a solid framework that uses automated rule-based intelligent mechanisms to detect and classify defects in the software design of SOA. The implementation part demonstrated how the framework can predict the quality level of the designed software. The results showed that a good level of quality estimation can be achieved based on the number of design attributes, the number of quality attributes, and the number of SOA Design Defects. The assessment shows that metrics provide guidelines to indicate the progress that a software system has made and the quality of its design. Using these guidelines, we can develop more usable and maintainable software systems to fulfill the demand for efficient software applications. Another valuable result coming from this study is that developers try to keep backwards compatibility when they introduce new functionality; sometimes they perform necessary breaking changes to the same newly-introduced elements in future versions. In that way they give their clients time to adapt their systems. This is a very valuable practice for developers because they have more time to assess the quality of their software before releasing it.
Future improvements to this research include the investigation of other design attributes and SOA Design Defects, which can be computed by extending the tests we performed.
STATE-OF-THE-ART IN EMPIRICAL VALIDATION OF SOFTWARE METRICS FOR FAULT PRONEN... (IJCSES Journal)
With the sharp rise in software dependability and failure cost, high quality has been in great demand. However, guaranteeing high quality in software systems, which have grown in size and complexity under the constraints imposed on their development, has become an increasingly difficult, time- and resource-consuming activity. Consequently, it becomes imperative to deliver software that has no serious faults. In this context, object-oriented (OO) products, being the de facto standard of software development, could with their unique features have some faults that are hard to find, or make the impacts of changes hard to pinpoint. The earlier faults are identified, found, and fixed, the lower the costs and the higher the quality. To assess product quality, software metrics are used, and many OO metrics have been proposed and developed. Furthermore, many empirical studies have validated the relationship between metrics and class fault proneness (FP). The challenge is which metrics are related to class FP and what activities are performed. Therefore, this study brings together the state-of-the-art in prediction of class FP utilizing CK and size metrics. We conducted a systematic literature review over relevant published empirical validation articles, and the results obtained are analysed and presented. The review indicates that 29 relevant empirical studies exist, and measures such as complexity, coupling, and size were found to be strongly related to FP.
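The metrics-to-FP relationship that such validation studies fit is typically a logistic model over per-class CK and size measures. The following minimal sketch illustrates that idea; the coefficient values are invented for illustration and are not taken from any of the reviewed studies.

```python
import math

# Hypothetical logistic-regression coefficients relating CK/size metrics
# to class fault proneness (FP). Real studies fit these from defect data.
COEFFS = {"wmc": 0.08, "cbo": 0.15, "dit": 0.05, "loc": 0.002}
INTERCEPT = -2.5

def fault_proneness(metrics):
    """Return the modeled probability that a class is fault-prone."""
    z = INTERCEPT + sum(COEFFS[name] * value for name, value in metrics.items())
    return 1.0 / (1.0 + math.exp(-z))

# Two example classes: one small and loosely coupled, one large and complex.
simple_cls = {"wmc": 5, "cbo": 2, "dit": 1, "loc": 120}
complex_cls = {"wmc": 40, "cbo": 18, "dit": 4, "loc": 2500}

p_simple = fault_proneness(simple_cls)
p_complex = fault_proneness(complex_cls)
assert p_simple < p_complex  # higher complexity/coupling/size -> higher FP
```

The direction of the coefficients reflects the review's finding that complexity (WMC), coupling (CBO), and size (LOC) are strongly related to FP; their magnitudes here are arbitrary.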
STUDY ON TECHNICAL FOCUSES AND SAMPLING COVERAGE STRATEGY OF AIRBORNE SOFTWAR... (ijseajournal)
Although airborne software is invisible and intangible, it can have a significant impact on the safety of the aircraft. It cannot be exhaustively tested, and can only be assured through a structured process-, activity-, and objective-based approach. This paper studied the similarities and differences of the software review policies of four selected National Airworthiness Authorities (NAAs) using a comparative approach, and analysed the general certification basis of specific regulation clauses, from the International Civil Aviation Organization Conventions to each contracting State's regulations, using a traceability method. It then analyzed the development processes and objectives applicable to different software levels based on RTCA/DO-178C. It identified 82 technical focus points based on each airborne software development sub-process, then created a Process Technology Coverage matrix to demonstrate the technical focuses of each process. It developed an objective-oriented, top-down and bottom-up sampling strategy for the four software Stage of Involvement (SOI) reviews, taking into account the frequency and depth of involvement. Finally, it created the Technology Objective Coverage matrix, which can support reviewers in performing efficient risk-based SOI reviews by considering the identified technical points, thus ensuring the safety of the aircraft from the software assurance perspective.
ITERATIVE AND INCREMENTAL DEVELOPMENT ANALYSIS STUDY OF VOCATIONAL CAREER INF... (ijseajournal)
The software development process presents various types of models, with corresponding phases required to be followed accordingly in the delivery of quality products and projects. Despite the varied expertise and skills of systems analysts, designers, and programmers, systems failure is inevitable when a suitable development process model is not followed. This paper focuses on the Iterative and Incremental Development (IID) model and justifies its role in the analysis and design of software systems. The paper adopted a qualitative research approach that justified and harnessed the relevance of IID in the context of systems analysis and design, using the Vocational Career Information System (VCIS) as a case study. The paper viewed IID as a change-driven software development process model. The results showed system specifications, functional specifications, and design specifications that can be used in implementing the VCIS using the IID model. Thus, the paper concluded that in systems analysis and design it is imperative to consider a suitable development process that reflects the engineering mind-set, with heavy emphasis on good analysis and design for quality assurance.
HYBRID PRACTICES IN GLOBAL SOFTWARE DEVELOPMENT: A SYSTEMATIC LITERATURE REVIEW (ijseajournal)
Although agile methods in their purest form fit several companies, it has been a challenge to perform them in environments with distributed teams developing large software applications. Contractual items, for projects under development for external organizations, introduce additional complexities for pure agile-based approaches. The majority of global teams and companies use hybrid development practices that combine different development methods and frameworks. This research provides results from an empirical field study on how hybrid practices are adopted in Global Software Development (GSD) projects. A systematic literature review was conducted to capture the status of combining agile with plan-driven methods in GSD projects. The results were limited to peer-reviewed conference papers or journal articles published between 2001 and 2020. The present study selected 37 papers from five different bibliographic databases. In the end, 16 practices used by GSD projects were summarized and described as hybrid. Based on the findings, the authors conclude that the contribution of this study is not only to identify how hybrid development practices are applied in GSD but also to give practitioners a basis for adapting their own development methods.
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU... (ijseajournal)
Rework is a known vicious circle in software development, since it plays a central role in the generation of delays, extra costs, and diverse risks introduced after software delivery. It eventually has a negative impact on the quality of the software developed. In order to address the rework issue, this paper goes in depth into the notion of rework in software development as it occurs in practice, by analysing a development process in an organisation in Mauritius where rework is a major issue. Meticulous strategies to reduce rework are then analysed and discussed. The paper ultimately leads to the recommendation of the best strategy, namely software configuration management, to reduce the rework problem in software development.
An interactive approach to requirements prioritization using quality factors (ijfcstjournal)
As the prevalence of software increases, so do the complexity and the number of requirements associated with a software project. This presents a dilemma for developers: to clearly identify and prioritize the most important requirements in order to deliver the project within a given amount of resources and time. A number of prioritization methods have been proposed which provide consistent results, but they are very difficult and complex to implement in practical scenarios, and they lack a proper structure for analyzing the requirements. In this study, the users can provide their requirements in two forms: text-based story form and use case form. Moreover, the existing prioritization techniques have very little or no interaction with the users. So, in this paper an attempt has been made to make the prioritization process user-interactive by adding a second level of prioritization: after the developer has properly analyzed and ranked the requirements on the basis of quality attributes in the first level, the opinions of distinct users about the requirements priority sequence are taken. The developer then calculates the disagreement value associated with each user sequence in order to find the final priority sequence.
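The paper's exact disagreement formula is not reproduced above, but a common way to quantify disagreement between a developer's ranking and a user's ranking is the number of requirement pairs ordered oppositely (the Kendall tau distance between the two sequences). A minimal sketch, assuming that definition:

```python
from itertools import combinations

def disagreement(dev_order, user_order):
    """Count requirement pairs ranked in opposite relative order by the
    developer and by one user (the Kendall tau distance between rankings).
    A lower value means the user's sequence agrees more with the developer's."""
    pos = {req: i for i, req in enumerate(user_order)}
    return sum(
        1
        for a, b in combinations(dev_order, 2)
        if pos[a] > pos[b]  # developer puts a before b, this user does not
    )

dev = ["R1", "R2", "R3", "R4"]   # developer's first-level ranking
users = [
    ["R1", "R2", "R3", "R4"],    # fully agrees: disagreement 0
    ["R2", "R1", "R3", "R4"],    # one swapped pair: disagreement 1
    ["R4", "R3", "R2", "R1"],    # fully reversed: disagreement 6
]
scores = [disagreement(dev, u) for u in users]  # [0, 1, 6]
```

The final priority sequence could then be chosen, for example, as the candidate sequence minimizing total disagreement across all users; the requirement names here are placeholders.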
A methodology to evaluate object oriented software systems using change requi... (ijseajournal)
It is a well-known fact that software maintenance plays a major role and finds importance in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change requirement traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues have been related to change impact algorithms and inheritance of functionality.
EVALUATION OF SOFTWARE DEGRADATION AND FORECASTING FUTURE DEVELOPMENT NEEDS I... (ijseajournal)
This article is an extended version of a previously published conference paper. In this research, JHotDraw (JHD), a well-tested and widely used open-source Java-based graphics framework developed with best software engineering practice, was selected as a test suite. Six versions of this software were profiled and data collected dynamically, from which four metrics, namely (1) entropy, (2) software maturity index, (3) COCOMO effort, and (4) COCOMO duration, were used to analyze software degradation and maturity level, and the obtained results were used as input to time series analysis in order to predict the effort and duration that may be needed for the development of future versions. The novel idea is that historical evolution data is used to project, predict, and forecast resource requirements for future developments. The technique presented in this paper will empower software development decision makers with a viable tool for planning and decision making.
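The four metrics named above have standard formulations: Shannon entropy over a distribution of changes, the IEEE 982.1 software maturity index, and the basic COCOMO effort and duration equations. The sketch below uses those standard forms with organic-mode COCOMO constants; the version data is made up, and the paper's actual measurement procedure may differ.

```python
import math

def entropy(change_counts):
    """Shannon entropy of the distribution of changes across modules."""
    total = sum(change_counts)
    probs = [c / total for c in change_counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def maturity_index(total_modules, added, changed, deleted):
    """IEEE 982.1 software maturity index: SMI = (MT - (Fa + Fc + Fd)) / MT."""
    return (total_modules - (added + changed + deleted)) / total_modules

def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Basic COCOMO with organic-mode constants: effort in person-months,
    duration in calendar months."""
    effort = a * kloc ** b
    duration = c * effort ** d
    return effort, duration

h = entropy([10, 5, 3, 2])          # changes spread over four modules
smi = maturity_index(100, 4, 6, 2)  # 0.88: version approaching stability
effort, months = cocomo_basic(32)   # a hypothetical 32 KLOC version
```

An SMI approaching 1 indicates a stabilizing product, while rising entropy across versions signals growing disorder; feeding per-version effort estimates into a time series model is then a separate forecasting step.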
A User Story Quality Measurement Model for Reducing Agile Software Developmen... (ijseajournal)
In the mobile communications age, the IT environment and IT technology update rapidly. Requirements change is a challenge every software project must face; by overcoming its impact, software development risks can be effectively reduced. Agile software development uses the Iterative and Incremental Development (IID) process and focuses on workable software and client communication. Agile software development is a very suitable development method for handling requirements change in the software development process. In agile development, user stories are important documents for client communication and criteria for acceptance testing. However, agile development does not pay attention to formal requirements analysis and artifact traceability, which causes potential risks in software change management. This paper analyzes and collects the critical quality factors of user stories and proposes the User Story Quality Measurement (USQM) model. By applying the USQM model, the requirements quality of agile development can be enhanced and the risks of requirements change reduced.
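A user-story quality measurement of this kind typically reduces to a weighted score over per-factor ratings. The sketch below is illustrative only: the factor names (loosely based on the INVEST criteria) and weights are invented, not the USQM model's actual factors.

```python
# Illustrative quality factors and weights (hypothetical, INVEST-style);
# the USQM paper's actual factor set and weighting may differ.
WEIGHTS = {
    "independent": 0.2,
    "negotiable": 0.1,
    "valuable": 0.25,
    "estimable": 0.2,
    "small": 0.1,
    "testable": 0.15,
}

def story_quality(ratings):
    """Weighted average of per-factor ratings, each in [0, 1]."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(WEIGHTS[f] * ratings[f] for f in WEIGHTS)

good_story = {f: 1.0 for f in WEIGHTS}                     # meets every factor
vague_story = dict(good_story, testable=0.2, estimable=0.3)

q_good = story_quality(good_story)    # 1.0
q_vague = story_quality(vague_story)  # 0.74: flags a change-risky story
```

Stories scoring below a chosen threshold would be reworked before entering an iteration, which is how such a measurement reduces downstream change risk.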
A Review on Quality Assurance of Component-Based Software System (iosrjce)
FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC... (ecij)
The transformative global economy has posed challenges to businesses in service management. In this computing age, the perceptual and operational edge of a business or organization manifests in the kind of technology it offers in service management. Organizations have long recognized the importance of managing key resources such as people and information. Information has now moved to its rightful place as a key resource in the organization, and its management can therefore be instituted by employing a methodology; to keep their brand promise, organizations have turned to technology. The number of new entrants to every sector of the economy has grown significantly in recent years, and each firm strives to make its daily operations efficient, so demand for business or application software is rising and organizations opt to build or buy this software. These new entrants have offered software developers the opportunity to translate business processes into systems. This study investigates waste in software development through the application of Lean principles. Like any conventional project, software becomes buggy and oftentimes fails. Software failure is often attributed to the software engineering rather than to the incompetence of project managers, the inadequacy of the people on the project, or the lack of a clear goal. The researchers' contention is that there are wastes in software development that serve as mechanisms and evidence for why software fails. Software failure is not attributed to the software alone; it also involves acceptance by clients and end-users. Descriptive secondary data analysis, participant observation, and Fishbone Analysis were the methodologies used in the study. The wastes identified include unfinished or partially done work, extra features, relearning, handoffs, delays, task switching, and defects.
An Elite Model for COTS Component Selection Process (IJEACS)
Component-based software development (CBD) promises the development of high-quality, trustworthy software systems within a specified budget and deadline. The selection of the most appropriate component based on specific requirements plays a vital role in a high-quality software product. The multi-agent software (MAS) engineering approach plays a crucial role in the selection of the most appropriate component based on a specific requirement in a distributed environment. In this paper, a multi-agent technique is used for component selection, and a semi-automated solution to COTS component selection is proposed. It is evident from the results that MAS plays an essential role and is suitable for component selection in a distributed environment, keeping in view the system design and testing strategies.
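At its core, a component-selection agent must rank COTS candidates against weighted requirements. The sketch below shows only that scoring step; the criteria, weights, and candidate scores are invented for illustration, and the paper's multi-agent negotiation protocol is not reproduced here.

```python
# Hypothetical selection criteria and weights an evaluating agent might use;
# each candidate is rated per criterion on a [0, 1] scale.
CRITERIA = {"functionality": 0.4, "reliability": 0.3, "cost_fit": 0.2, "support": 0.1}

candidates = {
    "ComponentA": {"functionality": 0.9, "reliability": 0.7, "cost_fit": 0.5, "support": 0.8},
    "ComponentB": {"functionality": 0.6, "reliability": 0.9, "cost_fit": 0.8, "support": 0.6},
}

def weighted_score(scores):
    """Weighted sum of a candidate's per-criterion ratings."""
    return sum(CRITERIA[c] * scores[c] for c in CRITERIA)

# Rank candidates from best to worst by weighted score.
ranking = sorted(candidates, key=lambda name: weighted_score(candidates[name]),
                 reverse=True)
best = ranking[0]  # "ComponentA" with these illustrative numbers
```

In a distributed MAS setting, each agent would compute such scores for its own requirement subset and the results would be combined, which is the part the paper's semi-automated solution addresses.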
Transitioning IT Projects to Operations Effectively in Public Sector: A Case... (ijmpict)
Operational effectiveness is measured by application availability to end-users and the extent to which they can conveniently use the application to perform their business functions. This paper demonstrates how varying the project transition process can affect operational effectiveness. This explanatory case study uses various projects in a South Australian government agency as the candidates for evaluation. Across the various applications that existed in the production environment, the end-users had varying levels of satisfaction. This research analyses the factors influencing the operational efficiency of projects in transition from project delivery into operations. The evidence clearly demonstrates the criticality of the process of transitioning applications from the project delivery phase to the operations phase. The research analyses the findings specific to government agencies and presents recommendations. These findings can be useful to public sector agencies for improving the availability of IT applications in operations.
A reliability estimation framework from the OO design complexity perspective has been developed in this paper. The proposed framework correlates object-oriented design constructs with complexity, and complexity with reliability. No framework available in the literature estimates the software reliability of an OO design by taking complexity into consideration; the proposed framework bridges the gap between object-oriented design constructs, complexity, and reliability. It measures and minimizes the complexity of a software design at an early stage of the software development life cycle, leading to a reliable end product. Following the proposed framework, a complexity estimation model that takes OO design constructs into consideration and reliability estimation models that take complexity into consideration for estimating the reliability of an OO design have been developed.
EVALUATION AND STUDY OF SOFTWARE DEGRADATION IN THE EVOLUTION OF SIX VERSIONS... (csandit)
When a software system evolves, new requirements may be added, existing functionalities modified, or structural changes introduced. During such evolution, disorder may be introduced, complexity increased, or unintended consequences introduced, producing ripple effects across the system. JHotDraw (JHD), a well-tested and widely used open-source Java-based graphics framework developed with best software engineering practice, was selected as a test suite. Six versions were profiled and data collected dynamically, from which two metrics were derived, namely entropy and the software maturity index. These metrics were used to investigate degradation as the software transitions from one version to another. This study observed that entropy tends to decrease as the software evolves. It was also found that a software product attains its lowest decrease in entropy at the turning point where its highest maturity index is attained, implying a possible correlation between the point of lowest decrease in entropy and the software maturity index.
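The two metrics named in this abstract have standard textbook formulations. As a hedged illustration (the study's exact operationalization over dynamically collected profile data may differ), a minimal Python sketch of both:

```python
import math

def software_maturity_index(total_modules, added, changed, deleted):
    """Software Maturity Index, SMI = (M_T - (F_a + F_c + F_d)) / M_T
    (IEEE Std 982.1): M_T is the number of modules in the current
    release; F_a, F_c, F_d are the modules added, changed, and deleted
    relative to the previous release.  SMI approaches 1.0 as the
    product stabilizes."""
    return (total_modules - (added + changed + deleted)) / total_modules

def shannon_entropy(change_frequencies):
    """Shannon entropy H = -sum(p_i * log2(p_i)) over the relative
    frequencies of change events; higher entropy indicates more
    disorder in the evolving system."""
    total = sum(change_frequencies)
    probs = [f / total for f in change_frequencies if f > 0]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical release data: 120 modules, of which 10 were added,
# 15 changed, and 5 deleted since the previous version.
print(software_maturity_index(120, 10, 15, 5))  # 0.75
print(shannon_entropy([1, 1, 1, 1]))            # 2.0
```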
Validation Item – Stating Requirements at the Adequate Level of Detail (Edwagney Luz)
Nowadays many software companies still suffer from low-quality software requirements specifications and need mechanisms to improve their requirements engineering activities. In this paper we discuss how we face this issue, presenting our STARS methodology. Its foundation relies on the concept of the Validation Item (VI); its steps and most significant benefits are highlighted.
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI... (ijaia)
Predicting the time to build software is a very complex task for software engineering managers. Complex factors can directly interfere with the productivity of the development team, and factors directly related to the complexity of the system to be developed drastically change the time needed to complete work in software factories. This work proposes the use of a hybrid system based on artificial neural networks and fuzzy systems to assist in the construction of a rule-based expert system that supports the prediction of the hours needed for software development, according to the complexity of the elements present in it. The set of fuzzy rules obtained by the system helps the management and control of software development by providing a base of interpretable, rule-based estimates. The model was tested on a real database, and its results were promising for building an aid mechanism for predicting software construction effort.
ANALYSIS OF SOFTWARE QUALITY USING SOFTWARE METRICS (ijcsa)
Software metrics have a direct link with measurement in software engineering. Correct measurement is a precondition in any engineering field, and software engineering is no exception: as the size and complexity of software increase, manual inspection of software becomes a harder task. Most software engineers worry about the quality of software and how to measure and enhance it. The overall objective of this study was to assess and analyze the software metrics used to measure the software product and process.
In this study, the researcher used a collection of literature from various electronic databases, available since 2008, to understand software metrics. The researcher identified that software quality is a means of measuring how software is designed and how well the software conforms to that design. Some of the variables considered for software quality are correctness, product quality, scalability, completeness, and absence of bugs. However, the quality standard used by one organization differs from that of others; for this reason it is better to apply software metrics to measure the quality of software, together with the current most common software metrics tools, to reduce subjectivity when assessing software quality. The central contribution of this study is an overview of software metrics that illustrates the development of this area, and a critical analysis of the main metrics found in the various literature sources.
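As a toy illustration of the kind of product metric such surveys cover (a naive sketch, not the algorithm of any specific metrics tool):

```python
def simple_size_metrics(source):
    """Two of the simplest product metrics discussed in metrics surveys:
    non-blank physical lines of code (LOC) and comment density.
    Naive: a real metric tool would parse the target language."""
    lines = [ln.strip() for ln in source.splitlines()]
    nonblank = [ln for ln in lines if ln]
    comments = [ln for ln in nonblank if ln.startswith("#")]
    loc = len(nonblank)
    return {"loc": loc,
            "comment_density": len(comments) / loc if loc else 0.0}

sample = "# add two numbers\ndef add(a, b):\n    return a + b\n"
print(simple_size_metrics(sample))  # loc = 3, comment_density = 1/3
```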
Defect Prevention Based on 5 Dimensions of Defect Origin (ijseajournal)
Discovering the unexpected is more important than confirming the known [7]. In software development, the "unexpected" relates to defects. These defects, when unattended, can cause failure of the product and risk to its users. The increasing dependency of society on software, and the crucial consequences that a failure can cause, make it necessary to find defects at their origin. Based on lessons learnt from an earlier set of projects, a defect framework highlighting the 5 Dimensions (Ds) of defect origin is proposed in this work. The defect framework is based on analyzing the defects that emerged from the various stages of software development: Requirements, Design, Coding, Testing, and Timeline (defects due to lack of time during development). This study is not limited to identifying the origin of defects at the various phases of software development; it also finds the reasons for such defects, and defect preventive (DP) measures are proposed for each type of defect. This work can help practitioners choose effective defect avoidance measures.
In addition to arriving at the defect framework, this work also proposes a defect injection metric based on the severity of the defect rather than just the defect count, which gives the number of adjusted defects produced by a project at its various phases. The defect injection metric value, once calculated, serves as a yardstick for comparing the improvements made in the software development process between similar sets of projects.
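The idea of a severity-adjusted defect count can be sketched as follows; the severity weights here are hypothetical, since the paper's actual weighting scheme is not reproduced in the abstract:

```python
# Hypothetical severity weights; the paper's actual scheme may differ.
SEVERITY_WEIGHT = {"critical": 5, "major": 3, "minor": 1}

def adjusted_defects(defects_by_phase):
    """Severity-weighted ('adjusted') defect count per development phase,
    rather than a raw defect count.
    defects_by_phase maps phase name -> {severity: count}."""
    return {
        phase: sum(SEVERITY_WEIGHT[sev] * n for sev, n in counts.items())
        for phase, counts in defects_by_phase.items()
    }

print(adjusted_defects({
    "Requirements": {"critical": 1, "minor": 4},  # 5*1 + 1*4 = 9
    "Coding":       {"major": 2, "minor": 3},     # 3*2 + 1*3 = 9
}))
```

A raw count would rank both phases at 5 defects apiece; the weighted view makes a requirements phase with one critical defect comparable to a coding phase with several lesser ones.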
MIDAS: A Design Quality Assessment Method for Industrial Software (Ganesh Samarthyam)
Siemens Corporate Development Center Asia Australia (CT DC AA) develops and maintains software applications for the Industry, Energy, Healthcare, and Infrastructure & Cities sectors of Siemens. The critical nature of these applications necessitates a high level of software design quality. A survey of software architects indicated a low level of satisfaction with existing design assessment practices in CT DC AA and highlighted several shortcomings of existing practices. To address this, we have developed a design assessment method called MIDAS (Method for Intensive Design ASsessments).
MIDAS is an expert-based method wherein manual assessment of
design quality by experts is directed by the systematic application
of design analysis tools through the use of a three view-model
consisting of design principles, project-specific constraints, and
an “ility”-based quality model. In this paper, we describe the
motivation for MIDAS, its design, and its application to three
projects in CT DC AA. We believe that the insights from our
MIDAS experience not only provide useful pointers to other
organizations and practitioners looking to assess and improve
software design quality but also suggest research questions for
the software engineering community to explore.
Today, in the software industry, there is no perfect framework available for analysis and software development. An enormous number of software development processes exist that can be implemented to stabilize the process of developing a software system, but no system has yet been recognized that helps software developers choose the best development process. This paper presents the framework of a skillful system combined with a Likert scale. With the help of the Likert scale we define a rule-based model, assign a mass score to every process, and develop a tool named MuxSet that helps software developers select an appropriate development process, which may enhance the probability of system success.
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT (ijseajournal)
Software product quality can be defined as the features and characteristics of the product that meet user needs. The quality of any software can be achieved by following a well-defined software process. This software process results in various metrics, such as project metrics, product metrics, and process metrics. Software quality depends on the process carried out to design and develop the software. Even though the process may be carried out with utmost care, it can still introduce errors and defects. Process metrics are very useful from a management point of view: they can be used to improve the software development and maintenance process, for defect removal, and for reducing the response time.
This paper describes the importance of capturing process metrics during the quality audit process and attempts to categorize them based on the nature of the error captured. To reduce such errors and defects, steps for corrective action are recommended.
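One widely used process metric of this kind is defect removal efficiency (DRE); a minimal sketch (the paper's own metric set may differ):

```python
def defect_removal_efficiency(found_before_release, found_after_release):
    """Defect Removal Efficiency, a common process metric:
    DRE = E / (E + D), where E is the number of defects found before
    release and D the number found after release.  Values near 1.0
    indicate an effective defect-removal process."""
    return found_before_release / (found_before_release + found_after_release)

# e.g. 90 defects caught in audits and testing, 10 escaped to the field:
print(defect_removal_efficiency(90, 10))  # 0.9
```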
PROBLEMS ARE THE GOLDEN EGGS
Problems? Day by day in our professional lives we face many problems, but we often do not recognize them, because we have become habituated to facing them. If we want to solve a problem, we must first acknowledge "yes, I am facing a problem"; only then do we have a chance to solve it. After that we should find out whether it is a repetitive problem or a new one, and on that basis take further steps: how to break it down, how to analyse it, how to find a countermeasure, how to check whether the countermeasure is suitable, and how to make it a standard. To learn more, go through my presentations.
This is my first presentation posted on SlideShare.
An interactive approach to requirements prioritization using quality factors (ijfcstjournal)
As the prevalence of software increases, so do the complexity and the number of requirements associated with a software project. This presents a dilemma for developers: how to clearly identify and prioritize the most important requirements in order to deliver the project within a given amount of resources and time. A number of prioritization methods have been proposed which provide consistent results, but they are very difficult and complex to implement in practical scenarios and lack a proper structure for analyzing the requirements. In this study, users can provide their requirements in two forms: text-based story form and use case form. Moreover, the existing prioritization techniques have very little or no interaction with the users. So, in this paper an attempt has been made to make the prioritization process user-interactive by adding a second level of prioritization: after the developer has properly analyzed and ranked the requirements on the basis of quality attributes in the first level, the opinions of distinct users about the requirements priority sequence are taken. The developer then calculates the disagreement value associated with each user sequence in order to find the final priority sequence.
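The abstract does not define the disagreement value; one plausible reading, sketched here purely as an assumption, is the number of requirement pairs that the developer's ranking and a user's ranking order differently (the Kendall tau distance):

```python
from itertools import combinations

def disagreement_value(developer_rank, user_rank):
    """Assumed disagreement value (the paper's exact definition is not
    given in the abstract): the number of requirement pairs ordered
    differently in the two priority sequences, i.e. the Kendall tau
    distance.  Both arguments are lists of requirement IDs, highest
    priority first."""
    pos_dev = {r: i for i, r in enumerate(developer_rank)}
    pos_usr = {r: i for i, r in enumerate(user_rank)}
    return sum(
        1
        for a, b in combinations(developer_rank, 2)
        if (pos_dev[a] - pos_dev[b]) * (pos_usr[a] - pos_usr[b]) < 0
    )

# The user sequence with the lowest total disagreement across all users
# could then drive the final priority sequence.
print(disagreement_value(["R1", "R2", "R3"], ["R2", "R1", "R3"]))  # 1
```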
A methodology to evaluate object oriented software systems using change requi... (ijseajournal)
It is a well-known fact that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating an object-oriented software system through a change-requirement-traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues relate to change impact algorithms and the inheritance of functionality.
EVALUATION OF SOFTWARE DEGRADATION AND FORECASTING FUTURE DEVELOPMENT NEEDS I... (ijseajournal)
This article is an extended version of a previously published conference paper. In this research, JHotDraw (JHD), a well-tested and widely used open-source Java-based graphics framework developed with best software engineering practice, was selected as a test suite. Six versions of this software were profiled and data collected dynamically, from which four metrics, namely (1) entropy, (2) the software maturity index, (3) COCOMO effort, and (4) COCOMO duration, were used to analyze software degradation and maturity level, and the obtained results were used as input to time series analysis in order to predict the effort and duration that may be needed for the development of future versions. The novel idea is that historical evolution data is used to project, predict, and forecast resource requirements for future developments. The technique presented in this paper will empower software development decision makers with a viable tool for planning and decision making.
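The COCOMO effort and duration metrics mentioned here have well-known closed forms in the basic model; a sketch with the standard organic-mode coefficients (the study's calibration may differ):

```python
def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Basic COCOMO: effort E = a * KLOC**b (person-months) and
    duration D = c * E**d (months).  The defaults are the standard
    organic-mode coefficients; semi-detached and embedded projects
    use different values."""
    effort = a * kloc ** b
    duration = c * effort ** d
    return effort, duration

# e.g. a hypothetical 32 KLOC release:
effort, duration = cocomo_basic(32)
print(f"{effort:.1f} person-months over {duration:.1f} months")
```

Estimates computed this way for each historical version can then be fed into a time-series model to forecast the resources needed for the next version, as the paper proposes.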
A User Story Quality Measurement Model for Reducing Agile Software Developmen... (ijseajournal)
In the mobile communications age, the IT environment and IT technology are updated rapidly. Requirements change is a challenge every software project must face; by overcoming the impact of requirements change, software development risks can be effectively reduced. Agile software development uses the Iterative and Incremental Development (IID) process and focuses on workable software and client communication, making it a very suitable development method for handling requirements change in the software development process. In agile development, user stories are the important documents for client communication and the criteria for acceptance testing. However, agile development does not pay attention to formal requirements analysis and artifact traceability, which causes potential risks in software change management. This paper analyzes and collects the critical quality factors of user stories and proposes the User Story Quality Measurement (USQM) model. By applying the USQM model, the requirements quality of agile development can be enhanced and the risks of requirements changes reduced.
A Review on Quality Assurance of Component-Based Software System (iosrjce)
IOSR Journal of Computer Engineering (IOSR-JCE) is a double-blind peer-reviewed international journal that provides rapid publication (within a month) of articles in all areas of computer engineering and its applications. The journal welcomes high-quality papers on theoretical developments and practical applications in computer technology. Original research papers, state-of-the-art reviews, and high-quality technical notes are invited for publication.
FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC... (ecij)
The transformative global economy poses challenges to businesses in service management. In this computing age, the perceptual and operational edge of a business or organization is manifested in the kind of technology it offers in service management. Organizations have long recognized the importance of managing key resources such as people and information. Information has now moved to its rightful place as a key resource in the organization, and its management can therefore be instituted by employing a methodology; to keep their brand promise, organizations have turned to technology. The number of new entrants to every sector of the economy has grown significantly in recent years, and each firm strives to make its daily operations efficient, so demand for business or application software is getting higher and organizations opt to build or buy this software. These new entrants have offered software developers the opportunity to translate business processes into systems. This study investigates waste in software development through the application of Lean principles. Like any conventional project, software becomes buggy and oftentimes fails. Software failure is usually attributed to the software engineering, not to the incompetence of project managers, the inadequacy of the people on the project, or the lack of a clear goal. The researchers' contention is that there are wastes in software development, which serve as a mechanism and evidence for why software fails. Software failure is not attributed to the software alone; it also includes acceptance by clients and end-users. Descriptive secondary data analysis, participant observation, and Fishbone Analysis were the methodologies used in the study. The wastes identified include unfinished or partially done work, extra features, relearning, handoffs, delays, task switching, and defects.
An Elite Model for COTS Component Selection Process (IJEACS)
Component-based software development (CBD) promises the development of high-quality, trustworthy software systems within a specified budget and deadline. The selection of the most appropriate component for a specific requirement plays a vital role in a high-quality software product. A multi-agent software (MAS) engineering approach plays a crucial role in selecting the most appropriate component for a specific requirement in a distributed environment. In this paper, a multi-agent technique is used for component selection, and a semi-automated solution to COTS component selection is proposed. It is evident from the results that MAS plays an essential role and is suitable for component selection in a distributed environment, keeping in view the system design and testing strategies.
AN IMPROVED REPOSITORY STRUCTURE TO IDENTIFY, SELECT AND INTEGRATE COMPONENTS... (ijseajournal)
An ultimate goal of software development is to build high-quality products. The customers of the software industry always demand high-quality products, delivered quickly and cost-effectively. Component-based development (CBD) is the most suitable methodology for software companies to meet the demands of the target market. To adopt CBD, software development teams have to customize generic components that are available in the market, and it is very difficult for development teams to choose suitable components from the millions of third-party and commercial off-the-shelf (COTS) components. On the other hand, the development of an in-house repository is tedious and time-consuming. In this paper, we propose an easy and understandable repository structure that provides helpful information about stored components, namely how to identify, select, retrieve, and integrate them. The proposed repository will also provide previous assessments by developers and end-users of the selected component. It will help software companies by reducing customization effort, improving the quality of the developed software, and preventing the integration of unfamiliar components.
An Empirical Study of SQA Function Effectiveness in CMMI Certified Companies ... (zillesubhan)
The most vital attribute of any software development process is quality, as it ensures the reliability and effectiveness of new software. Software Quality Assurance (SQA) techniques, as well as a standardized maturity framework known as Capability Maturity Model Integration (CMMI), are used to ensure this quality. The purposes of both practices are the same, as both work toward end-product quality. In spite of this, CMMI-certified organizations that have an SQA function still face many issues that lower the quality of their products. Standards usually emphasize documentation, whereas SQA considers testing the chief element and uses documentation only for authentication and appraisals. The relationship of the SQA function with CMMI has not received much attention in the literature. This paper is centered on an investigation conducted through data collection from diverse CMMI-certified software development firms to examine the practice of the SQA function.
Although there has been extensive study of delivering, increasing, and maintaining software quality, there has not been enough attention to rating a software product's quality. This study surveys the literature so far and also outlines the scope of, and need for, the evolution of a rating system for software quality in the future.
Syed Zaffar Iqbal, Prof. Urwa Javed and Dr. Shakeel Ahmed Roshan. Department of Computer Science, Alhamd Islamic University, Pakistan. “Software Quality Assurance Model for Software Excellence with Its Requirements” United International Journal for Research & Technology (UIJRT) 1.1 (2019): 39-43.
A novel defect detection method for software requirements inspections IJECEIAES
Requirements form the basis of all software products, yet they are often imprecisely stated when scattered between development teams. As a result, software applications are released with bugs, missing functionality, or loosely implemented requirements. In the literature, only a limited number of related works have been developed as tools for software requirements inspections. This paper presents a methodology to verify that the system design fulfils all functional requirements. The proposed approach contains three phases: requirements collection, facts collection, and a matching algorithm. The feedback results enable analysts and developers to make a decision about the initial application release while taking into consideration missing requirements or over-designed requirements.
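The matching phase might, for instance, be approximated with simple word-overlap scoring; this is only an illustrative sketch (the threshold, sample requirements and design facts are invented), not the paper's actual algorithm:

```python
def match(requirements, design_facts, threshold=0.5):
    """Match each requirement to design facts by word overlap.

    Returns (covered, missing, over_designed). Inputs are plain strings;
    a real inspection tool would use richer NLP matching than this.
    """
    def words(s):
        return set(s.lower().split())

    covered, missing, used = [], [], set()
    for req in requirements:
        hits = [f for f in design_facts
                if len(words(req) & words(f)) / len(words(req)) >= threshold]
        if hits:
            covered.append(req)
            used.update(hits)
        else:
            missing.append(req)  # requirement with no supporting design fact
    # facts no requirement asked for: candidates for "over-designed"
    over_designed = [f for f in design_facts if f not in used]
    return covered, missing, over_designed

reqs = ["user can reset password", "system logs failed logins"]
facts = ["password reset via email for user", "dashboard shows weather widget"]
covered, missing, over = match(reqs, facts)
```

Even this crude scheme produces the two feedback signals the abstract describes: requirements the design misses, and design elements no requirement justifies.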
A Novel Method for Quantitative Assessment of Software QualityCSCJournals
This paper deals with a quantitative quality model that needs to be practiced formally throughout the software development life cycle, at each phase. The proposed quality model emphasizes that various stakeholders need to be consulted for quality requirements. Quality goals are set through various measurements and metrics, and the software under development is evaluated against the expected values of a set of metrics. The use of the proposed quantitative model is illustrated through a simple case study. The quality attribute of reusability, unaddressed in ISO 9126, is also discussed.
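Evaluating software against expected metric values can be sketched as a simple threshold check; the metric names and targets below are hypothetical examples, not the paper's case-study data:

```python
# Each metric: (measured value, expected/target value, higher_is_better)
metrics = {
    "defect_density":   (0.80, 1.00, False),  # defects per KLOC, lower is better
    "test_coverage":    (0.72, 0.80, True),   # fraction of code exercised
    "mean_response_ms": (180, 200, False),    # average response time
}

def evaluate(metrics):
    """Return the metrics that miss their target; empty list means all goals met."""
    failures = []
    for name, (value, target, higher_is_better) in metrics.items():
        ok = value >= target if higher_is_better else value <= target
        if not ok:
            failures.append(name)
    return failures

failures = evaluate(metrics)
```

Running such a check at each phase gate is what makes the quality model quantitative: a release decision reduces to whether `failures` is empty.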
Insights on Research Techniques towards Cost Estimation in Software Design IJECEIAES
Software cost estimation is one of the most challenging tasks in project management, needed to ensure smooth development operations and target achievement. Various standards, tools and techniques for cost estimation have evolved and are practiced in industry at present. However, the overall effectiveness of such techniques has not been comprehensively investigated to date. This paper begins its contribution by presenting a taxonomy of conventional cost-estimation techniques and then investigates the research trends around the problems most frequently addressed. The paper also reviews the existing techniques in a well-structured manner to highlight the problems addressed, the techniques used, the advantages associated with them, and the limitations explored in the literature. Finally, we outline the open research issues as an added contribution of this manuscript.
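As one concrete example of a conventional technique such a taxonomy would cover, the Basic COCOMO model estimates effort from size alone as effort = a · KLOC^b. The sketch below uses Boehm's published coefficients for an "organic" (small, in-house) project; the 32-KLOC project size is a hypothetical input:

```python
# Basic COCOMO: effort in person-months = a * KLOC ** b.
# Coefficients for an "organic" project class (Boehm).
A, B = 2.4, 1.05

def cocomo_effort(kloc: float) -> float:
    """Estimate development effort in person-months from size in KLOC."""
    return A * kloc ** B

effort = cocomo_effort(32)  # hypothetical 32-KLOC project
```

The slightly super-linear exponent (b > 1) encodes the classic observation that effort grows faster than code size; later models in the taxonomy (Intermediate COCOMO, COCOMO II) refine this with cost drivers.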
IT PROJECT SHOWSTOPPER FRAMEWORK: THE VIEW OF PRACTITIONERSijseajournal
The study intended to unravel critical IT project showstoppers, which tend to halt IT projects temporarily or permanently and ultimately cause them to fail, by positioning them in the systems development life cycle (SDLC) framework. By interviewing 8 IT project and program managers from the banking and telecommunications industries in Ghana, individually and in a group, 19 critical showstoppers were identified spanning the whole SDLC. Generally, it was observed that for the successful completion of IT projects, the expertise and availability of project managers and team members are critical. Second, the project manager must be able to prove that the project is in line with the objectives and strategic direction of the business, is being mounted to gain competitive advantage, and has a solid business case. Third, funding is key at all stages of the cycle, as is approval for continuation at various stages.
FROM THE ART OF SOFTWARE TESTING TO TEST-AS-A-SERVICE IN CLOUD COMPUTINGijseajournal
Researchers consider that the first edition of the book "The Art of Software Testing" by Myers (1979)
initiated research in Software Testing. Since then, software testing has gone through evolutions that have
driven standards and tools. This evolution has accompanied the complexity and variety of software
deployment platforms. The migration to the cloud allowed benefits such as scalability, agility, and better
return on investment. Cloud computing requires more significant involvement in software testing to ensure
that services work as expected. In addition to testing cloud applications, cloud computing has paved the
way for testing in the Test-as-a-Service model. This review aims to understand software testing in the
context of cloud computing. Based on the knowledge explained here, we sought to linearize the evolution of
software testing, characterizing fundamental points and allowing us to compose a synthesis of the body of
knowledge in software testing, expanded by the cloud computing paradigm.
Human safety in the Middle East is a crucial concern, especially when working on mission-critical systems, where any trivial error may result in dangerous casualties and the loss of innocent lives. The main objective of this paper is to introduce a complete study of a system that automates the currently adopted manual process of having dedicated personnel control the barriers at railway crossings when trains pass, the aim being to reduce the human errors that manual control makes possible. The study aims to provide a robust solution that adheres to a formal, systematic and new procedure to enhance the overall quality of requirements gathered for critical systems. In addition, it reflects how effective the use of goal-oriented modelling in the requirements elicitation stage is for critical systems, to define a clear scope and validate requirements against any missing, inconsistent or vague requirements at an early stage.
Previous research concludes that testing plays a vital role in the development of a software product. As software testing is the principal approach to assuring the quality of software, most development effort is put into testing. But software testing is an expensive process and consumes a lot of time, so it should start as early as possible in development to control cost and schedule. Indeed, testing should be performed at every step of the software development life cycle (SDLC), the structured approach used in developing a software product. Software testing is a trade-off between budget, time and quality. Nowadays, testing has become a very important activity in terms of exposure, security, performance and usability. Hence, software testing faces a collection of challenges.
PRODUCT QUALITY EVALUATION METHOD (PQEM): TO UNDERSTAND THE EVOLUTION OF QUAL...ijseajournal
Promoting quality within the context of agile software development is extremely important as well as
useful to improve not only the knowledge and decision-making of project managers, product owners, and
quality assurance leaders but also to support the communication between teams. In this context, quality
needs to be visible in a synthetic and intuitive way in order to facilitate the decision of accepting or
rejecting each iteration within the software life cycle. This article introduces a novel solution called
Product Quality Evaluation Method (PQEM) which can be used to evaluate a set of quality characteristics
for each iteration within a software product life cycle. PQEM is based on the Goal-Question-Metric
approach, the standard ISO/IEC 25010, and the extension made of testing coverage in order to obtain the
quality coverage of each quality characteristic. The outcome of PQEM is a unique multidimensional value,
that represents the quality level reached by each iteration of a product, as an aggregated measure. Even
though a single value is not the usual way of measuring quality, we believe it can be useful for easily
understanding the quality level of each iteration. An illustrative example of the PQEM method
was carried out with two iterations from a web and mobile application, within the healthcare environment.
A single measure makes it possible to observe the evolution of the quality level reached as the product
evolves through its iterations.
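An aggregation of per-characteristic quality coverage into one value could look like the following sketch. The characteristic names are taken from ISO/IEC 25010, but the coverage figures, weights, and weighted-average scheme are illustrative assumptions, not the actual PQEM computation:

```python
# Quality coverage per ISO/IEC 25010 characteristic for one iteration,
# each in [0, 1]; weights reflect stakeholder priorities (hypothetical data).
coverage = {"functional_suitability": 0.9, "reliability": 0.8,
            "usability": 0.7, "security": 0.6}
weights  = {"functional_suitability": 0.4, "reliability": 0.3,
            "usability": 0.2, "security": 0.1}

def aggregate(coverage, weights):
    """Collapse per-characteristic coverage into one quality value in [0, 1]."""
    total = sum(weights.values())
    return sum(coverage[c] * w for c, w in weights.items()) / total

q = aggregate(coverage, weights)  # one number per iteration, comparable over time
```

Because each iteration yields one number on the same scale, plotting `q` over successive iterations gives exactly the kind of quality-evolution view the method aims for.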
The book highlights the role of the data economy in supporting national economic systems and in guiding decisions and policies across various sectors. It offers a set of recommendations for developing regulatory policies and infrastructure, supporting innovation, and encouraging private-sector growth and entrepreneurship.
The book highlights the most important dimensions of government digital transformation and presents general frameworks and concepts for designing and developing service systems based on value creation and the achievement of social and economic growth.
The book addresses the changes that modern technologies have imposed on concepts associated with money, and the role of digital currencies in shaping the future of global markets.
It also discusses the major transformations in the new global economy; the role of currency in the economy; the problems facing currencies in general and paper currencies in particular; electronic payments and currencies; and the concepts of government, commercial, virtual, crypto and stable digital currencies, in addition to the advantages and risks of cryptocurrencies, the volatile values of commercial currencies, and the promotion they receive.
The book also reviews international experiences with digital currencies, Arab experiences in this field, government positions on cryptocurrencies, Facebook's currency, and the future of digital currencies and digital payments, in addition to the role of blockchain technology in securing financial transactions.
It sets out an expected operating model for government digital currencies, addressing 9 key characteristics required of any successful digital currency system, as well as 8 essential components of an operating model for government digital currencies, with an explanation of how the model works.
Among the book's most important recommendations is that the Arab world should unite to study and establish a digital currency according to a well-researched, collective scientific approach in which the governing priority is consensus and integration.
The book indicates that intellectual development in institution building is one of the most pivotal dimensions on the path towards sustainable societies and economies. It shows that for institutions to reach the top and achieve excellence, they must form a precise understanding of the added value the institution creates before examining the how and the means, and that the capacity for innovation, change and modernization is what provides the driving force for an institution's excellence and sustainability.
The booklet addresses several essential themes that would enhance the contributions of young leaders in their institutions, foremost among them: the concept of the summit in institutional work; the areas in which institutions can reach the top; the tools for reaching the top and staying there; the concepts of excellent management; the fundamentals of human resource management; how to benefit from technology; the developments the future holds in the new fields of the Fourth Industrial Revolution; the way to reach the top; remaining at the top, which requires continuous learning; and how to sustain excellence through an integrated, continually self-renewing approach.
The book addresses some of the factors now driving major transformations in the global economic system, and reviews international institutions' forecasts for the performance of the global economy.
It also sets out a number of expected features of the new economic system in the post-COVID-19 period.
The book, issued by the Council of Arab Economic Unity of the League of Arab States under the title "The Digital Economy and its Role in Strengthening National Security", addresses the opportunities the digital economy holds and calls for focusing on the digital economy as a strategic development component for building the foundations of economic security.
The study highlights the effects of the revolutions and unrest in Arab countries with an attempt to provide an overview of Arab present and its prospects. It primarily recommends the adoption and employment of advanced technologies in reconstruction efforts and supporting the development of resilient and sustainable economies.
The book is designed to promote understanding of conflicts in organisations, and establish how they can be handled effectively, and make them work as opportunities for improvement and constructive change.
A brief study by the Council of Arab Economic Unity of the League of Arab States on the effects of the COVID-19 pandemic crisis on Arab countries, offering recommendations to decision-makers and policymakers for dealing with the repercussions of the crisis.
The book points out that the digital future is exposed to the danger of chaotic, unregulated growth, which poses a challenge for countries that still operate according to traditional economic models. Public thinking in the Arab region still follows a "reaction methodology" of temporary solutions with short-term horizons, as the region's current international competitiveness indicators confirm. To address this, the book proposes that visions and efforts be based on strategies driven by scientific methods, and that Arab countries develop a clear understanding of the main challenges before jumping to seize opportunities.
The book shows that it is fundamental for policymakers and decision-makers to have precise and accurate understanding of the intricate details in digital transformation initiatives and the role that modern technologies can play in changing the rules and systems of current practices, and in how to develop digitized, more innovative business models with which to build resilient and sustainable social economies and systems.
The book also draws on current data and indicators of the global economy, which form a worrying picture of weaknesses in Arab countries that may, in turn, threaten the stability of the entire region, particularly with regard to "cognitive decline", "increasing unemployment rates" and "poor economic performance". These challenges should be treated as key strategic indicators that require urgent action plans, with the emphasis that such plans must be designed to reflect different ways of thinking, adapted to the nature of the requirements and challenges of the 21st century, and to treat those challenges as positive driving forces.
The book highlights the importance of accelerating the implementation of a set of initial reform projects to encourage the development of more dynamic and developed digital business environments in the Arab region, in parallel with the development of educational systems and healthcare, and strengthening agricultural capabilities to achieve food security targets, and focus on economies based on industry and production, and promoting the development of Arab digital platforms to support e-commerce practices.
The book analyzes the current state of digital transformation projects in Arab governments and the four stages of transformation in the government system, along with the most important obstacles and challenges that hold back progress, and offers some brief proposed solutions in this context.
It explains that concepts of industry are now driving applications that enable machines to communicate with one another through electronic networks and to make decentralized decisions at levels exceeding human capabilities. This is also driving the emergence of institutional operating models resembling "smart factories", characterized by automated capacities for self-development, handling change and sustained learning, and by the ability to develop service and production systems with unprecedented levels of efficiency, effectiveness and performance. Institutions in the Arab region no longer have any choice but to keep pace with progress in implementing digital transformation projects.
It is worth noting that, over the past two decades, the Arab region and many countries around the world have witnessed thousands of initiatives and projects in the field of digital transformation, aimed at supporting their capacities as governments responsible for making many decisions and delivering services, driven by the vision of these countries' leaderships to turn government services into electronic and smart services.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply applying machine learning to just any symbolic structure is not sufficient to really harvest the gains of NeSy. Those gains will only be realized when the symbolic structures have an actual semantics. I give an operational definition of semantics as "predictable inference".
All of this is illustrated with link prediction over knowledge graphs, but the argument is general.
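A toy illustration of "predictable inference": if a relation such as subclass_of is declared transitive, the links it licenses can be derived mechanically rather than guessed. The triples below are invented examples, not drawn from the talk:

```python
# Triples over a relation whose declared semantics licenses inference:
# subclass_of is transitive, so new links are *predictable*, not guessed.
triples = {("cat", "subclass_of", "mammal"),
           ("mammal", "subclass_of", "animal")}

def infer(triples, relation="subclass_of"):
    """Compute the transitive closure of one relation over the triple set."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, r1, b) in list(inferred):
            for (c, r2, d) in list(inferred):
                if r1 == r2 == relation and b == c:
                    t = (a, relation, d)
                    if t not in inferred:
                        inferred.add(t)
                        changed = True
    return inferred

closed = infer(triples)
```

A purely statistical link predictor might or might not propose ("cat", "subclass_of", "animal"); with an actual semantics attached to the relation, that link follows necessarily.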
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation, however, takes considerable work: it takes vision, leadership and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities, spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Using Quality Models to Evaluate National ID Systems: the Case of the UAE
1. International Journal of Social Sciences Volume 1 Number 2
Using Quality Models to Evaluate National ID
Systems: the Case of the UAE
Ali M. Al-Khouri
ISO 9126 framework. The next section presents the methods
Abstract—This paper presents findings from the evaluation study employed to obtain data based on which the system was
carried out to review the UAE national ID card software. The paper evaluated. The next few sections provide an overview of the
consults the relevant literature to explain many of the concepts and PRIDC system, its components, its development lifecycle
frameworks explained herein. The findings of the evaluation work
approach, results obtained from the previous tests, and
that was primarily based on the ISO 9126 standard for system quality
measurement highlighted many practical areas that if taken into mapping these latter set of data to ISO 9126 quality attributes.
account is argued to more likely increase the success chances of The paper is then concluded with some reflections on the
similar system implementation projects. areas that need to be considered when pursing similar
evaluation studies with a focus on national ID systems.
Keywords—National ID system, software quality, ISO 9126.
II. SOFTWARE QUALITY
I. INTRODUCTION It is becoming a common trend for IT projects to fail. The
T HE United Arab Emirates (UAE) have recently initiated a
national ID scheme that encompasses very modern and
sophisticated technologies. The goals and objectives of the
rate of failure in government projects is far higher than those
in the private industry. One of the main causes for such
failures was widely quoted in the literature to be related to
UAE national ID card programme go far beyond introducing a poor user requirements resulting in a system that does not
new ID card document and homeland security [1]. To deliver what was expected from it (see also the statistics
increase its success, the government is pushing for many presented in Figure 1 from the recent Standish Group study).
innovative applications to explore ‘what can be done with the
card’. Examples of such possible applications of the card Standish Study Results:
ranges from using it as a physical identity document to prove
identity, to linking it to a wide range of government services,
with the vision of replacing all existing identity documents
(e.g., driving licence, labour card, health card, etc.) with this
new initiative. From such perspectives, it becomes critical
that such systems maintain a high level of quality. Quality
models can play a useful role as tools for quality requirements
engineering as well as for quality evaluation, since they define
how quality can be measured and specified [2]. In fact, the
literature reveals that the use of quality frameworks and
models may well contribute to project success, as they enable
the early detection and addressing of risks and issues of
concern at an early stage of the project (see for example
[3],[4],[5]). This paper attempts to provide a short evaluation
of the population register software (referred to in this paper
as PRIDC – population register and ID card) implemented as
part of the national ID card project in the UAE, to pinpoint
areas of possible improvement.

51% of projects failed; 31% were partially successful.
Failure causes: 13.1% incomplete requirements; 12.4% lack
of user involvement; 10.6% inadequate resources; 9.9%
unrealistic user expectations; 9.3% lack of management
support; 8.7% requirements keep changing; 8.1% inadequate
planning; 7.5% system no longer needed.
Fig. 1 Standish group study results

The CHAOS survey of 8000+ projects found that of the
eight main reasons given for project failures, five are
requirements related. Getting the requirements right is
probably the single most important thing that can be done to
achieve customer satisfaction. Figure 2 depicts further
reasons for such failures [6]. Many of these failures, it is
argued, could have been prevented with requirements
verification and the adoption of quality assurance frameworks
[4],[7].

The paper is structured as follows. The first section
provides brief background information about the concept of
software quality and measurement standards, with focus on
Manuscript received March 27, 2007.
Ali M. Al-Khouri is with the Emirates Identity Authority, Abu Dhabi,
United Arab Emirates (phone: +97150-613-7020; fax: +9712-404-6661;
e-mail: alkhouri@emiratesid.ae).
International Journal of Social Sciences Volume 1 Number 2
Fig. 2 Causes of faults during development (source: adapted from
Pfleeger, 2001)
A. Quality Measurement Standards
In general terms, there are three different approaches to
system quality assurance:
1. Product certification: an independent party (or a QA
company) conducts a limited exercise in verification,
validation and/or testing of the software components.
2. Process audit: an independent party conducts an
assessment of the development process used to design, build
and deliver the software component.
3. User satisfaction: analysis of the actual behaviour of the
software.
Since the objective of the evaluation study in this paper is
to judge whether the implemented system has met the
requirements of product quality, the third category was
defined as the boundary for the evaluation carried out in this
study.
Software quality assessment is attracting great attention as
the global drive for systemic quality assurance continues to
gather momentum (e.g., pressures of consolidations, mergers
and downsizing, and the emergence of new technologies) [8].
One of the earliest works in the field of software quality
assessment was done by B. Boehm and associates at TRW [9]
and incorporated by McCall and others in the Rome Air
Development Center (RADC) report [10]. The quality models
at the time focused on the final product and on the
identification of the key attributes of quality from the user's
point of view. The assessment framework was later improved,
consisting of quality attributes related to quality factors,
which were decomposed into particular quality criteria that
lead to quality measures (see Figure 3).
Attempted standardisation work over the intervening years
resulted in the Software Product Evaluation Standard,
ISO 9126 (ISO/IEC, 1991). This model was fairly closely
patterned after the original Boehm structure, with six primary
quality attributes subdivided into 27 sub-characteristics, as
illustrated in Figure 4.
Fig. 3 Boehm quality model

However, the standard was criticised for providing very
general quality models and guidelines that are difficult to
apply to specific domains such as components and CBSD
(see for example [11],[12]). Others believe this to be in fact
one of its strengths, as it makes the model more adaptable
and usable across many systems [13],[14]. To address this
concern, ISO/IEC 9126 has been revised to include a new
quality model which distinguishes between three different
approaches to product quality:
(1) External quality metrics (ISO TR 9126-2): a result of
the combined behaviour of the software and the computer
system, which can be used to validate the internal quality of
the software;
(2) Internal quality metrics (ISO TR 9126-3): a quantitative
scale and measurement method which can be used for
measuring an attribute or characteristic of a software product;
(3) Quality in use metrics (ISO TR 9126-4): the
effectiveness, productivity and satisfaction of the user when
carrying out representative tasks in a realistic working
environment. They can be used to measure the degree of
excellence, and to validate the extent to which the software
meets user needs.
Figure 5 depicts the relationship between these approaches.

Fig. 5 Relationship between internal quality, external quality and
quality in use
The six ISO/IEC 9126 characteristics and their
sub-characteristics are:
Functionality (are the required functions available in the
software?): suitability, accuracy, interoperability, security,
functionality compliance.
Reliability (how reliable is the software?): maturity, fault
tolerance, recoverability, reliability compliance.
Usability (is the software easy to use?): understandability,
learnability, operability, attractiveness, usability compliance.
Efficiency (how efficient is the software?): time behaviour,
resource utilisation, efficiency compliance.
Maintainability (how easy is it to modify the software?):
analysability, changeability, stability, testability,
maintainability compliance.
Portability (how easy is it to transfer the software to another
environment?): adaptability, installability, co-existence,
replaceability, portability compliance.

Fig. 4 ISO/IEC 9126 standard characteristics
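The characteristic hierarchy of Fig. 4 lends itself to a simple
machine-readable checklist for recording evaluation findings. The sketch
below is purely illustrative: the dictionary layout and the
pass/partial/fail rating scale are our assumptions, not part of ISO 9126.
It rolls sub-characteristic ratings up pessimistically, on the view that
a characteristic is only as strong as its weakest sub-characteristic:

```python
# The 6 characteristics and 27 sub-characteristics of ISO/IEC 9126,
# as listed in Fig. 4.
ISO_9126 = {
    "Functionality": ["Suitability", "Accuracy", "Interoperability",
                      "Security", "Functionality compliance"],
    "Reliability": ["Maturity", "Fault tolerance", "Recoverability",
                    "Reliability compliance"],
    "Usability": ["Understandability", "Learnability", "Operability",
                  "Attractiveness", "Usability compliance"],
    "Efficiency": ["Time behaviour", "Resource utilisation",
                   "Efficiency compliance"],
    "Maintainability": ["Analysability", "Changeability", "Stability",
                        "Testability", "Maintainability compliance"],
    "Portability": ["Adaptability", "Installability", "Co-existence",
                    "Replaceability", "Portability compliance"],
}

def worst(ratings):
    """Roll sub-characteristic ratings up to their characteristic:
    a characteristic is only as good as its weakest sub-characteristic.
    The three-value scale is an illustrative assumption."""
    order = {"fail": 0, "partial": 1, "pass": 2}
    return min(ratings, key=order.__getitem__)

# Hypothetical ratings for two Usability sub-characteristics.
ratings = {"Understandability": "fail", "Learnability": "partial"}
print(worst(ratings.values()))  # -> fail
```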
In brief, internal metrics measure the software itself,
external metrics measure the behaviour of the computer-based
system that includes the software, and quality in use metrics
measure the effects of using the software in a specific context
of use. Appropriate internal attributes of the software are
prerequisites for achieving the required external behaviour,
whereas external behaviour is a prerequisite for achieving
quality in use (see also Figure 6).

Fig. 6 Approaches to software product quality

It is also worth mentioning that a new project, called
SQuaRE – Software Product Quality Requirements and
Evaluation (ISO/IEC 25000, 2005) – was launched to replace
the above, while following the same general concepts as the
9126 standard (see also Figure 7).

The five divisions of the SQuaRE standard:
(1) Quality management division (ISO 2500n)
(2) Quality model division (ISO 2501n)
(3) Quality measurement division (ISO 2502n)
(4) Quality requirements division (ISO 2503n)
(5) Quality evaluation division (ISO 2504n)

Fig. 7 SQuaRE standard

Nonetheless, research and practical work show that the
assessment of the quality of a software component is in
general a very broad and ambitious goal [11]. Recent research
also shows that these characteristics and sub-characteristics
cover a wide spectrum of system features and represent a
detailed model for evaluating any software system, as Abran
et al. [15] explain:

"…ISO 9126 series of standards …. even though it is not exhaustive,
this series constitutes the most extensive software quality model
developed to date. The approach of its quality model… is to represent
quality as a whole set of characteristics… This ISO standard includes
the user's view and introduces the concept of quality in use."

III. METHODOLOGY

"If you chase two rabbits, both will escape." Chinese proverb

The ISO 9126 quality characteristics and sub-characteristics
were used to evaluate the national ID system. Several
evaluation methods were employed in this investigation. The
prime sources of information for the evaluation study were:
1. information gathered from the test sessions that took
place during the acceptance of the project deliverables;
2. observation of the system environment (both at the
central operations and registration centres);
3. the author's own recorded experience as the Director of
the Central Operations sector and head of the technical
committee overseeing the implementation of the
programme.
In general, the evaluation was qualitative in nature. In
carrying out the evaluation and recording the findings, the
PRIDC system went through two types of testing: functional
and technical.

A. Functional Testing
This is application-level testing from a business and
operational perspective. It is conducted on a complete,
integrated system to evaluate the system's compliance with its
specified requirements. Often called black-box testing, this
type of test is generally performed by QA analysts who are
concerned with the predictability of the end-user experience.
During the deliverables acceptance, the national ID system
was tested with black-box testing procedures (which focus on
testing functional requirements and do not explicitly use
knowledge of the internal structure) as per the test plan
designed by the solution provider. The vendor allowed no
changes to the test plan, as they wanted to narrow the scope
of testing and limit it to the test cases they had developed.

B. Technical Testing
This is system-level testing. It tests the systems that support
or enable the functional applications to run. Under the general
perception of QA, COTS products are not seen as requiring
testing, but they do need to be audited for their configuration
and deployment set-up.
Generally, white-box testing (also called glass, structural,
open-box or clear-box testing) was considered here by the
technical team to test the design of the system. This approach
should allow a peek inside the 'box', as it focuses specifically
on using internal knowledge of the software to guide the
selection of test data. White-box testing requires the source
code to be produced before the tests can be planned, and it is
much more laborious to determine suitable input data and to
determine whether the software is or is not correct. It is worth
mentioning that a failure of a white-box test may result in a
change which requires all black-box testing to be repeated
and the white-box paths to be re-determined. For this obvious
reason, the vendor was always reluctant to initiate white-box
testing.
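The division of labour between the two testing styles can be illustrated
with a toy example. The function and its fee rules below are invented
purely for illustration and have no connection to the PRIDC test plan:
black-box cases are derived from the specification alone, while white-box
cases target the boundaries of the branches visible in the source code:

```python
def card_fee(age):
    # Toy specification: the card is free for minors and seniors,
    # otherwise it costs 100 (amounts invented for illustration).
    if age < 18 or age >= 60:
        return 0
    return 100

# Black-box tests: derived from the specification alone, without
# looking at the implementation.
assert card_fee(30) == 100
assert card_fee(10) == 0

# White-box tests: derived from the code's internal branches, so the
# boundary values of each condition are targeted explicitly.
assert card_fee(17) == 0    # just below the first boundary
assert card_fee(18) == 100  # on the first boundary
assert card_fee(59) == 100  # just below the second boundary
assert card_fee(60) == 0    # on the second boundary
```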
It must also be heeded that neither black-box nor white-box
testing can guarantee that the complete specifications have
been implemented and that all parts of the implementation
have been tested. To fully test a software product, both black-
and white-box testing are required. While black-box testing
was limited by the test plan documents provided by the
vendor, white-box testing was not possible to perform since
the source code had still not been handed over to the client at
the time of writing this study. However, all the architectural
components of the national ID sub-systems which were
selected and assembled from COTS products were assessed
and audited for their configuration and deployment set-up.
Having addressed the evaluation methods, the following
sections describe the details of the work carried out in this
study.

IV. PRIDC SYSTEM AS A COMPONENT-BASED SYSTEM

"For more than a decade good software development practice has
been based on a "divide and conquer" approach to software design
and implementation. Whether they are called "modules", "packages",
"units", or "computer software configuration items", the approach
has been to decompose a software system into manageable
components based on maximizing cohesion within a component and
minimizing coupling among components." (Brown and Wallnau,
1996, p.414)

A. What is component-based software development (CBD)?
Component-based software development (CBD) is an
emerging discipline that promises to take software
engineering into a new era [16]. Building on the achievements
of object-oriented software construction, it aims to move
software engineering from a cottage industry into an
industrial age for Information Technology, wherein software
can be assembled from components in the manner that
hardware systems are currently constructed from kits of parts
(ibid).
Component-based software development (CBSD) shifts the
development emphasis from programming software to
composing software systems, as it embodies the 'buy, don't
build' philosophy espoused by [17] (see also Figure 8). The
concept is also referred to in the current literature as
component-based software engineering (CBSE) [18],[19]. It
principally focuses on building large software systems by
integrating different software components, enhancing the
overall flexibility and maintainability of the systems. If
implemented appropriately, the approach is argued to have
the potential to reduce software development costs, assemble
systems rapidly, and reduce the spiralling maintenance burden
associated with the support and upgrade of large systems [20].
[21] define component-based software development as an
approach "based on the idea to develop software systems by
selecting appropriate off-the-shelf components and then to
assemble them with a well-defined software architecture."
They state that a component has three main features:
1. it is an independent and replaceable part of a system that
fulfils a clear function;
2. it works in the context of a well-defined architecture;
3. it communicates with other components through its
interface.

Fig. 8 Component-based software development (components are
selected from a repository and assembled into a software system)

According to [22], two main advances are raising the
profile of software components as the basic building blocks
of software (see also: [16],[23],[24],[25],[26],[27]):
(1) the object-oriented development approach, which is
based on the development of an application system through
the extension of existing libraries of self-contained operating
units; and
(2) the economic reality that large-scale software
development must take greater advantage of existing
commercial software, reducing the amount of new code that
is required for each application.
The component-based development approach introduces
fundamental changes in the way systems are acquired,
integrated, deployed and evolved. Unlike the classic waterfall
approach to software development, component-based systems
are designed by examining existing components to see how
they meet the system requirements, followed by an iterative
process of refining requirements to integrate with the existing
components to provide the necessary functionality [22].

B. Component-based software development lifecycle
The component life cycle is similar to the life cycle of
typical applications, except in the implementation and
acquisition phases, where the two life cycles differ. The life
cycle of component-based software systems can be
summarised as follows:
1. Requirement analysis: a process of defining and
understanding the activities that the information system is
meant to support;
2. Software architecture design: a process of developing
detailed descriptions for the information system;
3. Component identification and customisation
(implementation): a process of formalising the design in an
executable way by acquiring complete applications or
components through purchase, outsourcing, in-house
development, component leasing, etc.;
4. System integration: a process of adjusting the system to
fit the existing information system architecture. This can
include tasks such as adjusting components and applications
to their specific software surroundings;
5. System testing: a process of identifying and eliminating
undesirable effects and errors and verifying the information
system. This can include both user-acceptance and
application-integration tests;
6. Software maintenance: a process of keeping the
integrated information system up and running. This can
include tasks such as upgrading and replacing applications
and components in the information system. It also includes
performing consecutive revisions of the integrated
information system.
Having briefly highlighted some background information
about the concept of component-based development and its
lifecycle, the next section takes a snapshot of the PRIDC
system and maps it to component-based software.

C. PRIDC system development life cycle
Broadly speaking, the development of the PRIDC system
can be described as having incorporated the following two
approaches:
1. the development of a uniquely tailored information
system (population register) to enable the registration of the
population into the system in accordance with the pre-defined
business requirements, and
2. the integration of several application/hardware packages
to achieve the desired functionality requirements, e.g.,
biometrics, PKI and smart cards.
For the purpose of benchmarking the PRIDC system
development lifecycle, this study has adopted a framework
proposed by [28] for quality assurance of the component-
based software development paradigm. The framework
contains eight phases relating to components and systems that
provide better control over the quality of software
development activities and processes:
1. Component requirement analysis.
2. Component development.
3. Component certification.
4. Component customisation.
5. System architecture design.
6. System integration.
7. System testing.
8. System maintenance.
The details of this benchmarking are presented in the
following table.

TABLE I
COMPARISON OF PRIDC SYSTEM LIFECYCLE WITH COMPONENT-BASED
SOFTWARE APPROACH

1. Component requirement analysis: the process of
discovering, understanding, documenting, validating and
managing the requirements for the component.
PRIDC life cycle: Category A. Remarks: covered by all the
PRIDC project Lots which are part of the collection and
analysis of user requirements; the functional requirements of
the applications were designed on this basis.

2. Component development: the process of implementing
the requirements for a well-functioning, high-quality
component with multiple interfaces.
PRIDC life cycle: Category B. Remarks: this phase is an
internal process happening within the solution provider
boundary.

3. Component certification: the process that involves
1) component outsourcing, 2) component selection and
3) component testing.
PRIDC life cycle: Category B. Remarks: this phase is an
internal process happening within the solution provider
boundary. Emirates ID may request this certification if it
exists.

4. Component customisation: the process that involves
1) modifying the component for the specific requirement,
2) making the changes necessary to run the component on a
special platform, and 3) upgrading the specific component for
better performance or higher quality.
PRIDC life cycle: Category B. Remarks: this phase is an
internal process happening within the solution provider
boundary.

5. System architecture design: the process of evaluating,
selecting and creating the software architecture of a
component-based system.
PRIDC life cycle: Category B. Remarks: this phase is an
internal process happening within the solution provider
boundary.

6. System integration: the process of assembling the
selected components into a whole system under the designed
system architecture.
PRIDC life cycle: Category B. Remarks: this phase is an
internal process happening within the solution provider
boundary.

7. System testing: the process of evaluating a system to
1) confirm that the system satisfies the specified requirements
and 2) identify and correct defects in the system
implementation.
PRIDC life cycle: Category B and Category C (Lot 12 and
Lot 3). Remarks: the solution provider must have their own
framework for testing their product (such as code testing and
unit testing), but this task has also been performed as part of
the Category C project Lots (sub-systems installation,
commissioning and testing).

8. System maintenance: the process of providing the service
and maintenance activities needed to use the software
effectively after it has been delivered.
PRIDC life cycle: no Lot of the PRIDC project matched this
phase. Remarks: this is one of the major phases missing from
the PRIDC system life cycle, and one of the serious
drawbacks of the PRIDC project contract.
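The three component features discussed in this section (an independent,
replaceable unit; a well-defined architecture; communication through an
interface) can be sketched in code. The example is illustrative only:
the interface and class names are invented and do not describe the
actual PRIDC components:

```python
from abc import ABC, abstractmethod

class BiometricMatcher(ABC):
    """The interface: the only thing other components may depend on."""
    @abstractmethod
    def match(self, sample: bytes, template: bytes) -> bool: ...

class VendorMatcher(BiometricMatcher):
    """A replaceable implementation: it can be swapped (e.g. for a
    different COTS product) without touching the components that use
    it, as long as the interface is honoured."""
    def match(self, sample, template):
        return sample == template  # toy matching logic for illustration

def enrol(matcher: BiometricMatcher, sample, template):
    # This component communicates with the matcher only via its
    # interface, never via the concrete class.
    return "duplicate" if matcher.match(sample, template) else "enrolled"

assert enrol(VendorMatcher(), b"f1", b"f1") == "duplicate"
assert enrol(VendorMatcher(), b"f1", b"f2") == "enrolled"
```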
ISO Standard for System Implementation
In 1987, ISO and the IEC (International Electrotechnical
Commission) established a Joint Technical Committee (JTC1)
on Information Technology. In June 1989, JTC1 initiated the
development of ISO/IEC 12207, on software life cycle
processes, to fill a critical need. The standard was published
on August 1, 1995.

Fig. 9 ISO 12207 standard

A comparison of the PRIDC system with the ISO standard
The PRIDC system's life cycle is currently based on the
project implementation phases. The project implementation is
being executed Lot-wise, as framed in the contract. A
comparative study of the PRIDC system's life cycle against
the ISO 12207 standard is presented below:
TABLE II
COMPARISON OF PRIDC SYSTEM WITH ISO 12207 STANDARD
No ISO 12207 PRIDC System Life cycle
1 Primary life cycle processes
1.1 Acquisition process All lots of Category A.
1.2 Supply process All lots of Category D.
1.3 Development process All lot of Category B.
1.3.1 Process implementation All lots of Category C.
1.3.2 System requirement analysis All lots of Category A.
1.3.3 System architectural design All lots of Category B.
1.3.4 Software requirement analysis The solution provider internal process.
1.3.5 Software architectural design The solution provider internal process.
1.3.6 Software detail design The solution provider internal process.
1.3.7 Software coding and testing The solution provider's internal process.
EIDA performed a few such tests.
1.3.8 Software integration The solution provider internal process.
1.3.9 Software Qualification testing The solution provider internal process.
1.3.10 System integration The solution provider internal process.
1.3.11 Software qualification testing The solution provider internal process.
1.3.12 Software installation The solution provider internal process.
1.3.13 Software acceptance support Lot 12 and Lot 3 testing.
4 Operation process Needs to be defined.
5 Maintenance process Needs to be defined.
6 Supporting life cycle processes
6.1 Documentation process Done as part of the Project deliverables.
6.2 Configuration management process Done as part of the Project deliverables.
6.3 Quality assurance process Not done as part of the Project contract.
6.4 Verification process Can be considered covered by the Lot 3 system compliance test.
6.5 Validation process Can be considered covered by the Lot 12 system compliance test.
6.6 Joint review process Can be considered covered by the programme management meetings.
6.7 Audit process Needs to be performed.
6.8 Problem resolution process Needs to be performed.
7 Organizational life cycle processes
7.1 Management process EIDA needs to perform.
7.2 Infrastructure process EIDA needs to perform.
7.3 Improvement process EIDA needs to perform.
7.4 Training process Done as part of Lot 3 (Admin Training).
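A mapping such as Table II also lends itself to a simple gap report.
The sketch below is illustrative only: the process subset and the
done/gap status labels are our simplification of the table, not part of
ISO 12207:

```python
# Status of a few ISO 12207 supporting processes as recorded in
# Table II ("done"/"gap" labels are an illustrative simplification).
processes = {
    "Documentation": "done",
    "Configuration management": "done",
    "Quality assurance": "gap",
    "Audit": "gap",
    "Problem resolution": "gap",
    "Training": "done",
}

# Collect the processes that still need attention.
gaps = [name for name, status in processes.items() if status == "gap"]
print(f"{len(gaps)} of {len(processes)} processes need attention: {gaps}")
```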
D. ISO 9126 and PRIDC mapping
Following is a summary of the evaluation results as per the ISO 9126 quality attributes.
Functionality: the degree of existence of a set of functions that
satisfy stated or implied stakeholder/business needs, and their
properties. Overall, in terms of the number of changes requested on the
system, as illustrated in Table 1.10, there were more than 213
modification items, in the form of 53 change requests (23 major
changes), passed to the vendor to implement. These were the results of
the first functional test of the first version of the PRIDC system. This
is a significant amount of modification, and it clearly implies that
there was a big gap during the system requirement analysis and capture
phase.

Suitability (can the software perform the tasks required? the degree of
presence of a set of functions for specified tasks, i.e. fitness for
purpose): checked against specifications and feedback from registration
centres.

Accuracy (is the result as expected? the degree of provision of right or
agreed results or effects): checked against specifications. Test cases
were developed by the vendor's test team. Besides, there were many other
cases that were not tested for accuracy but were encountered later,
after the release of the software.

Interoperability (can the system interact with another system? the
degree to which the software is able to interact with specified systems,
i.e. physical devices): checked against specifications. However, the
system was designed as a closed architecture, so interoperability with
future systems was seen as a big concern.
Security (does the software prevent unauthorised access? a set of
regulations for maintaining a certain level of security; the degree to
which the software is able to prevent unauthorised access, whether
accidental or deliberate, to programs and data, i.e. login functions,
encryption of personal data, etc.): checked against specifications and
in accordance with the Information Security Policy.
The PRIDC system is a critical system for the country, thus important
security features were incorporated into the system to ensure high
confidentiality, integrity and authenticity of the data. The security is
built around the following main rules:
• strong authentication of the operators (each end-user uses both a
password and a fingerprint to log on to the system),
• network security using a Virtual Private Network (VPN), a
Demilitarised Zone (DMZ) and Secure Socket Layer (SSL) over Hyper
Text Transfer Protocol (HTTP),
• strong physical protection of the Central, Disaster Recovery and
Service Point Local Area Networks (LANs).
The security scheme was implemented at four levels: 1) application
level, 2) network level, 3) system level, 4) physical level. The
security features carried out at each of these levels drew on a wide
range of international security standards and measures: X.509 v3
certificates, X.500 directory, LDAP v2 and v3, DES, 3DES, RC2, RC4 and
AES ciphering algorithms (used by the Cisco VPN), RSA (PKCS#1)
signature algorithms, MD2, MD5 and SHA-1 digests, Diffie-Hellman and
RSA key exchange algorithms, PKCS#12, PKCS#7, PKCS#10, IPsec and IKE.

Compliance (the degree to which the software adheres to
application-related standards, conventions or regulations in laws and
similar prescriptions): checked against specifications.
Reliability: the capability of the software to maintain its level of
performance under stated conditions for a stated period of time
(assessed based on the number of failures encountered per release).

Maturity (have most of the faults in the software been eliminated over
time? the frequency of failure by faults in the software): looking at
the number of sub-versions of the PRIDC system released (i.e., ver 1.5,
ver 1.6 and ver 3.0), as depicted in Table 1.7, the evolution of these
unplanned (i.e., not previously specified) versions of the system
signifies the immaturity of the system in terms of business requirements
and needs. At the time of carrying out this evaluation, the software was
still seen to require further modifications before the system could be
finally accepted.

Fault tolerance (is the software capable of handling errors? the ability
to maintain a specified level of performance in cases of software faults
or of infringement of its specified interface; the property that enables
a system to continue operating properly in the event of the failure of
some of its components): although the system had a centralised
architecture, its architecture allowed the different systems to continue
operation in cases of failure of the central system, through replication
and redundant systems.

Recoverability (can the software resume working and restore lost data
after a failure? the capability of the software to re-establish its
level of performance and recover the data directly affected in case of a
failure): databases were continuously replicated on the Disaster
Recovery site. The system ensured that no more than one hour of work
would be lost following a database crash/failure. However, in case of a
major disaster that would lead to the loss of the operational capacity
of the main data centre, the PRIDC system was planned to be restarted
within 24 hours.
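The recovery targets stated above amount to a recovery point objective
(RPO) of one hour and a recovery time objective (RTO) of 24 hours. A
minimal replica-freshness check under those figures might look as
follows (the function and the timestamps are illustrative, not taken
from the PRIDC system):

```python
from datetime import datetime, timedelta

RPO = timedelta(hours=1)   # max tolerable data loss (from the spec above)
RTO = timedelta(hours=24)  # max tolerable downtime (from the spec above)

def replication_ok(last_replicated: datetime, now: datetime) -> bool:
    """True if the replica is fresh enough to honour the 1-hour RPO."""
    return now - last_replicated <= RPO

# Illustrative timestamps only.
now = datetime(2007, 3, 27, 12, 0)
assert replication_ok(datetime(2007, 3, 27, 11, 30), now)     # 30 min behind
assert not replication_ok(datetime(2007, 3, 27, 10, 0), now)  # 2 h behind
```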
Usability: the effort needed for use by a stated or implied set of
users.

Understandability (does the user comprehend how to use the system
easily? evaluates the attributes of software that bear on the users'
effort for recognising the underlying concept of the software; this
effort can be decreased by the existence of demonstrations): usability
testing uncovered many difficulties, such as operators having difficulty
understanding the system interface, business logic and processes. With
the lack of an on-line help function, the GUI of the system did not seem
to follow any clear standard, and operators resorted to guessing what
different buttons might mean. For example, two registration centre
operators deleted the files of all the applicants registered in one day
when they pressed the 'Abort' button to cancel an operation, because the
system performed a 'Delete' action on the 'Abort' button. In general,
interface functions (e.g., menus, controls) were not easy to understand.

Learnability (can the user learn to use the system easily? evaluates the
attributes of software that bear on the user's effort for learning how
to use the software): user documentation and help were not complete at
the time of carrying out this evaluation. The system was not easy to
learn, as users had to repeat the training sessions many times, and the
number of data entry errors rose once post-audit procedures for data
quality checking were implemented.

Operability (can the user use the system without much effort? evaluates
the attributes of software that bear on the users' effort for operation
and operation control, e.g. function keys, mouse support, shortcuts,
etc.): the interface actions and elements were sometimes found to be
inconsistent; error messages were not clear, led to more confusion, and
resulted in operators guessing and attempting to rectify problems, which
in turn led to deeper problems, as the system was not designed to handle
user play-around cases (i.e., to handle unexpected errors). Some
important functions, such as deletion, were being performed without a
confirmation prompt.
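One operability safeguard whose absence the 'Abort'/'Delete' incident
above illustrates is an explicit confirmation step before destructive
actions. A minimal sketch of such a guard (the function name and its
messages are invented for illustration and are not from the PRIDC
software):

```python
def delete_applicant(record_id, confirm):
    """Refuse destructive actions unless explicitly confirmed.
    `confirm` is the operator's answer to an unambiguous prompt such
    as 'Permanently delete record 42? (yes/no)'."""
    if confirm.strip().lower() != "yes":
        return f"record {record_id} NOT deleted"
    return f"record {record_id} deleted"

# An accidental 'Abort' press no longer destroys data:
assert delete_applicant(42, "Abort") == "record 42 NOT deleted"
assert delete_applicant(42, "yes") == "record 42 deleted"
```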
Attractiveness (does the interface look good? evaluates how attractive
the interface is to the user): the system design, screen layout and
colours were not very appealing.

Efficiency: have functions been optimised for speed? Have repeatedly
used blocks of code been formed into sub-routines?

Time behaviour (how quickly does the system respond? evaluates the time
it takes for an operation to complete; the software's response and
processing times and throughput rates in performing its function): to be
checked against specifications. However, full testing of this
characteristic was not possible at the time of carrying out this study,
since the daily enrolment throughput was only around 1,200 people a day,
with the same figures for card production.
From a database capacity viewpoint, the PRIDC system was dimensioned
to manage the records of 5 million persons, whereas the throughput of
the system was specified as follows:
• allows for up to 7,000 enrolments per day;
• able to produce up to 7,000 ID cards per day;
• the biometric sub-system is able to perform up to 7,000 person
identification (TP/TP) searches per day.
The processing operations were designed as follows:
• new enrolment: 20 minutes
• card collection: 3.5 minutes
• card renewal: 8 minutes
• PR functions: 8.5 minutes
• civil investigation: 11 minutes
• biometric subsystem: within 22 hours

Resource utilisation (does the system utilise resources efficiently? the
process of making code as efficient as possible; the amount of resources
used and the duration of such use in performing the software's
function): this task was not possible at the time of carrying out the
evaluation, since the source code had still not been handed over to the
client.
Maintainability: a set of attributes that bear on the effort needed to make specified modifications.

Analysability (can faults be easily diagnosed?): the effort needed for diagnosis of deficiencies or causes of failure, or for identification of parts to be modified.
Finding: During system installation and with the release of the software (and also during business operations), undocumented defects and deficiencies were discovered by the users of the software. The encountered faults were very difficult to analyse and diagnose, even for the vendor's technical team, and software deficiencies usually took a long time to fix, as problems were usually passed to the development team in France for investigation and response.

Changeability (can the software be easily modified?): the effort needed for modification, fault removal or environmental change.
Finding: The system architecture was so complex that the phrase 'change to the system' meant a nightmare to the vendor, who tried to avoid changes at all times with the justification: 'the system, in its current form, allows you to enrol the population and produce ID cards for them'. The client's concern was that the software in its current version opened the door to many problems, from user entry errors to incomplete business functions that had not been captured during the requirement specification phase. It is also worth mentioning that changes to the system, when agreed, took very long to implement; for example, adding a field (job title) to the system took one month of work, accompanied by an astonishing bill.

Stability (can the software continue functioning if changes are made?): the risk of unexpected effects of modifications.
Finding: As mentioned above, the system's complex architecture implied that a change in one place would almost certainly affect many parts of the system; a change in one part of the system would normally cause unexpected effects as a result of the modification.
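The ripple effect described here is, in essence, a reachability question over the component dependency graph: everything that transitively depends on a changed component may be affected. The sketch below illustrates this kind of change-impact estimate; the component names and edges are hypothetical placeholders, not the actual PRIDC architecture.

```python
from collections import deque

# Hypothetical dependency graph: each key maps a component to the
# components that directly depend on it (its reverse dependencies).
dependents = {
    "database": ["enrolment", "card_production", "biometrics"],
    "enrolment": ["card_production"],
    "biometrics": ["civil_investigation"],
    "card_production": [],
    "civil_investigation": [],
}

def impact_set(changed: str) -> set:
    """Breadth-first traversal collecting every component that
    transitively depends on the changed one."""
    affected, queue = set(), deque([changed])
    while queue:
        component = queue.popleft()
        for dep in dependents.get(component, []):
            if dep not in affected:
                affected.add(dep)
                queue.append(dep)
    return affected

print(sorted(impact_set("database")))
# → every other component is potentially affected by a database change
```

With black-box components, as in the evaluated system, the edges of such a graph are largely unknown to the client, which is precisely why each modification required long (and costly) impact analysis by the vendor.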
International Journal of Social Sciences Volume 1 Number 2
Testability (can the software be tested easily?): the effort needed for validating the modified software.
Finding: In general, system (business) processes and functions were tested against specifications. From a technical perspective, however, the complex architecture of the system made it impossible to test many areas of the software. The vendor was pushing for the system to be accepted from a functional perspective (including the network setup).

Portability: a set of attributes that bear on the ability of software to be transferred from one environment to another.

Adaptability (can the software be moved to other environments?): the software's opportunity for adaptation to different environments (e.g. other hardware/OS platforms).
Finding: The software was designed and coded to operate within a unique environment of databases, operating systems and hardware. Most of the hardware used proprietary APIs (Application Programming Interfaces) to interface with the system, which automatically locked the system to the specified set of hardware and no other.

Installability (can the software be installed easily?): the effort needed to install the software in a specified environment.
Finding: Though installation files and guides were available, the software architecture was not clear at all, and all attempts made by the technical members to install the system failed. Despite several requests, the vendor insisted that the system should not be installed by anyone other than the vendor himself.

Co-existence (does the software comply with portability standards?): conformance is the degree to which the software adheres to standards or conventions related to portability.
Finding: The system did not comply with any portability standards other than the vendor's own.

Replaceability (does the software easily replace other software?): the opportunity and effort of using the software in the place of specified older software.
Finding: The PRIDC software was expected to take over the current population register database maintained as part of the immigration system in the Ministry of Interior. However, this was a long-term objective; the software needed to go through several revisions before it could achieve it.
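The row-by-row findings above lend themselves to a simple quantitative roll-up, as ISO 9126-style assessments often do. The sketch below shows one way to aggregate sub-characteristic ratings into a weighted score per characteristic; the ratings and weights are hypothetical placeholders, not the study's actual scores.

```python
# Hypothetical ratings (0 = poor, 1 = good) and weights for each
# ISO 9126 sub-characteristic; weights express assumed importance.
findings = {
    "maintainability": {
        "analysability": (0.2, 0.3),   # (rating, weight)
        "changeability": (0.1, 0.3),
        "stability":     (0.2, 0.2),
        "testability":   (0.3, 0.2),
    },
    "portability": {
        "adaptability":   (0.2, 0.3),
        "installability": (0.1, 0.3),
        "conformance":    (0.0, 0.2),
        "replaceability": (0.3, 0.2),
    },
}

def characteristic_score(subs: dict) -> float:
    """Weighted average of the sub-characteristic ratings."""
    total_weight = sum(w for _, w in subs.values())
    return sum(r * w for r, w in subs.values()) / total_weight

for name, subs in findings.items():
    print(f"{name}: {characteristic_score(subs):.2f}")
```

A roll-up of this kind makes the qualitative table comparable across characteristics and across software releases, which is the practical point of using a quality model rather than ad hoc judgement.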
ABOUT THE AUTHOR
Ali M. Al-Khouri has been involved in the UAE national ID card project since its early developments as a member of the technical executive steering committee, and was later appointed head of the technical committee when Emirates Identity Authority (a federal government organisation formed to oversee the management and implementation of the national ID card system rollout in the United Arab Emirates) was established. He received his Bachelor's and Master's degrees in Business IT Management with honours and distinction from Manchester and Lancaster Universities in the UK, and is currently pursuing his doctorate degree in the field of engineering management and advanced technologies. His research interests are in leadership & management, e-government and the applications of advanced technologies in large contexts. Email: alkhouri@emiratesid.ae
V. REFLECTION

"Some problems are so complex that you have to be highly intelligent and well-informed just to be undecided about them."
Laurence J. Peter

Many researchers and practitioners argue that measurement is an essential issue in project and process management and improvement, on the logic that it is not possible to control what is not understood, and not possible to scientifically understand what is not measured [4]. Using measurement practices may well increase the rate of project success [30]. In the real world, however, this may be argued to be valid at the organisation level rather than at the individual project level. Projects usually have very short-term strategies with very tight deadlines and tend to be the result of opportunistic behaviour, where applying such measurement strategies may not be seen to add value, bearing in mind the time and cost associated with such measurement and analysis activities.

As the UAE national ID system will become the most critical system in the country, serving as the main central hub for population identity cross-checking and service eligibility (i.e., online with a 24/7 availability requirement), it is important that the software undergoes a thorough quality check. Given the component-based (CBS) nature of the system, some components were viewed as more critical and as warranting particularly thorough quality checks, since a failure in different software components could lead to everything from public frustration to complete chaos once the card becomes 'the means' of accessing services.

The evaluation study carried out here attempted to provide a short but thorough overview of the PRIDC system and to measure the system's quality against the ISO 9126 standard. The many deficiencies encountered will greatly help the project team identify what to address before the final acceptance of the system from the vendor.

From the evaluation, the system was found to have been developed as a component-based software system but, most importantly, was observed to be a closed system. This closed architecture, although promised to work as prescribed in the specification documents, was viewed as likely to cause the following major drawbacks in the short and long run:
1. the system supported only a few hardware vendors, which was seen to make the system lose a certain amount of autonomy and to make it acquire additional dependencies when integrating COTS components;
2. system evolution was not a simple plug-and-play approach: replacing one component typically had rippling effects throughout the system, especially where many of the components in the system were black-box components; and
3. the system architecture forced the client to return again and again to the original vendor for additional functionality or capacity.

The closed architecture, with the different proprietary platforms it incorporated, was altogether more likely to slow down the pace of organisational business and process excellence, as changes to the system would be expensive and extremely difficult to maintain. The literature has not been kind to closed system architectures, as research shows that such systems have proven too slow and too expensive to meet rapidly changing market needs, and that they restrict the level of quality that can be achieved [30],[31],[32],[33]. However, some vendors and service providers strongly advocate standardised systems via closed architectures. Their argument is that such architectures are necessary to their system standardisation efforts, that the openness of the component-based approach leads to a chaos of choices and integration headaches, and that such architectures address 'security' needs.

Moreover, over the long-term life of a system, additional challenges may well arise, including the insertion of COTS components that correspond to new functionality, and "consolidation engineering", wherein several components may be replaced by one "integrated" component. Following are further reflections on the major ISO 9126 quality attributes:

A. Functionality
The functionality factors were mainly checked against the system specification documents. However, it was discovered on the release of the first version of the software that many different business functions were not covered in the specifications, resulting in the need for subsequent releases to address and fill the operational gaps. The software version evaluated in this study was not in an acceptable state, as it required additional enhancements to cover some of the additional business functions and to rectify identified deficiencies, errors and bugs. It is also worth mentioning that the overemphasis on security requirements during the specification phase contributed greatly to the high complexity of the overall system and complicated its interoperability with other sub-systems.

The fundamental problem in software development is to understand the customer's sometimes unspoken needs and requirements and to translate these into a tangible software solution. The literature shows that one of the principal causes of information system failure is when the designed system fails to capture the business requirements or to improve organisational performance. Researchers argue that such failures occur because many organisations tend to use rules of thumb and rely on previous experience [35]. The vendor adopted the waterfall system development approach when it came to user requirements analysis and system implementation. The vendor was reluctant to make any modification to the developed system and presented a high cost impact for each change to it, even if it was a change to modify labels of text fields on user screens. The
common response of the vendor was that 'the system is developed according to the agreed specification, and any deviation from that is likely to have a cost impact.'

This attitude of the vendor opened the door to long discussion meetings and arguments, and slowed down the progress of the project, as changes were parked for long periods, with some buried and lost in the huge project documents and long meeting minutes. However, system functionality is a temporary matter that can be resolved once attended to. The most critical items that needed to be addressed along with the functionality concerns were the areas discussed next.

B. Reliability
Software reliability is the probability that a software system will not cause the failure of the system for a specified time under specified conditions. The different tests carried out during the deliverables acceptance relied on systematic software testing strategies, techniques and processes, and on software inspection and review against specifications. During this study, however, it was found very useful to incorporate less systematic testing approaches to explore the ability of the system to perform under adverse conditions.

C. Usability
The software appeared to have many usability problems, as system users struggled to understand system processes and functions; minimal user documentation was available, and it did not cover the areas users needed most. Extensive training was required to educate the users on the system, and much effort was required from the registration centre supervisors to support them. The system needed to go through a major review to evaluate its usability, and to be enhanced to follow a standard GUI methodology overall.

D. Efficiency
System processes and functions were checked against the times indicated in the specifications from a functional perspective. Nonetheless, code review was not possible because the source code had not been handed over to the client at the time of carrying out this evaluation. Overall, the technical team had concerns about the capability of the system to provide acceptable performance in terms of speed and resource usage.

E. Maintainability
The complex architecture of the system made the analysis and diagnosis of discovered system faults, and their maintenance, very difficult, as problems were usually passed to the development team in another country for investigation and the preparation of bug-fix patches. The complex architecture also acted as a huge barrier to making urgent changes to the system, since long analysis was required to evaluate the impact on the different components of the software, coupled with the unrealistic cost of implementing such changes claimed by the vendor.

F. Portability
The system had many proprietary APIs to interface with the different components of the system, locking the system to a specified set of hardware and no other. Installation files and guides did not enable the reinstallation of the system. Overall, the system was observed not to comply with any portability standards other than the vendor's own, which could be applied only by the vendor himself. The vendor was asked to add APIs to the system to allow the plug-in of new components, both data- and hardware-wise.

VI. CONCLUSION

"You don't drown by falling in the water; you drown by staying there."
Edwin Louis Cole

As widely quoted in the literature, the application of software metrics has proven to be an effective technique for improving the quality of software and the productivity of the development process; i.e., the use of a software metrics program will assist in assessing, monitoring and identifying improvement actions for achieving quality goals (see for example: [3],[4],[5],[6],[8],[9],[12],[14],[29],[35],[36],[37]). In this study, the author attempted to use the ISO 9126 quality model to evaluate the PRIDC system, mainly from a product quality angle. See also Figure 10.

Fig. 10 Software quality metrics framework. Source: [37]

The study presented in this paper contributed to a great extent in spotting some of the system deficiencies, which were addressed prior to the final acceptance and handover of the system. It was also the author's observation that the project team, with the workload and responsibilities put on them, seemed to be overloaded and to have a scattered vision of how things should be done and achieved. Everybody wanted the project to conclude as quickly as possible, and everybody also seemed confident of the work produced by the vendor. The use of the quality framework shown in this study can be a very
useful and supportive methodological approach for going about software quality assessment. The ISO 9126 framework can act as a comprehensive analytical tool, as it can move beyond superficial evaluation to achieve a more thorough view of the system's strengths and weaknesses than can be provided by less systematic approaches.

When implementing big projects such as national ID schemes, project management and technical teams should use quality models for evaluating the overall architecture prior to the final acceptance of the system. If used as a guide in an early stage of the project, such a model can arguably provide a basis for informed and rational decision making and has the potential to increase the project success rate.

From a technical viewpoint, the ISO software quality metrics may also be extended throughout the phases of the software development life cycle. The framework is designed to address the wide range of quality characteristics of software products and processes, enabling a better description of software quality aspects and their importance.

ACKNOWLEDGMENT
The author would like to thank Mr. Naorem Nilkumar for his contribution and the technical input that improved the overall work presented in this paper.

REFERENCES
[1] A.M. Al-Khouri, "UAE National ID Programme Case Study," International Journal of Social Sciences, vol. 1, no. 2, pp. 62-69, 2007.
[2] E. Folmer & J. Bosch, "A Pattern Framework for Software Quality Assessment and Tradeoff Analysis," International Journal of Software Engineering and Knowledge Engineering, 2006. [Online]. Available: http://www.eelke.com/research/literature/SQTRF.pdf.
[3] S.N. Bhatti, "Why Quality? ISO 9126 Software Quality Metrics (Functionality) Support by UML," ACM SIGSOFT Software Engineering Notes, vol. 30, no. 2, 2005.
[4] E.J. Garrity & G.L. Sanders, "Introduction to Information Systems Success Measurement," in E.J. Garrity & G.L. Sanders (eds.), Information System Success Measurement. Idea Group Publishing, pp. 1-11, 1998.
[5] R.B. Grady, "Practical results from measuring software quality," Communications of the ACM, vol. 36, no. 11, pp. 63-68, 1993.
[6] S. Hastie, "Software Quality: the missing X-Factor," Wellington, New Zealand: Software Education, 2002. [Online]. Available: http://softed.com/Resources/WhitePapers/SoftQual_XF-actor.aspx.
[7] S.L. Pfleeger, Software Engineering: Theory & Practice. Upper Saddle River, New Jersey: Prentice Hall, 2001.
[8] R.A. Martin & L.H. Shafer, "Providing a Framework for Effective Software Quality Assessment: Making a Science of Risk Assessment," 6th Annual International Symposium of the International Council on Systems Engineering (INCOSE), Systems Engineering: Practices and Tools, Bedford, Massachusetts, 1996. [Online]. Available: http://www.mitre.org/work/tech_transfer/pdf/risk_assessment.pdf.
[9] B.W. Boehm, J.R. Brown, H. Kaspar, M. Lipow, G.J. MacLeod & M.J. Merritt, "Characteristics of Software Quality," TRW Software Series TRW-SS-73-09, December 1973.
[10] J.A. McCall, P.K. Richards & G.F. Walters, "Factors in Software Quality," volumes I, II and III, US Rome Air Development Center Reports NTIS AD/A-049 014, NTIS AD/A-049 015 and NTIS AD/A-049 016, U.S. Department of Commerce, 1977.
[11] M.F. Bertoa, J.M. Troya & A. Vallecillo, "Measuring the Usability of Software Components," Journal of Systems and Software, vol. 79, no. 3, pp. 427-439, 2006.
[12] S. Valenti, A. Cucchiarelli & M. Panti, "Computer Based Assessment Systems Evaluation via the ISO 9126 Quality Model," Journal of Information Technology Education, vol. 1, no. 3, pp. 157-175, 2002.
[13] R. Black, "Quality Risk Analysis," USA: Rex Black Consulting Services, 2003. [Online]. Available: http://www.rexblackconsulting.com/publications/Quality%20Risk%20Analysis1.pdf.
[14] G.G. Schulmeyer & J.I. McManus, The Handbook of Software Quality Assurance (3rd edition). Upper Saddle River, New Jersey: Prentice Hall, 1999.
[15] A. Abran, E.R. Al-Qutaish, J.M. Desharnais & N. Habra, "An Information Model for Software Quality Measurement with ISO Standards," in SWEDC-REK, International Conference on Software Development, Reykjavik, Iceland: University of Iceland, pp. 104-116, 2005.
[16] K.-K. Lau (ed.), Component-based Software Development: Case Studies. World Scientific (Series on Component-Based Software Development), vol. 1, 2004.
[17] F.P. Brooks, "No Silver Bullet: Essence and Accidents of Software Engineering," Computer, vol. 20, no. 4, pp. 10-19, 1987.
[18] A.W. Brown, "Preface: Foundations for Component-Based Software Engineering," in Component-Based Software Engineering: Selected Papers from the Software Engineering Institute. Los Alamitos, CA: IEEE Computer Society Press, pp. vii-x, 1996.
[19] A. Brown & K. Wallnau, "Engineering of Component-Based Systems," in Proceedings of the Second International IEEE Conference on Engineering of Complex Computer Systems, Montreal, Canada, 1996.
[20] C. Szyperski, Component Software: Beyond Object-Oriented Programming. New York, NY: Addison-Wesley, 1997.
[21] X. Cai, M.R. Lyu & K. Wong, "Component-Based Software Engineering: Technologies, Development Frameworks and Quality Assurance Schemes," in Proceedings of APSEC 2000, Seventh Asia-Pacific Software Engineering Conference, Singapore, December 2000, pp. 372-379. [Online]. Available: http://www.cse.cuhk.edu.hk/~lyu/paper_pdf/apsec.pdf.
[22] A.W. Brown & K.C. Wallnau, "The Current State of CBSE," IEEE Software, vol. 15, no. 5, pp. 37-46, 1998.
[23] M. Kirtland, Designing Component-Based Applications. Redmond, Washington: Microsoft Press, 1999.
[24] G.T. Heineman & W.T. Councill (eds.), Component-Based Software Engineering: Putting the Pieces Together. Boston, MA: Addison-Wesley, 2001.
[25] G.T. Leavens & M. Sitaraman, Foundations of Component-Based Systems. New York: Cambridge University Press, 2000.
[26] R. Richardson, "Components Battling Component," Byte, vol. 22, no. 11, 1997.
[27] R. Veryard, The Component-Based Business: Plug and Play. London: Springer-Verlag, 2001.
[28] G. Pour, "Component-Based Software Development Approach: New Opportunities and Challenges," in Proceedings of Technology of Object-Oriented Languages, TOOLS 26, pp. 375-383, 1998.
[29] N.S. Godbole, Software Quality Assurance: Principles and Practice. Oxford, UK: Alpha Science International, 2004.
[30] L. Bass, P. Clements & R. Kazman, Software Architecture in Practice. Reading, MA: Addison-Wesley, 1998.
[31] J. Bosch, Design and Use of Software Architectures: Adopting and Evolving a Product Line Approach. Harlow: Pearson Education (Addison-Wesley and ACM Press), 2000.
[32] F. Buschmann, R. Meunier, H. Rohnert, P. Sommerlad & M. Stal, Pattern-Oriented Software Architecture: A System of Patterns. New York: John Wiley and Sons Ltd, 1996.
[33] M. Shaw & D. Garlan, Software Architecture: Perspectives on an Emerging Discipline. New Jersey: Prentice Hall, 1996.
[34] P.B. Crosby, Quality Is Free: The Art of Making Quality Certain. New York: McGraw-Hill, 1979.
[35] R.G. Dromey, "A Model for Software Product Quality," IEEE Transactions on Software Engineering, vol. 21, no. 2, pp. 146-162, 1995.
[36] T.J. McCabe, "A Complexity Measure," IEEE Transactions on Software Engineering, vol. SE-2, no. 4, pp. 308-320, 1976.
[37] K.H. Möller & D.J. Paulish, Software Metrics. London: Chapman & Hall Computing, 1993.