The document discusses approaches for reengineering web applications. It proposes using a unified V-model approach to reinforce web application development through reengineering. Specifically, it discusses:
1) Using reverse engineering to analyze existing web applications and recover designs, followed by forward engineering to restructure the applications based on new requirements.
2) Applying the V-model at each phase of the web development process during reengineering to bring methodological rigor to the process.
3) The reengineering process involves reverse engineering, transformations to adapt to new technologies/requirements, and forward engineering to implement the new design.
This document summarizes a research paper on software architecture reconstruction methods. It discusses how software architectures can drift over time from the original design due to changes and deviations. Architecture reconstruction is used to recover the original architecture by applying reverse engineering techniques. The document reviews different bottom-up, top-down, and hybrid methods for architecture reconstruction, including tools like ARMIN and Rigi. It also defines key terms related to architecture reconstruction and the challenges of architectural aging, erosion, drift, and mismatch.
Requirements Analysis and Design in the Context of Various Software Developme...zillesubhan
This document provides a comparative analysis of requirements analysis and design phases between traditional and agile software development approaches. It discusses the importance of requirements analysis and outlines the key stages in a traditional software development lifecycle, including requirements analysis, system design, coding, testing, and maintenance. The document also examines requirements engineering processes and sources of requirements. It describes the goals and importance of software design as a key phase for implementing requirements and allowing flexibility for changes.
A SURVEY ON ACCURACY OF REQUIREMENT TRACEABILITY LINKS DURING SOFTWARE DEVELO...ijiert bestjournal
A number of routing protocols have been proposed for data transmission in WSNs. Initially, single-path routing schemes with several variations were proposed, but they had drawbacks: single-path routing could not provide reliability and high throughput, and the security level was not considered during routing. Recently, to remove these drawbacks, a new technique called multipath routing has been proposed. In this paper we discuss the different multipath routing protocols and their variants. Multipath routing was initially proposed to guarantee packet delivery to the sink in case of link or node failure; other protocols have been proposed for reliability, energy saving, security, and high throughput. Some multipath routing protocols also address load balancing and security during packet transmission.
VTrace-A Tool for Visualizing Traceability Links among Software Artefacts for...journalBEEI
Traceability Management plays a key role in tracing the life of a requirement through all the specifications produced during the development phase of a software project. A lack of traceability information not only hinders the understanding of the system but also will prove to be a bottleneck in the future maintenance of the system. Projects that maintain traceability information during the development stages somehow fail to upgrade their artefacts or maintain traceability among the different versions of the artefacts that are produced during the maintenance phase. As a result the software artefacts lose the trustworthiness and engineers mostly work from the source code for impact analysis. The goal of our research is on understanding the impact of visualizing traceability links on change management tasks for an evolving system. As part of our research we have implemented a Traceability Visualization Tool-VTrace that manages software artefacts and also enables the visualization of traceability links. The results of our controlled experiment show that subjects who used the tool were more accurate and faster on change management tasks than subjects that didn’t use the tool.
Contributors to Reduce Maintainability Cost at the Software Implementation PhaseWaqas Tariq
This document discusses factors that can reduce software maintenance costs during the implementation phase. It identifies that maintenance costs are highest during software development phases. The objective is to define criteria to assess software quality characteristics and assist during implementation. This will help reduce maintenance costs by creating criteria groups to support writing standard code, developing a model to apply criteria, and increasing understandability. Student groups will study code standardization, write programs, and test software maintenance on programs to validate the model and proposed criteria.
David vernon software_engineering_notesmitthudwivedi
This document provides an overview of the Software Engineering 2 course, including its aims, objectives, course contents, and recommended textbooks. The course aims to provide knowledge of techniques for estimating, designing, building, and ensuring quality in software projects. The objectives cover understanding software metrics, estimating project costs and schedules, quality assurance attributes and standards, and software analysis and design techniques. The course content includes topics like software metrics, estimation models, quality assurance, and object-oriented analysis and design. The document also summarizes several software engineering process models and risk management approaches.
A methodology to evaluate object oriented software systems using change requi...ijseajournal
It is well known that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change-requirement-traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues relate to change impact algorithms and inheritance of functionality.
IMPLEMENTATION OF MOSRE FRAMEWORK FOR A WEB APPLICATION - A CASE STUDYijwscjournal
The Security Engineering discipline has become more and more important in recent years. Security requirements engineering is essential to assure the quality of the resulting software. An increasing part of the communication and sharing of information in our society uses Web applications, and the last two years have seen a significant surge in the number of Web-application-specific vulnerabilities disclosed to the public; despite its importance for Web-based systems, security requirements engineering is still underestimated. The integration of Web and object technologies offers a foundation for expanding the Web to a new generation of applications. In this paper, we outline our proposed Model-Oriented Security Requirements Engineering (MOSRE) Framework for Web applications. By applying object-oriented technologies and modeling to the security requirements phase, the completeness, consistency, traceability, and reusability of security requirements can be cost-effectively improved. We implemented our MOSRE Framework for an e-voting application, and a set of security requirements was identified.
A PROCESS QUALITY IMPROVEMENT MECHANISM FOR REDUCING THE RISK OF CI ENVIRONMENTijcsit
In this age of fast evolution, software development projects must meet many challenges from unpredicted requirements changes and new technology environments. Software development processes should be adjustable and extendable to meet the multifaceted needs of users. Iterative and Incremental Development (IID) is a practical approach to overcoming the various challenges of software development. However, continuous testing and building of new versions demand more time and human resources, which is a major obstacle of IID. In addition, iterative operations require sound communication; a lack of standard version control and communication practices often leads to the failure of software projects. A high-quality Continuous Integration (CI) environment can effectively make up for the defects of IID. In this paper, the CI environment and its advantages are surveyed in depth. To overcome the defects of IID, a CI environment needs to combine well-defined procedures with qualified tools, thereby concretely enhancing its quality. Based on a process quality measurement model, this paper proposes the Process Quality Improvement Mechanism (PQIM). By applying PQIM in software development, process problems and CI environment quality defects can be identified in a timely manner and revised to reduce the risk of the CI environment.
Relational Analysis of Software Developer’s Quality AssuresIOSR Journals
This document discusses relational analysis of software developer quality and measures. It begins by introducing the importance of software architecture and development models in ensuring project success. It then discusses measuring processes, products, and resources in software engineering. Internal attributes like size and complexity can be measured from products alone, while external attributes like reliability require executing the code. The research aims to measure internal attributes of the process. It outlines different types of process and product metrics used to measure properties and quality. Finally, it discusses specific defect and lines-of-code metrics used during implementation to estimate defects and code size.
ITERATIVE AND INCREMENTAL DEVELOPMENT ANALYSIS STUDY OF VOCATIONAL CAREER INF...ijseajournal
The software development process presents various types of models, each with corresponding phases that must be followed to deliver quality products and projects. Despite the expertise and skills of systems analysts, designers, and programmers, system failure is inevitable when a suitable development process model is not followed. This paper focuses on the Iterative and Incremental Development (IID) model and justifies its role in the analysis and design of software systems. The paper adopted a qualitative research approach that harnessed the relevance of IID in the context of systems analysis and design, using the Vocational Career Information System (VCIS) as a case study. The paper views IID as a change-driven software development process model. The results show system specifications, functional specifications, and design specifications that can be used in implementing the VCIS with the IID model. Thus, the paper concludes that in systems analysis and design it is imperative to adopt a suitable development process that reflects the engineering mindset, with heavy emphasis on good analysis and design for quality assurance.
This document discusses key concepts in software architecture including:
- The core activities of software architecture are architectural analysis, synthesis (design), evaluation, and evolution.
- Important supporting activities are knowledge management, design reasoning, documentation, and architecture description.
- Common views for documenting architecture include logical, process, physical, development, and use case views.
- Architectural styles and patterns provide reusable solutions for common architectural problems.
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE...IJNSA Journal
The document discusses software development methodologies used by the UK government, specifically comparing the traditional Waterfall methodology to the more modern Agile Scrum methodology. It notes that while Agile has been adopted for development, the accreditation process still follows Waterfall, creating delays. The document then proposes a security framework based on OWASP's Application Security Verification Standard that could allow secure development within Agile sprints and provide assurance for accreditors.
This document summarizes a research paper about reengineering PDF documents containing complex software specifications into multilayer hypertext interfaces. The paper proposes extracting the logical structure and text from PDFs, transforming them into XML, and generating multiple interconnected HTML pages. It describes techniques for extracting figures, tables, lists and concepts to produce navigable outputs that improve on original PDFs and HTML conversions. The framework is evaluated on its usability and architecture with the goal of future work expanding its capabilities to other document formats.
An integrated security testing framework and toolMoutasm Tamimi
The document presents an integrated security testing framework for the secure software development life cycle (SSDLC). The framework includes four main phases: 1) defining security guidelines based on enterprise security requirements for each SSDLC phase, 2) constructing security test cases based on the guidelines, 3) executing test cases by integrating various security testing tools, and 4) converging results from different tools using a meta-vulnerability data model. The framework aims to adopt security activities into each SSDLC phase to improve security, generate test cases, integrate testing tools, and provide accurate results. It was evaluated through prototype testing of 50 software projects.
Cognitive Approach Towards the Maintenance of Web-Sites Through Quality Evalu...Waqas Tariq
It is a well-established fact that Web applications require frequent maintenance because of cutting-edge business competition. The authors have worked on quality evaluation of websites in the Indian e-commerce domain and, as a result of that work, have produced a quality-wise ranking of these sites. According to their work, and surveys done by various other groups, the Futurebazaar website is considered one of the best Indian e-shopping sites. In this research paper the authors assess the maintenance of the same site by incorporating the problems incurred during that evaluation. This exercise presents a real-world website maintainability problem and gives a clear picture of all the quality metrics that are directly or indirectly related to website maintainability.
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI...ijaia
Predicting the time to build software is a very complex task for software engineering managers. Complex factors can directly interfere with the productivity of the development team, and factors related to the complexity of the system to be developed drastically change the time necessary to complete the work in software factories. This work proposes a hybrid system based on artificial neural networks and fuzzy systems to assist in the construction of a rule-based expert system that supports the prediction of hours destined to software development according to the complexity of its elements. The set of fuzzy rules obtained by the system helps the management and control of software development by providing a base of interpretable, rule-based estimates. The model was tested on a real database, and its results were promising for the construction of an aid mechanism for predicting software construction effort.
The document provides an overview of the objectives and requirements for developing an online information portal for a college. The key points are:
- The objective is to develop a unified portal using open source tools to replace existing separate systems for managing notices, events and other college information.
- The proposed system aims to overcome limitations of the existing systems like lack of a unified interface and platform independence.
- Technical, operational, economic and timeline feasibility studies were conducted and the project was found to be feasible in all aspects.
- System requirements like hardware, software, functional, performance, security and maintainability requirements are specified to guide the development of the proposed online information portal.
A reliability estimation framework for the OO design complexity perspective is developed in this paper. The proposed framework correlates object-oriented design constructs with complexity, and complexity with reliability. No framework in the literature estimates the software reliability of an OO design by taking complexity into consideration; this framework bridges the gap between object-oriented design constructs, complexity, and reliability. It measures and minimizes the complexity of a software design at an early stage of the software development life cycle, leading to a reliable end product. Reliability and complexity estimation models are proposed following the framework: the complexity estimation model takes OO design constructs into consideration, and the reliability estimation models take complexity into consideration when estimating the reliability of an OO design.
The document discusses cloud testing and literature related to web accessibility testing. It outlines the need for cloud testing due to limitations of traditional testing approaches. Literature on integrating web accessibility into testing processes and challenges of cloud testing is reviewed. The document proposes that accessibility and testing be integrated earlier in the development cycle to avoid costly retrofitting. It identifies researching technical competencies for accessibility testing and building a new testing framework to address challenges as potential future work.
EReeRisk- EFFICIENT RISK IMPACT MEASUREMENT TOOL FOR REENGINEERING PROCESS OF...ijpla
EReeRisk (Efficient Reengineering Risk) is a risk impact measurement tool which automatically identifies and measures the impact of the various risk components involved in the reengineering process of a legacy software system. EReeRisk takes data directly from users of the legacy system and establishes various risk measurement metrics according to the different risk measurement schemes of the ReeRisk framework [1]. Furthermore, EReeRisk presents a variety of statistical quantities that help project management decide when evolution of a legacy system through reengineering is likely to succeed. Its enhanced user interface greatly simplifies the risk assessment procedures and reduces usage time. The tool can perform the following tasks to support decisions concerning the selection of reengineering as a system evolution strategy.
SECURING SOFTWARE DEVELOPMENT STAGES USING ASPECT-ORIENTATION CONCEPTSijseajournal
The document summarizes research on securing software development stages using aspect-orientation concepts. It proposes a model called the Aspect-Oriented Software Security Development Life Cycle (AOSSDLC) which incorporates security activities into each stage of the software development life cycle. The model aims to efficiently integrate security as a cross-cutting concern using aspect orientation. It is concluded that aspect orientation allows security features to be installed without changing the existing software structure, providing benefits over other approaches.
Software Metrics for Identifying Software Size in Software Development ProjectsVishvi Vidanapathirana
This paper defines the best software metrics that can be used to determine the size of software in the current information technology (IT) industry.
This document provides an overview of software architecture. It defines software architecture as the set of structures needed to reason about a computing system, including elements, relations among them, and their properties. Good architecture is important as poor design decisions can lead to project cancellation. It also discusses the differences between architecture and design. Additionally, it describes why documenting architecture is important to allow stakeholders to use it effectively. Finally, it briefly introduces the Model-View-Controller pattern used in web development to separate user interface, data, and application logic.
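The Model-View-Controller separation mentioned above can be illustrated with a minimal sketch (all class and method names here are invented for this example, not taken from the document):

```python
class Model:
    """Holds application data; knows nothing about presentation."""
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)


class View:
    """Renders data for the user; knows nothing about business rules."""
    def render(self, items):
        return ", ".join(items)


class Controller:
    """Mediates between user input, the model, and the view."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_add(self, item):
        # Application logic: update the model, then ask the view to render.
        self.model.add(item)
        return self.view.render(self.model.items)


controller = Controller(Model(), View())
print(controller.handle_add("notice"))  # -> "notice"
print(controller.handle_add("event"))   # -> "notice, event"
```

The point of the separation is that the view can be swapped (say, for an HTML template) without touching the model, and vice versa.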
AN INVESTIGATION OF SOFTWARE REQUIREMENTS PRACTICES AMONG SOFTWARE PRACTITION...ijseajournal
This paper presents the results of a survey of software requirements practices among software practitioners in the city of Jeddah, Saudi Arabia. As software requirements are important and contribute to the success of a software development project, it is interesting to investigate the current software requirements practices in the Kingdom of Saudi Arabia. As initial work, a survey was conducted in Jeddah before extending the study to the software industry of the entire Kingdom. The survey was conducted by distributing a questionnaire to software practitioners; 17 respondents completed the questionnaire out of 50 distributed, a 34% response rate. The results of this survey are promising and show that the requirements management area should be a focus for future improvement. In the future, the survey will cover software engineering and requirements engineering practices across the software industry of the entire Kingdom of Saudi Arabia.
This document describes a VSC-HVDC transmission system connecting a strong AC system to a weak island AC system. It investigates the AC filter requirements and designs controllers for the VSC using optimal techniques. The controllers designed are an active power controller and a reactive power controller for the rectifier, and a DC voltage controller and an AC voltage controller for the inverter. Simulations are performed in MATLAB to analyze the system's performance under load variations in the island system.
This document describes a Matlab program developed for soil classification according to the AASHTO soil classification system. The program classifies soil samples based on inputted liquid limit, plasticity index, and percentage passing the #200 sieve values. It outputs the AASHTO soil type classification as well as additional information like group index and general subgrade rating. The program was tested on sample soils from textbooks and correctly classified them. It provides an easier way to classify soils than manually using charts. The program aims to reduce errors, save time, and provide a user-friendly soil classification tool for engineers.
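The core of such a classification program is the AASHTO group index. A simplified Python rendering of the standard formula is sketched below (this is not the paper's own Matlab code; the clamping limits follow the usual AASHTO convention of bounding each factor and reporting negative results as zero):

```python
def group_index(f200, ll, pi):
    """AASHTO group index:
    GI = (F-35)[0.2 + 0.005(LL-40)] + 0.01(F-15)(PI-10)
    where F is % passing the #200 sieve, LL the liquid limit,
    and PI the plasticity index. Each factor is clamped to its
    conventional range and the result rounded to a whole number."""
    a = min(max(f200 - 35, 0), 40)   # (F - 35), limited to 0..40
    b = min(max(ll - 40, 0), 20)     # (LL - 40), limited to 0..20
    c = min(max(f200 - 15, 0), 40)   # (F - 15), limited to 0..40
    d = min(max(pi - 10, 0), 20)     # (PI - 10), limited to 0..20
    gi = a * (0.2 + 0.005 * b) + 0.01 * c * d
    return round(gi)


# A silty-clay sample: F = 60%, LL = 50, PI = 25 -> GI = 12
print(group_index(60, 50, 25))
# A granular sample: F = 20%, LL = 30, PI = 5 -> GI = 0
print(group_index(20, 30, 5))
```

The group index is then appended to the A-group symbol, e.g. A-7-6(12), which is the kind of output the described program produces alongside the subgrade rating.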
The document proposes a method for grouping files and allocating jobs using server scheduling to balance load. It involves splitting a server into multiple sub-servers. The performance of client machines is analyzed based on factors like processing speed, bandwidth, and memory usage. Jobs are then assigned to sub-servers based on these performance analyses, with the goal of completing all tasks quickly. Once tasks are complete, files are distributed to the respective client machines. The proposed method aims to reduce the workload on servers and improve response times compared to the existing system.
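The assignment idea described above can be sketched as a greedy scheduler: each sub-server gets a capacity score from its measured resources, and each job goes to the sub-server whose load-to-capacity ratio would stay lowest. The weights and function names below are illustrative assumptions, not taken from the document:

```python
def capacity(speed, bandwidth, free_memory):
    # Weighted capacity score; the weights are arbitrary for illustration.
    return 0.5 * speed + 0.3 * bandwidth + 0.2 * free_memory


def assign(jobs, servers):
    """jobs: list of (name, cost); servers: dict name -> capacity score.
    Returns a dict mapping each job to a sub-server."""
    load = {s: 0.0 for s in servers}
    plan = {}
    for name, cost in sorted(jobs, key=lambda j: -j[1]):  # largest jobs first
        # Pick the sub-server with the smallest resulting load/capacity ratio.
        target = min(load, key=lambda s: (load[s] + cost) / servers[s])
        load[target] += cost
        plan[name] = target
    return plan


servers = {"A": capacity(16, 4, 8), "B": capacity(4, 2, 4)}
jobs = [("j1", 6.0), ("j2", 3.0), ("j3", 3.0)]
print(assign(jobs, servers))  # the faster sub-server A takes the larger share
```

Sorting jobs largest-first before the greedy pick is a standard heuristic that keeps the final loads close to proportional to capacity.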
This document describes an experimental and numerical study of a steel bridge model through vibration testing. Sensors were used to measure vibrations from the model when excited by a moving car model. Data was acquired and analyzed using LabVIEW software to determine modal frequencies and mode shapes. A finite element model of the bridge was also created in ANSYS software and results were compared. The goal was to evaluate vibration properties of the bridge and test sensor technology for structural health monitoring.
A PROCESS QUALITY IMPROVEMENT MECHANISM FOR REDUCING THE RISK OF CI ENVIRONMENTijcsit
In an age of fast evolution, software development projects must cope with many challenges from unpredicted requirements changes and new technology environments. Software development processes should be adjustable and extendable to meet the multifaceted needs of users. Iterative and Incremental Development (IID) is a practical approach to overcoming these challenges. However, continuous testing and building of new versions consume considerable time and human resources, which is a major obstacle for IID. In addition, the iterative operations demand sound communication; a lack of standard version control and intercommunication practices often leads to the failure of software projects. A high-quality Continuous Integration (CI) environment can effectively make up for these defects of IID. In this paper, the CI environment and its advantages are surveyed in depth. To overcome the defects of IID, a CI environment needs to combine sound procedures with qualified tools, concretely enhancing its quality. Based on a process quality measurement model, this paper proposes the Process Quality Improvement Mechanism (PQIM). By applying PQIM during software development, process problems and CI-environment quality defects can be identified in a timely manner and revised to reduce the risk of the CI environment.
Relational Analysis of Software Developer’s Quality Assures (IOSR Journals)
This document discusses relational analysis of software developer quality and measures. It begins by introducing the importance of software architecture and development models in ensuring project success. It then discusses measuring processes, products, and resources in software engineering. Internal attributes like size and complexity can be measured from products alone, while external attributes like reliability require executing the code. The research aims to measure internal attributes of the process. It outlines different types of process and product metrics used to measure properties and quality. Finally, it discusses specific defect and lines of code metrics used during implementation to estimate defects and size code.
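The defect and lines-of-code metrics mentioned are commonly combined into the standard defect-density measure. A minimal sketch (the KLOC normalization is the conventional definition, not taken from the paper):

```python
def defect_density(defects_found, lines_of_code):
    """Defects per thousand lines of code (KLOC), a common product metric."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defects_found / (lines_of_code / 1000.0)
```

For example, 30 defects found in a 15,000-line module gives a density of 2.0 defects/KLOC, which can then be compared across modules or releases.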
ITERATIVE AND INCREMENTAL DEVELOPMENT ANALYSIS STUDY OF VOCATIONAL CAREER INF... (ijseajournal)
Software development processes present various types of models, with corresponding phases that must be followed to deliver quality products and projects. Despite the expertise and skills of systems analysts, designers, and programmers, system failure is inevitable when a suitable development process model is not followed. This paper focuses on the Iterative and Incremental Development (IID) model and justifies its role in the analysis and design of software systems. The paper adopts a qualitative research approach to justify and harness the relevance of IID in the context of systems analysis and design, using the Vocational Career Information System (VCIS) as a case study. The paper views IID as a change-driven software development process model. The results present system specifications, functional specifications, and design specifications that can be used in implementing the VCIS with the IID model. The paper concludes that in systems analysis and design it is imperative to choose a suitable development process that reflects the engineering mindset, with heavy emphasis on good analysis and design for quality assurance.
This document discusses key concepts in software architecture including:
- The core activities of software architecture are architectural analysis, synthesis (design), evaluation, and evolution.
- Important supporting activities are knowledge management, design reasoning, documentation, and architecture description.
- Common views for documenting architecture include logical, process, physical, development, and use case views.
- Architectural styles and patterns provide reusable solutions for common architectural problems.
A SECURITY EVALUATION FRAMEWORK FOR U.K. E-GOVERNMENT SERVICES AGILE SOFTWARE... (IJNSA Journal)
The document discusses software development methodologies used by the UK government, specifically comparing the traditional Waterfall methodology to the more modern Agile Scrum methodology. It notes that while Agile has been adopted for development, the accreditation process still follows Waterfall, creating delays. The document then proposes a security framework based on OWASP's Application Security Verification Standard that could allow secure development within Agile sprints and provide assurance for accreditors.
This document summarizes a research paper about reengineering PDF documents containing complex software specifications into multilayer hypertext interfaces. The paper proposes extracting the logical structure and text from PDFs, transforming them into XML, and generating multiple interconnected HTML pages. It describes techniques for extracting figures, tables, lists and concepts to produce navigable outputs that improve on original PDFs and HTML conversions. The framework is evaluated on its usability and architecture with the goal of future work expanding its capabilities to other document formats.
An integrated security testing framework and toolMoutasm Tamimi
The document presents an integrated security testing framework for the secure software development life cycle (SSDLC). The framework includes four main phases: 1) defining security guidelines based on enterprise security requirements for each SSDLC phase, 2) constructing security test cases based on the guidelines, 3) executing test cases by integrating various security testing tools, and 4) converging results from different tools using a meta-vulnerability data model. The framework aims to adopt security activities into each SSDLC phase to improve security, generate test cases, integrate testing tools, and provide accurate results. It was evaluated through prototype testing of 50 software projects.
Cognitive Approach Towards the Maintenance of Web-Sites Through Quality Evalu... (Waqas Tariq)
It is a well-established fact that web applications require frequent maintenance because of cutting-edge business competition. The authors have worked on quality evaluation of web sites in the Indian e-commerce domain and, as a result of that work, produced a quality-wise ranking of those sites. According to their work, and surveys done by various other groups, the Futurebazaar web site is considered one of the best Indian e-shopping sites. In this research paper the authors assess the maintenance of that site by incorporating the problems incurred during the evaluation. This exercise presents a real-world maintainability problem for web sites and gives a clear picture of all the quality metrics that are directly or indirectly related to web-site maintainability.
REGULARIZED FUZZY NEURAL NETWORKS TO AID EFFORT FORECASTING IN THE CONSTRUCTI... (ijaia)
Predicting the time to build software is a very complex task for software engineering managers. Complex factors can directly interfere with the productivity of the development team, and factors related to the complexity of the system to be developed drastically change the time necessary to complete work with software factories. This work proposes a hybrid system based on artificial neural networks and fuzzy systems to assist in building a rule-based expert system that supports predicting the hours needed to develop software according to the complexity of its elements. The set of fuzzy rules obtained by the system helps the management and control of software development by providing a base of interpretable, fuzzy-rule-based estimates. The model was tested on a real database, and its results were promising for building an aid mechanism for predicting software construction effort.
The document provides an overview of the objectives and requirements for developing an online information portal for a college. The key points are:
- The objective is to develop a unified portal using open source tools to replace existing separate systems for managing notices, events and other college information.
- The proposed system aims to overcome limitations of the existing systems like lack of a unified interface and platform independence.
- Technical, operational, economic and timeline feasibility studies were conducted and the project was found to be feasible in all aspects.
- System requirements like hardware, software, functional, performance, security and maintainability requirements are specified to guide the development of the proposed online information portal.
A reliability estimation framework for the OO design complexity perspective has been developed in this paper. The proposed framework correlates object-oriented design constructs with complexity, and complexity with reliability. No framework available in the literature estimates the software reliability of an OO design by taking complexity into consideration. The framework bridges the gap between object-oriented design constructs, complexity, and reliability: it measures and minimizes the complexity of a software design at an early stage of the software development life cycle, leading to a reliable end product. Reliability and complexity estimation models have been proposed following the framework: the complexity estimation model takes OO design constructs into consideration, and the reliability estimation models take complexity into consideration when estimating the reliability of an OO design.
The document discusses cloud testing and literature related to web accessibility testing. It outlines the need for cloud testing due to limitations of traditional testing approaches. Literature on integrating web accessibility into testing processes and challenges of cloud testing is reviewed. The document proposes that accessibility and testing be integrated earlier in the development cycle to avoid costly retrofitting. It identifies researching technical competencies for accessibility testing and building a new testing framework to address challenges as potential future work.
EReeRisk - EFFICIENT RISK IMPACT MEASUREMENT TOOL FOR REENGINEERING PROCESS OF... (ijpla)
EReeRisk (Efficient Reengineering Risk) is a risk impact measurement tool that automatically identifies and measures the impact of the various risk components involved in the reengineering process of a legacy software system. EReeRisk takes data directly from users of the legacy system and establishes various risk measurement metrics according to the different risk measurement schemes of the ReeRisk framework [1]. Furthermore, EReeRisk presents a variety of statistical quantities that let project management decide when evolution of a legacy system through reengineering will be successful. Its enhanced user interface greatly simplifies the risk assessment procedures and reduces the time they require. The tool can perform the following tasks to support decisions concerning the selection of reengineering as a system evolution strategy.
SECURING SOFTWARE DEVELOPMENT STAGES USING ASPECT-ORIENTATION CONCEPTS (ijseajournal)
The document summarizes research on securing software development stages using aspect-orientation concepts. It proposes a model called the Aspect-Oriented Software Security Development Life Cycle (AOSSDLC) which incorporates security activities into each stage of the software development life cycle. The model aims to efficiently integrate security as a cross-cutting concern using aspect orientation. It is concluded that aspect orientation allows security features to be installed without changing the existing software structure, providing benefits over other approaches.
Software Metrics for Identifying Software Size in Software Development Projects (Vishvi Vidanapathirana)
This paper identifies the best software metrics that can be used to define the size of software in the current information technology (IT) industry.
This document provides an overview of software architecture. It defines software architecture as the set of structures needed to reason about a computing system, including elements, relations among them, and their properties. Good architecture is important as poor design decisions can lead to project cancellation. It also discusses the differences between architecture and design. Additionally, it describes why documenting architecture is important to allow stakeholders to use it effectively. Finally, it briefly introduces the Model-View-Controller pattern used in web development to separate user interface, data, and application logic.
AN INVESTIGATION OF SOFTWARE REQUIREMENTS PRACTICES AMONG SOFTWARE PRACTITION... (ijseajournal)
This paper presents the results of a survey of software requirements practices among software practitioners in the city of Jeddah, Saudi Arabia. Since software requirements are important and lead to the success of a software development project, it is interesting to investigate current software requirements practices in the Kingdom of Saudi Arabia. As initial work, a survey was conducted in Jeddah before extending it across the Kingdom's software industry. The survey was conducted by distributing a questionnaire to software practitioners; 17 respondents completed the questionnaire out of 50 distributed, a 34% response rate. The results of this survey are promising and show that the requirements management area should be the focus of future improvement. In the future, the survey will cover software engineering and requirements engineering practices across the entire Kingdom of Saudi Arabia software industry.
This document describes a VSC-HVDC transmission system connecting a strong AC system to a weak island AC system. It investigates the AC filter requirements and designs controllers for the VSC using optimal techniques. The controllers designed are an active power controller and a reactive power controller for the rectifier, and a DC voltage controller and an AC voltage controller for the inverter. Simulations are performed in MATLAB to analyze the system's performance for load variations in the island system.
This document describes a Matlab program developed for soil classification according to the AASHTO soil classification system. The program classifies soil samples based on inputted liquid limit, plasticity index, and percentage passing the #200 sieve values. It outputs the AASHTO soil type classification as well as additional information like group index and general subgrade rating. The program was tested on sample soils from textbooks and correctly classified them. It provides an easier way to classify soils than manually using charts. The program aims to reduce errors, save time, and provide a user-friendly soil classification tool for engineers.
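The chart lookup that the program automates can be sketched for the fine-grained (silty-clay) groups. The following is a simplified subset of the AASHTO chart plus the usual group-index formula, written as a hedged illustration rather than the paper's Matlab code:

```python
def aashto_fine_grained(ll, pi, passing_200):
    """Classify a fine-grained soil (more than 35% passing the #200 sieve)
    into the AASHTO A-4..A-7 groups. Simplified subset of the full chart:
    ll = liquid limit, pi = plasticity index, passing_200 = % passing #200."""
    if passing_200 <= 35:
        return "granular (A-1/A-2/A-3): consult the full chart"
    if pi <= 10:
        return "A-4" if ll <= 40 else "A-5"
    return "A-6" if ll <= 40 else "A-7"

def group_index(ll, pi, passing_200):
    """AASHTO group index GI, rounded to the nearest whole number and
    floored at zero."""
    f = passing_200
    gi = (f - 35) * (0.2 + 0.005 * (ll - 40)) + 0.01 * (f - 15) * (pi - 10)
    return max(0, round(gi))
```

A soil with LL = 45, PI = 25, and 60% passing the #200 sieve classifies as A-7 with a group index of 12, reported as A-7(12) in the usual notation.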
This document analyzes the thermal characteristics of flared and rectangular fin profiles using finite element analysis. Solid models of the fin geometries were created in SolidWorks. Meshing was performed in ANSYS using tetrahedral and hexahedral elements. Boundary conditions were set up and the analysis was run to obtain temperature distributions and heat fluxes. Results for the different fin profiles were compared to determine the more efficient design. Prior research on heat transfer analysis of fins using finite element methods is also reviewed.
Android Management Redefined: An Enterprise Perspective (IOSR Journals)
This document discusses how enterprises can better manage Android devices. It begins by outlining the business requirements and challenges faced by enterprises in adopting Android, such as needing fine-grained policy management and secure user/data access. It then describes how to plan a custom Android solution through understanding requirements, choosing appropriate devices, preparing devices with necessary policies and customizations. A key recommendation is choosing purpose-built devices only for large-scale deployments with specific hardware needs, otherwise using consumer devices with accessories is more cost-effective.
This document describes a memristor device using a heterojunction of silver nanoparticles and aluminum oxide for resistive switching applications. The device consists of an aluminum-aluminum oxide-silver nanoparticles-aluminum structure. Current-voltage measurements show a transition between two states, with a resistance ratio of 10^5 for the major transition and a ratio of 10^1 for the minor transition. Scanning electron microscopy images confirm the growth of a thin aluminum oxide film consisting of spherical nanoparticles approximately 40 nm in size on an aluminum substrate. This memristor operates at low voltages and shows potential for non-volatile resistive random access memory.
This document discusses quality technical and vocational education and training (TVET) as a tool for self-reliance. It begins by defining TVET and identifying skills and knowledge as important for economic growth. The study aims to investigate factors that contribute to quality TVET for self-reliance, including student factors, school factors, instructional materials, and government strategies. A questionnaire was administered to 36 TVET teachers. The results found that students often lack background in TVET and confidence in it as a career. Schools have inadequate facilities, materials, and practical periods, and instructional materials like tools, textbooks, and ICT resources are also insufficient. The document recommends that governments prioritize TVET curriculum planning and ensure qualified teaching staff.
This document compares the main lobe and side lobes of the frequency response curves for different types of FIR filters designed using the Fourier Series Expansion Method. It analyzes low pass, high pass, and band pass FIR filters with sampling frequencies of 4000Hz, 8000Hz, 12000Hz, 16000Hz, and 20000Hz. The maximum magnitudes in the main lobe and side lobes are observed and compared for each case. The results show that as the sampling frequency increases, the response curves move closer to the ideal response curves with smaller deviations in the magnitudes of the different lobes.
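The Fourier Series Expansion Method referenced above amounts to truncating the ideal sinc impulse response with a rectangular window. A minimal sketch follows; the tap count and frequencies are illustrative, not taken from the paper:

```python
import math

def fir_lowpass(num_taps, cutoff_hz, fs_hz):
    """Lowpass FIR coefficients via the truncated Fourier series
    (rectangular window). The ideal impulse response
    sin(2*pi*fc*n/fs)/(pi*n) is shifted to the filter centre and
    truncated to num_taps coefficients."""
    m = (num_taps - 1) / 2.0            # centre of the symmetric filter
    fc = cutoff_hz / fs_hz              # normalised cutoff (cycles/sample)
    h = []
    for k in range(num_taps):
        n = k - m
        if n == 0:
            h.append(2.0 * fc)          # limit of sin(x)/x at x = 0
        else:
            h.append(math.sin(2.0 * math.pi * fc * n) / (math.pi * n))
    return h
```

The coefficients are symmetric (linear phase), and their sum, the DC gain, is close to 1 but oscillates with the tap count, which is the main-lobe/side-lobe (Gibbs) behaviour the comparison in the paper examines.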
This document describes a numerical method for designing a traditional aerospike nozzle contour using the method of characteristics. The method discretizes the characteristic and compatibility equations and applies boundary conditions to define a characteristic net. Points along the nozzle contour are determined by satisfying the stream function within the characteristic net, sweeping through the expansion fan in small increments of the Prandtl-Meyer expansion angle. The accuracy of the numerical method is verified by comparing the exit Mach number and area ratio to theoretical values and running CFD simulations with ANSYS-FLUENT. Good agreement is achieved when using small increments of the expansion angle.
This document describes an automated anti-theft and misuse alerting system for ATMs that uses face detection. The system uses a Raspberry Pi board running a face detection algorithm based on Haar classifiers to detect a user's face before opening the ATM door. Vibration sensors on the door and ATM trigger an alarm if forced entry is detected. The system is designed to prevent theft and attacks at ATMs by only granting access after verifying the user's identity and alerting authorities if intrusion is detected.
“Impact of Demographics and Personality traits on Confidence level: Determina... (IOSR Journals)
The purpose of this study is to explore the relationship among demographics, personality traits, and level of confidence. The impact of this paper is twofold: to measure the determinants of overconfidence in employees, and in students. The paper adopts a primary data approach, collecting data from employees and students through questionnaires. Two diverse populations were selected, and various statistical techniques (Pearson correlation, regression, Chi-square, and Kolmogorov-Smirnov tests) were used for analysis in SPSS on a sample size of 100. The findings show that in employees, overconfidence decreases as openness to experience increases, while the remaining personality traits (conscientiousness, agreeableness, and emotional stability) are correlated with overconfidence. In students there is no correlation between overconfidence and any of the personality traits. The regression analysis shows that no linear relationship exists between the independent and dependent variables in employees for individual personality traits, except for emotional stability; only emotional stability is a significant predictor of overconfidence among the five personality traits. Overall personality, however, is a significant predictor of overconfidence in employees. For students, neither individual personality traits nor overall personality has a linear relationship with overconfidence.
Determining Tax Literacy of Salaried Individuals - An Empirical Analysis (IOSR Journals)
In personal financial planning, tax management plays a very important role. An individual should have thorough knowledge of various aspects of taxes and tax policies, which helps him understand how much he can save even after paying taxes. Those who have not taken any formal course on taxation find it difficult to understand and comprehend issues related to the determination of tax liability, tax filing, and tax saving. An attempt has been made through this paper to determine the tax literacy level of salaried individuals based on various demographic and socio-economic factors. Findings of the study suggest that the overall tax literacy level of respondents is not very high and that the level of tax literacy varies significantly among respondents. Tax literacy is affected by gender, age, education, income, nature of employment, and place of work, but not by geographic region. The findings suggest that government should adopt more aggressive approaches to educate taxpayers, thereby raising their level of tax literacy.
Measurement of Efficiency Level in Nigerian Seaport after Reform Policy Imple... (IOSR Journals)
This paper focuses on the impact of reforms on port performance, using the Onne and Rivers ports as a reference point, and analyses the pre- and post-reform eras of the ports in terms of their performance. The reforms took effect from 1996, after the Federal Government of Nigeria concessioned the ports to private investors. Parameters such as ship traffic, cargo throughput, ship turn-round time, berth occupancy, and personnel were used as variables for the assessment. Secondary data were collected from the Nigerian Ports Authority and Integrated Logistic Services Nigeria (Intels) for the period 2001 to 2010 and analyzed using Data Envelopment Analysis to assess the efficiency of the ports. The analysis revealed a continuous improvement in the overall efficiency of both ports since 2006, when the new measure was introduced. Average ship turn-around time improved due to the modern, fast cargo-handling equipment and additional cargo-handling space that were provided. Ship traffic calling at the ports increased, resulting in increased cargo throughput and berth occupancy at Onne and Rivers. The reform also led to more private investment in the ports' existing and new facilities and the introduction of world-class service in port operations. The study concludes that the ports of Onne and Rivers are performing better under the reform programme of the Federal Government of Nigeria, and recommends the urgent need for a regulator to appraise the performance of the reform programme from time to time, as provided by the agreement, and the full adoption and utilization of a management information system (MIS) to aid performance efficiency.
Kinetic study of free and immobilized protease from Aspergillus sp. (IOSR Journals)
In the present investigation, partially purified alkaline protease from Aspergillus sp. strains As#6 and As#7 was entrapped in calcium alginate beads and characterized using casein as a substrate. The temperature and pH maxima of the protease from strain As#6 showed no changes after immobilization, remaining stable at 45 °C and pH 9, respectively; however, the Km value shifted slightly from 4.5 mg/ml to 5 mg/ml. Protease from strain As#7 showed a shift in pH optimum to a more alkaline range (10.0) compared with the free enzyme (9.0), and its optimum temperature shifted from 65 °C to 85 °C after immobilization. There was no significant effect on its Km value, but the Vmax of the immobilized protease from strain As#7 shifted from 200 U/ml to 370 U/ml. Immobilized protease from strain As#6 was reused for 3 cycles with a 22% loss in activity, whereas immobilized protease from strain As#7 was reused for 3 cycles with a 17% loss. Protease from strain As#7 has a higher affinity for the substrate and higher proteolytic activity than protease from strain As#6. The present work concludes that the Aspergillus As#7 strain may be a good source of industrial protease.
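The Km and Vmax figures quoted follow the standard Michaelis-Menten rate law, v = Vmax*S / (Km + S). A small sketch using the immobilized As#7 values from the abstract (Vmax = 370 U/ml, Km = 5 mg/ml):

```python
def mm_rate(s, vmax, km):
    """Michaelis-Menten reaction rate: v = Vmax * S / (Km + S),
    where s is the substrate concentration."""
    return vmax * s / (km + s)

# Immobilized As#7 protease, values as reported in the abstract.
v_half = mm_rate(5.0, vmax=370.0, km=5.0)   # at S = Km, v = Vmax / 2
```

By definition, at a substrate concentration equal to Km the rate is half of Vmax, which is what the entrapped-enzyme characterization measures.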
This document summarizes a paper that presents a novel method for determining the optimal location of Flexible AC Transmission System (FACTS) controllers in a multi-machine power system using a Fuzzy Controlled Genetic Algorithm (FCGA). The proposed algorithm aims to simultaneously optimize the location, type, and rated values of FACTS controllers while minimizing the overall system cost, which includes generation and investment costs. The algorithm is tested on IEEE 14-bus and 30-bus test systems, incorporating thyristor-controlled series compensator (TCSC) and unified power flow controller (UPFC) devices. Simulation results show the obtained solution is feasible and accurate for solving the optimal power flow problem.
Design And Analysis Of Chain Outer Link By Using Composite Material (IOSR Journals)
This document summarizes the design and analysis of using a composite material for the outer link of a roller conveyor chain. It begins with an introduction to roller conveyor chains and their applications. It then describes the design process for the original outer link made of carbon steel, including hand calculations to determine the link dimensions. Finite element analysis was conducted on both the original design and a modified design using glass fiber-epoxy composite material. The results showed that the composite material link weighed less and experienced lower stresses than the original design, indicating that using a composite material can improve the design of the chain outer link.
This document discusses the development of a web-based decision support system (DSS) for monitoring and predicting water quality parameters important for outdoor microalgae cultivation. The DSS allows users to monitor temperature and salinity in real-time and predicts future values of these parameters based on historical data using a K-nearest neighbors algorithm. Validation tests found the DSS could accurately forecast temperature and salinity 96.98% and 98.92% of the time respectively. The DSS notifies users when parameter levels rise or fall outside standard ranges and provides suggestions to maintain optimal water quality.
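The K-nearest-neighbors forecasting step can be sketched as window matching over the historical series: find the k past windows most similar to the most recent one and average the values that followed them. This is an illustrative version, not the DSS's implementation:

```python
def knn_forecast(history, window, k):
    """Predict the next value of a series with KNN regression:
    find the k past windows closest (Euclidean distance) to the latest
    window and average the values that immediately followed them."""
    query = history[-window:]
    candidates = []
    # Only windows that have a known successor value qualify as neighbours.
    for i in range(len(history) - window):
        past = history[i:i + window]
        dist = sum((a - b) ** 2 for a, b in zip(past, query)) ** 0.5
        candidates.append((dist, history[i + window]))
    candidates.sort(key=lambda t: t[0])
    nearest = candidates[:k]
    return sum(v for _, v in nearest) / len(nearest)
```

On a steadily rising series such as [1, 2, 3, 4, 5, 6, 7], the nearest past window to [6, 7] is [5, 6], whose successor 7 becomes the forecast; real temperature or salinity series would use larger windows and k > 1.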
This document presents a scalable method for image classification using sparse coding and dictionary learning. It proposes parallelizing the computation of image similarity for faster recognition. Specifically, it distributes the task of measuring similarity between images among multiple cores in a cluster. Experimental results on a face recognition dataset show nearly linear speedup when balancing the dataset size and number of nodes. Reconstruction errors are used as a similarity measure, with dictionaries learned using K-SVD for each image. The proposed parallel method distributes this similarity computation process to achieve faster image classification.
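Reconstruction error as a similarity measure can be sketched with a 1-sparse code against per-class dictionaries of unit-norm atoms; this is a simplified stand-in for the K-SVD dictionaries the paper learns, and the class names are illustrative:

```python
def recon_error(x, atoms):
    """Squared reconstruction error of signal x against a dictionary of
    unit-norm atoms, using only the single best atom (a 1-sparse code)."""
    best = float("inf")
    for d in atoms:
        coef = sum(a * b for a, b in zip(x, d))          # projection onto atom
        err = sum((a - coef * b) ** 2 for a, b in zip(x, d))
        best = min(best, err)
    return best

def classify(x, dictionaries):
    """Assign x to the class whose dictionary reconstructs it best."""
    return min(dictionaries, key=lambda c: recon_error(x, dictionaries[c]))
```

The parallel version in the paper distributes exactly this per-class error computation across cluster cores, since each class's reconstruction error is independent of the others.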
Role of Educational Qualification of Consumers on Need Recognition: A Study w... (IOSR Journals)
Demographic variables are the most popular bases for segmenting customer groups. One reason is that consumer needs, wants, preferences, and usage rates are often highly associated with demographic variables; another is that demographic variables are easier to measure than most other types of variables. Marketers are keenly interested in the size and growth rate of populations in different cities, regions, and nations; age distribution; educational levels; household patterns; and regional characteristics and movements, because it is on the basis of these measures that marketers formulate their marketing strategies to fulfil the needs, wants, and preferences of consumers. Moreover, demographic variables make known ongoing trends, such as shifts in age, sex, and income distribution, that signal new business opportunities to marketers, and demographic trends are highly reliable for the short and intermediate run. This paper, with strong backing from the literature, explains the role of consumers' educational qualification in recognizing a need for a car.
Software Engineering Process in Web Application Development (IOSR Journals)
This document discusses the software engineering process for developing web applications. It begins by noting that conventional software engineering models cannot be directly applied to web development due to the unique characteristics of web applications. The document then outlines the conventional software development process and various models used. It describes how web engineering adapts these conventional processes, using an incremental development approach better suited to evolving web requirements. The document identifies differences between traditional software and web engineering, and proposes a modified process model for web application development.
This document discusses the differences between conventional software engineering processes and web application development processes. It notes that web applications have large user bases, evolving requirements, and frequent changes, requiring an incremental development approach. The document outlines the typical phases of conventional software development (analysis, design, coding, testing, implementation) and various process models (waterfall, V-model, etc.). It then describes adaptations needed for web engineering, including incorporating analysis of content, interactions, functions and configurations. The design phase in web engineering includes aesthetic and navigation design elements not present in conventional models. Testing for web applications must also consider functionality, usability, interfaces, compatibility and security across various hardware/software environments.
Reliability Improvement with PSP of Web-Based Software Applications (CSEIJJournal)
In diverse industrial and academic environments, the quality of the software has been evaluated using different analytic studies. The contribution of the present work is focused on the development of a methodology in order to improve the evaluation and analysis of the reliability of web-based software applications. The Personal Software Process (PSP) was introduced in our methodology for improving the quality of the process and the product. The Evaluation + Improvement (Ei) process is performed in our methodology to evaluate and improve the quality of the software system. We tested our methodology in a web-based software system and used statistical modeling theory for the analysis and evaluation of the reliability. The behavior of the system under ideal conditions was evaluated and compared against the operation of the system executing under real conditions. The results obtained demonstrated the effectiveness and applicability of our methodology.
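As an illustration of the kind of statistical reliability evaluation the abstract describes, the sketch below estimates mean time between failures (MTBF) from a failure log and applies an exponential time-to-failure model. This is a minimal, generic example; the failure data and the exponential model are assumptions for illustration, not taken from the paper.

```python
import math

def mtbf(failure_times):
    """Mean time between failures from a sorted list of failure timestamps (hours)."""
    gaps = [b - a for a, b in zip(failure_times, failure_times[1:])]
    return sum(gaps) / len(gaps)

def reliability(t, mtbf_hours):
    """R(t) = exp(-t / MTBF), assuming an exponential time-to-failure model."""
    return math.exp(-t / mtbf_hours)

# Hypothetical failure log of a web application (hours since deployment).
failures = [120.0, 260.0, 420.0, 600.0]
m = mtbf(failures)            # (140 + 160 + 180) / 3 = 160.0 hours
r = reliability(24.0, m)      # probability of surviving the next 24 hours
```

Comparing `r` computed from an ideal-conditions log against one from a production log is one simple way to contrast the two operating regimes the abstract mentions.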
QUALITY ASSURANCE AND INTEGRATION TESTING ASPECTS IN WEB BASED APPLICATIONS (IJCSEA Journal)
Integration testing is one of the important phases in the software testing life cycle (STLC). With the fast growth of the internet and web services, web-based applications are also growing rapidly, and their importance and complexity are increasing. The heterogeneous and diverse nature of distributed components and applications, along with their multi-platform support and cooperativeness, makes these applications more complex and swiftly increasing in size. Quality assurance of these applications is becoming more crucial and important. Testing is one of the key processes to achieve and ensure the quality of these software or Web-based products. There are many testing challenges involved in Web-based applications, but integration is the most critical testing associated with them. A number of challenging factors are involved in integration testing efforts; these factors have an almost 70 to 80 percent impact on the overall quality of Web-based applications. In the software industry, practitioners use different kinds of testing approaches to solve the issues associated with integration, which arise from the ever-increasing complexity of Web-based applications.
This document provides an overview of Web Engineering. It discusses the need for Web Engineering due to the differences between developing web applications versus conventional software, including compressed development schedules, constant evolution, integrated content and code, and new technologies. The document also outlines a taxonomy of web applications from simple to advanced. Finally, it notes that web development requires a multidisciplinary approach drawing from fields like information science, multimedia, and human-computer interaction, necessitating the use of systematic and quantifiable approaches like those of Web Engineering.
APPLYING CONTINUOUS INTEGRATION FOR INCREASING THE MAINTENANCE QUALITY AND EF... (ijseajournal)
For project resource management and time control, a software system needs to be decomposed into subsystems, functional modules and basic components; finally, all tested components have to be integrated into the complete system. Applying the IID (Iterative Incremental Development) mechanism, the agile development model has become a practical method to reduce software project failure rates. Continuous integration (CI) is an IID implementation concept which can effectively reduce software development risk. Web apps, with their high rate of change, are well suited to the agile development model as a development and maintenance methodology. The paper surveys the CI operating environment and its advantages in depth. Introducing the CI concept can mitigate the moving-target problems that affect Web apps. To this end, the paper proposes a Continuous Integration based Web Applications Maintenance Procedure (CIWAMP) to assist system integration. Based on CI characteristics, CIWAMP enables Web apps to be deployed quickly, increases stakeholder communication frequency, improves staff morale, and effectively improves Web app maintenance quality and efficiency.
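The core CI discipline the abstract relies on, namely running build and test stages on every change and blocking deployment at the first failure, can be sketched in a few lines. This is a generic illustration, not the CIWAMP procedure itself; the stage names are hypothetical.

```python
def run_pipeline(stages):
    """Run CI stages in order; stop at the first failure.

    `stages` is a list of (name, callable) pairs; each callable returns
    True on success. Returns (deployed, log) with each stage's outcome.
    """
    log = []
    for name, step in stages:
        ok = step()
        log.append((name, "pass" if ok else "fail"))
        if not ok:
            return False, log          # broken build: do not deploy
    return True, log                   # all stages green: safe to deploy

# Hypothetical stages for a web app; real ones would invoke build/test tools.
stages = [
    ("checkout", lambda: True),
    ("unit tests", lambda: True),
    ("integration tests", lambda: True),
]
deployed, log = run_pipeline(stages)
```

Because every change passes through the same gate, integration problems surface while the change is still small, which is the "moving target" mitigation the paper argues for.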
DESQA a Software Quality Assurance Framework (IJERA Editor)
In current software development lifecycles of heterogeneous environments, the pitfall businesses face is that software defect tracking, measurement and quality assurance do not start early enough in the development process. In fact, the cost of fixing a defect in a production environment is much higher than in the initial phases of the Software Development Life Cycle (SDLC), which is particularly true for Service Oriented Architecture (SOA). Thus the aim of this study is to develop a new framework for defect tracking, detection and quality estimation in the early stages, particularly the design stage, of the SDLC. Part of the objectives of this work is to conceptualize, borrow and customize from known frameworks, such as object-oriented programming, to build a solid framework using automated rule-based intelligent mechanisms to detect and classify defects in the software design of SOA. The implementation part demonstrated how the framework can predict the quality level of the designed software. The results showed that a good level of quality estimation can be achieved based on the number of design attributes, the number of quality attributes and the number of SOA design defects. Assessment shows that metrics provide guidelines to indicate the progress a software system has made and the quality of its design. Using these guidelines, we can develop more usable and maintainable software systems to fulfill the demand for efficient software applications. Another valuable result of this study is that developers try to keep backwards compatibility when they introduce new functionality; sometimes they perform necessary breaking changes to those newly introduced elements in future versions. In that way they give their clients time to adapt their systems. This is a very valuable practice for developers because they have more time to assess the quality of their software before releasing it.
Other improvements in this research include investigation of other design attributes and SOA Design Defects which can be computed in extending the tests we performed.
A Review on Web Application Testing and its Current Research Directions (IJECEIAES)
Testing is an important part of every software development process, on which companies devote considerable time and effort. Burgeoning web applications and their proliferating economic significance in society have made web application testing an area of acute importance. Web applications generally tend toward faster and quicker release cycles, making their testing very challenging. The main issues in testing are cost efficiency and bug detection efficiency. Coverage-based testing is the process of ensuring the exercise of specific program elements. Coverage measurement helps determine the "thoroughness" of the testing achieved. An avalanche of tools, techniques and frameworks came into existence to ascertain the quality of web applications. A comparative study of some of the prominent tools, techniques and models for web application testing is presented. This work highlights the current research directions of some of the web application testing techniques.
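The coverage measurement idea mentioned above can be demonstrated with a toy line-coverage tracer built on Python's tracing hook. This is a minimal sketch of the concept, not any of the surveyed tools; `classify` is a made-up function under test.

```python
import sys

def measure_line_coverage(func, *args):
    """Record which source lines of `func` execute — a toy 'thoroughness' measure."""
    executed = set()
    code = func.__code__

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            executed.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return executed

def classify(n):               # hypothetical function under test
    if n < 0:
        return "negative"
    return "non-negative"

# A single test input only exercises one branch; coverage exposes the gap.
lines_neg = measure_line_coverage(classify, -1)
lines_pos = measure_line_coverage(classify, 1)
```

Comparing the two sets shows that neither input alone achieves full coverage, which is exactly the signal coverage-based testing uses to judge test-suite thoroughness.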
Constructing a software requirements specification and design for electronic ... (Ra'Fat Al-Msie'deen)
The requirements engineering process intends to obtain software services and constraints. This process is essential to meet the customer's needs and expectations, and it includes three main activities in general: detecting requirements by interacting with software stakeholders, transferring these requirements into a standard document, and examining that the requirements really define the software that the client needs. Functional requirements are services that the software should deliver to the end-user; they describe how the software should respond to specific inputs and how the software should behave in certain circumstances. This paper aims to develop a software requirements specification document for the electronic IT news magazine system. The electronic magazine allows users to post and view up-to-date IT news. Still, there is a lack in the literature of comprehensive studies about the construction of the electronic magazine software specification and design in conformance with contemporary software development processes. Moreover, there is a need for a suitable research framework to support the requirements engineering process. The novelty of this paper is the construction of the software specification and design of the electronic magazine by following the Al-Msie'deen research framework. All the documents of software requirements specification and design have been constructed to conform to the agile usage-centered design technique and the proposed research framework. A requirements specification and design are suggested and followed for the construction of the electronic magazine software. This study proved that involving users extensively in the process of software requirements specification and design will lead to the creation of dependable and acceptable software systems.
Mvc architecture driven design and agile implementation of a web based softwa... (ijseajournal)
This paper reports design and implementation of a web based software system for storing and managing information related to time management and productivity of employees working on a project. The system has been designed and implemented with best principles from model view controller and agile development. Such a system has practical use for any organization in terms of ease of use, efficiency, and cost savings. The manuscript describes the design of the system as well as its database and user interface. Detailed snapshots of the working system are provided too.
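The model-view-controller separation the paper builds on can be sketched in miniature for the time-tracking domain it describes. The class and method names below are illustrative assumptions, not the paper's actual design.

```python
class Model:
    """Holds time-tracking records (the data layer)."""
    def __init__(self):
        self.entries = []
    def add(self, employee, hours):
        self.entries.append({"employee": employee, "hours": hours})
    def total(self, employee):
        return sum(e["hours"] for e in self.entries if e["employee"] == employee)

class View:
    """Renders data for display (no business logic)."""
    @staticmethod
    def render(employee, hours):
        return f"{employee}: {hours}h logged"

class Controller:
    """Mediates between user actions, the model, and the view."""
    def __init__(self, model, view):
        self.model, self.view = model, view
    def log_time(self, employee, hours):
        self.model.add(employee, hours)
        return self.view.render(employee, self.model.total(employee))

app = Controller(Model(), View())
app.log_time("alice", 3)
print(app.log_time("alice", 2))   # alice: 5h logged
```

Keeping storage, rendering, and coordination in separate classes is what lets each concern evolve independently during agile iterations.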
This document outlines the key steps in the web development process: context analysis, architecture design, web page design, and web maintenance. It discusses each step in detail. Context analysis is about understanding requirements, users, and the environment. Architecture design determines technical components and how they are linked. Web page design focuses on user needs and usability. Web maintenance includes updating content and systems over time as requirements evolve. The overall process aims to develop measurable and trackable web-based systems that meet user needs.
A User Story Quality Measurement Model for Reducing Agile Software Developmen... (ijseajournal)
The document discusses a user story quality measurement (USQM) model for reducing risks in agile software development. It proposes that user story quality is key to affecting development efficiency and handling requirements changes. The USQM model analyzes and collects critical user story quality factors, including clarity, complexity, modularity, configuration management, version control, and testability. By quantifying these factors, the USQM aims to identify quality defects and enhance user stories, thereby reducing development risks from requirements changes.
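Quantifying the quality factors listed above could look like the weighted aggregate below. The factor weights and scores are invented for illustration; the USQM paper's actual quantification scheme may differ.

```python
def user_story_quality(scores, weights=None):
    """Aggregate per-factor scores (0..1) into a single quality index.

    Factors follow the USQM list: clarity, complexity, modularity,
    configuration management, version control, testability. The default
    weights are illustrative assumptions, not taken from the paper.
    """
    default = {"clarity": 0.25, "complexity": 0.15, "modularity": 0.15,
               "config_mgmt": 0.10, "version_control": 0.10, "testability": 0.25}
    weights = weights or default
    total = sum(weights.values())
    return sum(weights[f] * scores.get(f, 0.0) for f in weights) / total

# A hypothetical user story scored against each factor.
story = {"clarity": 0.9, "complexity": 0.7, "modularity": 0.8,
         "config_mgmt": 1.0, "version_control": 1.0, "testability": 0.6}
score = user_story_quality(story)   # stories below a chosen threshold get reworked
```

Flagging stories whose index falls below a team-chosen threshold is one way such a model turns quality factors into an actionable gate before development starts.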
IMPLEMENTATION OF MOSRE FRAMEWORK FOR A WEB APPLICATION - A CASE STUDY (ijwscjournal)
The Security Engineering discipline has become more and more important in recent years. Security requirements engineering is essential to assure the quality of the resulting software. An increasing part of the communication and sharing of information in our society utilizes Web applications. The last two years have seen a significant surge in the number of Web application specific vulnerabilities disclosed to the public, underlining the importance of security requirements engineering for Web-based systems, which is still underestimated. The integration of Web and object technologies offers a foundation for expanding the Web to a new generation of applications. In this paper, we outline our proposed Model-Oriented Security Requirement Engineering (MOSRE) Framework for Web Applications, applying object-oriented technologies and modeling to the security requirements phase so that the completeness, consistency, traceability and reusability of security requirements can be cost-effectively improved. We implemented our MOSRE Framework for an E-Voting application, and a set of security requirements was identified.
Research Inventy : International Journal of Engineering and Science (inventy)
Research Inventy : International Journal of Engineering and Science is published by the group of young academic and industrial researchers with 12 Issues per year. It is an online as well as print version open access journal that provides rapid publication (monthly) of articles in all areas of the subject such as: civil, mechanical, chemical, electronic and computer engineering as well as production and information technology. The Journal welcomes the submission of manuscripts that meet the general criteria of significance and scientific excellence. Papers will be published by rapid process within 20 days after acceptance and peer review process takes only 7 days. All articles published in Research Inventy will be peer-reviewed.
This document discusses different software processes and activities. It covers incremental development, which delivers software in increments and allows for early customer feedback. Reuse-oriented engineering focuses on integrating existing components. Key process activities include specification, design/implementation, validation, and evolution. Specification involves requirements analysis. Design translates requirements into a structure, while implementation creates an executable program. Validation verifies the system meets requirements through testing. Evolution allows software to change with changing needs.
Web Application Testing (Major Challenges and Techniques) (Editor IJMTER)
Web-based systems represent a young but rapidly growing technology. As the number of web applications continues to grow, these systems take on a critical role in a multitude of companies. The way web systems impact business, combined with an ever-growing internet user base, emphasizes the importance of developing high-quality products. Thus, proper testing plays a distinctive part in ensuring reliable, robust and high-performing operation of web applications. Important issues include the security of the web application, the basic functionality of the site, its accessibility to handicapped and fully able users, as well as readiness for the expected traffic and number of users and the ability to survive a massive spike in user traffic, the latter two of which are addressed by load testing. The testing of web-based applications has much in common with the testing of desktop systems, such as testing of functionality, configuration, and compatibility. Web application testing also involves the analysis of web-specific faults as compared to generic software faults. Other faults are strictly dependent on the interaction mode arising from the multi-tier architecture of web applications. Some web-specific faults are authentication problems, incorrect multi-language support, hyperlink problems, cross-browser portability problems, incorrect form construction, incorrect cookie values, incorrect session management, incorrect generation of error pages, etc.
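One of the web-specific fault classes listed above, incorrect cookie values and session management, can be checked mechanically. The sketch below uses Python's standard `http.cookies` parser to flag session cookies missing the `Secure`/`HttpOnly` attributes; the header strings are made-up examples.

```python
from http.cookies import SimpleCookie

def insecure_session_cookies(set_cookie_header):
    """Flag cookies missing Secure/HttpOnly — a simple session-management check."""
    cookie = SimpleCookie()
    cookie.load(set_cookie_header)
    problems = []
    for name, morsel in cookie.items():
        if not morsel["secure"]:
            problems.append((name, "missing Secure"))
        if not morsel["httponly"]:
            problems.append((name, "missing HttpOnly"))
    return problems

# Hypothetical Set-Cookie headers from a web app under test.
bad = insecure_session_cookies("sessionid=abc123; Path=/")
good = insecure_session_cookies("sessionid=abc123; Path=/; Secure; HttpOnly")
```

Checks like this one are cheap to automate in a test suite, which is why web-specific fault classes are a natural target for tooling rather than manual inspection.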
Integrated Analysis of Traditional Requirements Engineering Process with Agil... (zillesubhan)
In the past few years, the agile software development approach has emerged as a most attractive one. A typical CASE environment consists of a number of CASE tools operating on a common hardware and software platform, and there are a number of different classes of users of a CASE environment; some users, such as software developers and managers, wish to use CASE tools to support them in developing application systems and monitoring the progress of a project. The agile approach has quickly caught the attention of a large number of software development firms. However, it pays particular attention to the development side of a project while neglecting critical aspects of the requirements engineering process. In fact, there is no standard requirements engineering process in this approach, and requirements engineering activities vary from situation to situation. As a result, a large number of problems emerge which can lead software development projects to failure. One major drawback of the agile approach is that it is suitable for small projects with limited team size, so it cannot be adopted for large projects. We claim that this approach can be used for large projects if the traditional requirements engineering approach is combined with the agile manifesto. In fact, the combination of the traditional requirements engineering process and the agile manifesto can also help resolve a large number of problems that exist in agile development methodologies. In software development the most important thing is to know the customer's requirements clearly, also through modeling (data modeling, functional modeling, behavior modeling). Using UML we are able to build an efficient system, starting from an abstract model and developing the required system in detail through different UML diagrams. Each UML diagram serves a different goal towards implementing the whole project.
Automatic model transformation on multi-platform system development with mode... (CSITiaesprime)
Several difficulties commonly arise during the software development process. Among them are the lengthy technical process of developing a system, the limited number and technical capabilities of human resources, the possibility of bugs and errors during the testing and implementation phase, dynamic and frequently changing user requirements, and the need for a system that supports multi-platforms. Rapid application development (RAD) is the software development life cycle (SDLC) that emphasizes the production of a prototype in a short amount of time (30-90 days). This study discovered that implementing a model-driven architecture (MDA) approach into the RAD method can accelerate the model design and prototyping stages. The goal is to accelerate the SDLC process. It took roughly five weeks to construct the system by applying all of the RAD stages. This time frame does not include iteration and the cutover procedure. During the prototype test, there were no errors with the create, read, update, and delete (CRUD) procedure. It was demonstrated that automatic transformation in MDA can shorten the RAD phases for designing the model and developing an early prototype, reduce code errors in standard processes like CRUD, and construct a system that supports multi-platform.
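The automatic model-to-code transformation that MDA contributes to RAD in the study above can be illustrated with a toy generator: a declarative model is turned into a working CRUD repository without hand-written boilerplate. The function and field names are assumptions for illustration, not the study's actual toolchain.

```python
def generate_crud(model_name, fields):
    """Generate a tiny in-memory repository class from a declarative model —
    a toy stand-in for MDA's automatic model-to-code transformation."""
    class Repo:
        def __init__(self):
            self._rows, self._next_id = {}, 1
        def create(self, **kwargs):
            row = {f: kwargs.get(f) for f in fields}
            row["id"] = self._next_id
            self._rows[self._next_id] = row
            self._next_id += 1
            return row
        def read(self, row_id):
            return self._rows.get(row_id)
        def update(self, row_id, **kwargs):
            row = self._rows[row_id]
            row.update({k: v for k, v in kwargs.items() if k in fields})
            return row
        def delete(self, row_id):
            return self._rows.pop(row_id, None) is not None
    Repo.__name__ = model_name + "Repository"
    return Repo

# Hypothetical model: one "User" entity with two fields.
UserRepo = generate_crud("User", ["name", "email"])
repo = UserRepo()
user = repo.create(name="alice", email="a@example.com")
```

Because the CRUD logic is generated rather than written per entity, standard-process code errors are reduced, which mirrors the study's finding that MDA shortened the RAD model-design and prototyping stages.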
This document summarizes reverse engineering theories and tools. It discusses how reverse engineering is used to understand legacy code without documentation by applying transformations backwards to abstract the code into more conceptual specifications. It also describes how code-level reverse engineering focuses on analyzing source code but does not capture all needed information. Automated tools are needed to help make reverse engineering more repeatable and mature.
Similar to Unified V-Model Approach of Re-Engineering to reinforce Web Application Development (20)
This document provides a technical review of secure banking using RSA and AES encryption methodologies. It discusses how RSA and AES are commonly used encryption standards for secure data transmission between ATMs and bank servers. The document first provides background on ATM security measures and risks of attacks. It then reviews related work analyzing encryption techniques. The document proposes using a one-time password in addition to a PIN for ATM authentication. It concludes that implementing encryption standards like RSA and AES can make transactions more secure and build trust in online banking.
This document analyzes the performance of various modulation schemes for achieving energy efficient communication over fading channels in wireless sensor networks. It finds that for long transmission distances, low-order modulations like BPSK are optimal due to their lower SNR requirements. However, as transmission distance decreases, higher-order modulations like 16-QAM and 64-QAM become more optimal since they can transmit more bits per symbol, outweighing their higher SNR needs. Simulations show lifetime extensions up to 550% are possible in short-range networks by using higher-order modulations instead of just BPSK. The optimal modulation depends on transmission distance and balancing the energy used by electronic components versus power amplifiers.
This document provides a review of mobility management techniques in vehicular ad hoc networks (VANETs). It discusses three modes of communication in VANETs: vehicle-to-infrastructure (V2I), vehicle-to-vehicle (V2V), and hybrid vehicle (HV) communication. For each communication mode, different mobility management schemes are required due to their unique characteristics. The document also discusses mobility management challenges in VANETs and outlines some open research issues in improving mobility management for seamless communication in these dynamic networks.
This document provides a review of different techniques for segmenting brain MRI images to detect tumors. It compares the K-means and Fuzzy C-means clustering algorithms. K-means is an exclusive clustering algorithm that groups data points into distinct clusters, while Fuzzy C-means is an overlapping clustering algorithm that allows data points to belong to multiple clusters. The document finds that Fuzzy C-means requires more time for brain tumor detection compared to other methods like hierarchical clustering or K-means. It also reviews related work applying these clustering algorithms to segment brain MRI images.
1) The document simulates and compares the performance of AODV and DSDV routing protocols in a mobile ad hoc network under three conditions: when users are fixed, when users move towards the base station, and when users move away from the base station.
2) The results show that both protocols have higher packet delivery and lower packet loss when users are either fixed or moving towards the base station, since signal strength is better in those scenarios. Performance degrades when users move away from the base station due to weaker signals.
3) AODV generally has better performance than DSDV, with higher throughput and packet delivery rates observed across the different user mobility conditions.
This document describes the design and implementation of 4-bit QPSK and 256-bit QAM modulation techniques using MATLAB. It compares the two techniques based on SNR, BER, and efficiency. The key steps of implementing each technique in MATLAB are outlined, including generating random bits, modulation, adding noise, and measuring BER. Simulation results show scatter plots and eye diagrams of the modulated signals. A table compares the results, showing that 256-bit QAM provides better performance than 4-bit QPSK. The document concludes that QAM modulation is more effective for digital transmission systems.
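The modulation comparison above rests on Monte Carlo BER measurement: generate random bits, modulate, add noise, and count errors. The sketch below does this for QPSK only (treated as two independent BPSK rails, a standard simplification), in pure Python rather than MATLAB; the bit counts and SNR points are illustrative assumptions.

```python
import math, random

def qpsk_ber(snr_db, nbits=20000, seed=1):
    """Monte Carlo bit error rate of Gray-coded QPSK over AWGN.

    QPSK is two independent BPSK streams (I and Q), so each bit sees
    Eb/N0 = per-bit SNR; noise sigma = sqrt(1 / (2 * Eb/N0)) for unit-energy bits.
    """
    rng = random.Random(seed)
    ebn0 = 10 ** (snr_db / 10)
    sigma = math.sqrt(1 / (2 * ebn0))
    errors = 0
    for _ in range(nbits):
        bit = rng.randint(0, 1)
        tx = 1.0 if bit else -1.0        # antipodal mapping on one rail
        rx = tx + rng.gauss(0.0, sigma)  # AWGN channel
        if (rx > 0) != (bit == 1):       # hard-decision demodulation
            errors += 1
    return errors / nbits

ber_low = qpsk_ber(0)    # noisy channel: noticeable error rate
ber_high = qpsk_ber(9)   # cleaner channel: far fewer errors
```

Extending the same loop to a 256-QAM constellation (8 bits per symbol) is what lets a study trade SNR requirements against spectral efficiency, as the document's comparison table does.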
The document proposes a hybrid technique using Anisotropic Scale Invariant Feature Transform (A-SIFT) and Robust Ensemble Support Vector Machine (RESVM) to accurately identify faces in images. A-SIFT improves upon traditional SIFT by applying anisotropic scaling to extract richer directional keypoints. Keypoints are processed with RESVM and hypothesis testing to increase accuracy above 95% by repeatedly reprocessing images until the threshold is met. The technique was tested on similar and different facial images and achieved better results than SIFT in retrieval time and reduced keypoints.
This document studies the effects of dielectric superstrate thickness on microstrip patch antenna parameters. Three types of probes-fed patch antennas (rectangular, circular, and square) were designed to operate at 2.4 GHz using Arlondiclad 880 substrate. The antennas were tested with and without an Arlondiclad 880 superstrate of varying thicknesses. It was found that adding a superstrate slightly degraded performance by lowering the resonant frequency and increasing return loss and VSWR, while decreasing bandwidth and gain. Specifically, increasing the superstrate thickness or dielectric constant resulted in greater changes to the antenna parameters.
This document describes a wireless environment monitoring system that utilizes soil energy as a sustainable power source for wireless sensors. The system uses a microbial fuel cell to generate electricity from the microbial activity in soil. Two microbial fuel cells were created using different soil types and various additives to produce different current and voltage outputs. An electronic circuit was designed on a printed circuit board with components like a microcontroller and ZigBee transceiver. Sensors for temperature and humidity were connected to the circuit to monitor the environment wirelessly. The system provides a low-cost way to power remote sensors without needing battery replacement and avoids the high costs of wiring a power source.
1) The document proposes a model for a frequency tunable inverted-F antenna that uses ferrite material.
2) The resonant frequency of the antenna can be significantly shifted from 2.41GHz to 3.15GHz, a 31% shift, by increasing the static magnetic field placed on the ferrite material.
3) Altering the permeability of the ferrite allows tuning of the antenna's resonant frequency without changing the physical dimensions, providing flexibility to operate over a wide frequency range.
This document summarizes a research paper that presents a speech enhancement method using stationary wavelet transform. The method first classifies speech into voiced, unvoiced, and silence regions based on short-time energy. It then applies different thresholding techniques to the wavelet coefficients of each region - modified hard thresholding for voiced speech, semi-soft thresholding for unvoiced speech, and setting coefficients to zero for silence. Experimental results using speech from the TIMIT database corrupted with white Gaussian noise at various SNR levels show improved performance over other popular denoising methods.
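The thresholding mechanism at the heart of the method above can be shown with a one-level decimated Haar transform and soft thresholding. The paper uses the stationary wavelet transform with region-dependent threshold rules; this is a simplified sketch of the general wavelet-denoising idea, with made-up signals.

```python
def haar_dwt(signal):
    """One-level Haar transform: pairwise averages (approximation) and
    pairwise half-differences (detail). Signal length must be even."""
    approx = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: reconstruct the pair from average and difference."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

def soft_threshold(coeffs, t):
    """Shrink detail coefficients toward zero (soft thresholding)."""
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

def denoise(signal, t):
    approx, detail = haar_dwt(signal)
    return haar_idwt(approx, soft_threshold(detail, t))
```

With the threshold at zero the round trip is exact, while a small threshold suppresses low-amplitude detail coefficients, which is how noise is removed while large speech features survive; the paper's contribution is choosing different threshold rules per voiced/unvoiced/silence region.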
This document reviews the design of an energy-optimized wireless sensor node that encrypts data for transmission. It discusses how sensing schemes that group nodes into clusters and transmit aggregated data can reduce energy consumption compared to individual node transmissions. The proposed node design calculates the minimum transmission power needed based on received signal strength and uses a periodic sleep/wake cycle to optimize energy when not sensing or transmitting. It aims to encrypt data at both the node and network level to further optimize energy usage for wireless communication.
This document discusses group consumption modes. It analyzes factors that impact group consumption, including external environmental factors like technological developments enabling new forms of online and offline interactions, as well as internal motivational factors at both the group and individual level. The document then proposes that group consumption modes can be divided into four types based on two dimensions: vertical (group relationship intensity) and horizontal (consumption action period). These four types are instrument-oriented, information-oriented, enjoyment-oriented, and relationship-oriented consumption modes. Finally, the document notes that consumption modes are dynamic and can evolve over time.
The document summarizes a study of different microstrip patch antenna configurations with slotted ground planes. Three antenna designs were proposed and their performance evaluated through simulation: a conventional square patch, an elliptical patch, and a star-shaped patch. All antennas were mounted on an FR4 substrate. The effects of adding different slot patterns to the ground plane on resonance frequency, bandwidth, gain and efficiency were analyzed parametrically. Key findings were that reshaping the patch and adding slots increased bandwidth and shifted resonance frequency. The elliptical and star patches in particular performed better than the conventional design. Three antenna configurations were selected for fabrication and measurement based on the simulations: a conventional patch with a slot under the patch, an elliptical patch with slots
1) The document describes a study conducted to improve call drop rates in a GSM network through RF optimization.
2) Drive testing was performed before and after optimization using TEMS software to record network parameters like RxLevel, RxQuality, and events.
3) Analysis found call drops were occurring due to issues like handover failures between sectors, interference from adjacent channels, and overshooting due to antenna tilt.
4) Corrective actions taken included defining neighbors between sectors, adjusting frequencies to reduce interference, and lowering the mechanical tilt of an antenna.
5) Post-optimization drive testing showed improvements in RxLevel, RxQuality, and a reduction in dropped calls.
This document describes the design of an intelligent autonomous wheeled robot that uses RF transmission for communication. The robot has two modes - automatic mode where it can make its own decisions, and user control mode where a user can control it remotely. It is designed using a microcontroller and can perform tasks like object recognition using computer vision and color detection in MATLAB, as well as wall painting using pneumatic systems. The robot's movement is controlled by DC motors and it uses sensors like ultrasonic sensors and gas sensors to navigate autonomously. RF transmission allows communication between the robot and a remote control unit. The overall aim is to develop a low-cost robotic system for industrial applications like material handling.
This document reviews cryptography techniques to secure the Ad-hoc On-Demand Distance Vector (AODV) routing protocol in mobile ad-hoc networks. It discusses various types of attacks on AODV like impersonation, denial of service, eavesdropping, black hole attacks, wormhole attacks, and Sybil attacks. It then proposes using the RC6 cryptography algorithm to secure AODV by encrypting data packets and detecting and removing malicious nodes launching black hole attacks. Simulation results show that after applying RC6, the packet delivery ratio and throughput of AODV increase while delay decreases, improving the security and performance of the network under attack.
The document describes a proposed modification to the conventional Booth multiplier that aims to increase its speed by applying concepts from Vedic mathematics. Specifically, it utilizes the Urdhva Tiryakbhyam formula to generate all partial products concurrently rather than sequentially. The proposed 8x8 bit multiplier was coded in VHDL, simulated, and found to have a path delay 44.35% lower than a conventional Booth multiplier, demonstrating its potential for higher speed.
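The Urdhva Tiryakbhyam ("vertically and crosswise") formula mentioned above can be demonstrated in software: every column's partial products are independent of the others, which is the property that lets the hardware multiplier generate them concurrently. This is a decimal illustration of the formula, not the paper's VHDL design.

```python
def urdhva_multiply(x, y, base=10):
    """Crosswise-and-vertical (Urdhva Tiryakbhyam) multiplication.

    All column partial products are formed independently — the property a
    Vedic multiplier exploits to generate them concurrently in hardware —
    then carries are propagated in one final pass.
    """
    xd = [int(d) for d in str(x)][::-1]      # least-significant digit first
    yd = [int(d) for d in str(y)][::-1]
    cols = [0] * (len(xd) + len(yd) - 1)
    for i, a in enumerate(xd):               # crosswise sums, one per column
        for j, b in enumerate(yd):
            cols[i + j] += a * b
    result, carry = 0, 0
    for k, c in enumerate(cols):             # single carry-propagation pass
        c += carry
        result += (c % base) * base ** k
        carry = c // base
    return result + carry * base ** len(cols)
```

In a conventional Booth multiplier the partial products are produced and accumulated sequentially; here the double loop could run fully in parallel, with only the short carry pass left serial, which is the source of the reported delay reduction.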
This document discusses image deblurring techniques. It begins by introducing image restoration and focusing on image deblurring. It then discusses challenges with image deblurring being an ill-posed problem. It reviews existing approaches to screen image deconvolution including estimating point spread functions and iteratively estimating blur kernels and sharp images. The document also discusses handling spatially variant blur and summarizes the relationship between the proposed method and previous work for different blur types. It proposes using color filters in the aperture to exploit parallax cues for segmentation and blur estimation. Finally, it proposes moving the image sensor circularly during exposure to prevent high frequency attenuation from motion blur.
This document describes modeling an adaptive controller for an aircraft roll control system using PID, fuzzy-PID, and genetic algorithm. It begins by introducing the aircraft roll control system and motivation for developing an adaptive controller to minimize errors from noisy analog sensor signals. It then provides the mathematical model of aircraft roll dynamics and describes modeling the real-time flight control system in MATLAB/Simulink. The document evaluates PID, fuzzy-PID, and PID-GA (genetic algorithm) controllers for aircraft roll control and finds that the PID-GA controller delivers the best performance.
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTjpsjournal1
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon
reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been
referred to as the "New Great Game." This research centres on the power struggle, considering
geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil
politics, and conventional and nontraditional security are all explored and explained by the researcher.
Using Mackinder's Heartland, Spykman Rimland, and Hegemonic Stability theories, examines China's role
in Central Asia. This study adheres to the empirical epistemological method and has taken care of
objectivity. This study analyze primary and secondary research documents critically to elaborate role of
china’s geo economic outreach in central Asian countries and its future prospect. China is thriving in trade,
pipeline politics, and winning states, according to this study, thanks to important instruments like the
Shanghai Cooperation Organisation and the Belt and Road Economic Initiative. According to this study,
China is seeing significant success in commerce, pipeline politics, and gaining influence on other
governments. This success may be attributed to the effective utilisation of key tools such as the Shanghai
Cooperation Organisation and the Belt and Road Economic Initiative.
Adaptive synchronous sliding control for a robot manipulator based on neural ...IJECEIAES
Robot manipulators have become important equipment in production lines, medical fields, and transportation. Improving the quality of trajectory tracking for
robot hands is always an attractive topic in the research community. This is a
challenging problem because robot manipulators are complex nonlinear systems
and are often subject to fluctuations in loads and external disturbances. This
article proposes an adaptive synchronous sliding control scheme to improve trajectory tracking performance for a robot manipulator. The proposed controller
ensures that the positions of the joints track the desired trajectory, synchronize
the errors, and significantly reduces chattering. First, the synchronous tracking
errors and synchronous sliding surfaces are presented. Second, the synchronous
tracking error dynamics are determined. Third, a robust adaptive control law is
designed,the unknown components of the model are estimated online by the neural network, and the parameters of the switching elements are selected by fuzzy
logic. The built algorithm ensures that the tracking and approximation errors
are ultimately uniformly bounded (UUB). Finally, the effectiveness of the constructed algorithm is demonstrated through simulation and experimental results.
Simulation and experimental results show that the proposed controller is effective with small synchronous tracking errors, and the chattering phenomenon is
significantly reduced.
Hierarchical Digital Twin of a Naval Power SystemKerry Sado
A hierarchical digital twin of a Naval DC power system has been developed and experimentally verified. Similar to other state-of-the-art digital twins, this technology creates a digital replica of the physical system executed in real-time or faster, which can modify hardware controls. However, its advantage stems from distributing computational efforts by utilizing a hierarchical structure composed of lower-level digital twin blocks and a higher-level system digital twin. Each digital twin block is associated with a physical subsystem of the hardware and communicates with a singular system digital twin, which creates a system-level response. By extracting information from each level of the hierarchy, power system controls of the hardware were reconfigured autonomously. This hierarchical digital twin development offers several advantages over other digital twins, particularly in the field of naval power systems. The hierarchical structure allows for greater computational efficiency and scalability while the ability to autonomously reconfigure hardware controls offers increased flexibility and responsiveness. The hierarchical decomposition and models utilized were well aligned with the physical twin, as indicated by the maximum deviations between the developed digital twin hierarchy and the hardware.
Using recycled concrete aggregates (RCA) for pavements is crucial to achieving sustainability. Implementing RCA for new pavement can minimize carbon footprint, conserve natural resources, reduce harmful emissions, and lower life cycle costs. Compared to natural aggregate (NA), RCA pavement has fewer comprehensive studies and sustainability assessments.
A review on techniques and modelling methodologies used for checking electrom...nooriasukmaningtyas
The proper function of the integrated circuit (IC) in an inhibiting electromagnetic environment has always been a serious concern throughout the decades of revolution in the world of electronics, from disjunct devices to today’s integrated circuit technology, where billions of transistors are combined on a single chip. The automotive industry and smart vehicles in particular, are confronting design issues such as being prone to electromagnetic interference (EMI). Electronic control devices calculate incorrect outputs because of EMI and sensors give misleading values which can prove fatal in case of automotives. In this paper, the authors have non exhaustively tried to review research work concerned with the investigation of EMI in ICs and prediction of this EMI using various modelling methodologies and measurement setups.
Understanding Inductive Bias in Machine LearningSUTEJAS
This presentation explores the concept of inductive bias in machine learning. It explains how algorithms come with built-in assumptions and preferences that guide the learning process. You'll learn about the different types of inductive bias and how they can impact the performance and generalizability of machine learning models.
The presentation also covers the positive and negative aspects of inductive bias, along with strategies for mitigating potential drawbacks. We'll explore examples of how bias manifests in algorithms like neural networks and decision trees.
By understanding inductive bias, you can gain valuable insights into how machine learning models work and make informed decisions when building and deploying them.
Unified V- Model Approach of Re-Engineering to reinforce Web Application Development
IOSR Journal of Computer Engineering (IOSR-JCE)
e-ISSN: 2278-0661, p-ISSN: 2278-8727, Volume 15, Issue 6 (Nov. - Dec. 2013), PP 09-17
www.iosrjournals.org
www.iosrjournals.org 9 | Page
Unified V- Model Approach of Re-Engineering to reinforce
Web Application Development
Poonam Dhiman¹, Nishu Singh², Kaushal Kumar³
¹(Software Engineering, Delhi Technological University, India)
²(Software Engineering, Delhi Technological University, India)
³(Software Engineering, Delhi Technological University, India)
Abstract: Web applications are built from diverse and rapidly changing elements and techniques. The high pressure of very short time-to-market encourages undisciplined development processes, a lack of effective testing techniques, and weak application of basic software engineering principles. This paper presents approaches to re-engineering on the web, showing how a re-engineering process can support evolution activities in legacy systems, and proposes a V-model for the re-engineering process. It also presents the need for technologies and approaches for building new web services from existing web applications. The proposed V-model for re-engineering web applications is an extension of the V-model used in the traditional software domain: it incorporates the methodology throughout the phases of the web development process to re-engineer the web system.
Keywords: Re-engineering, reverse engineering, forward engineering, V-model, application migration.
I. Introduction
The technological evolution of recent years has made the Web the ideal platform for delivering services and developing Web-based applications. According to [1] and [2], the development of a Web application is a multi-faceted activity, involving not only technical but also organizational, managerial, and even social and artistic issues. Web application development refers to the set of activities applied in order to develop a high-quality web application with the expected characteristics, and to accomplish this development efficiently and coherently. Web engineering is an important and fast-developing area that has only recently emerged, and it is gaining more attention; web maintenance and web re-engineering both fall within its scope. The ability of the World Wide Web to ubiquitously provide and gather information, together with the globalization of the economy and the need for new marketing strategies, has enormously boosted the development of Web Applications (WA). Software applications are the backbone of the WWW infrastructure.
Most web applications are developed under tight schedules and in a rapidly evolving environment. The development is often ad hoc in nature, and the applications are poorly structured and poorly documented. Maintenance of such applications becomes problematic as the complexity of the web application grows. Creating appropriate design and architecture models is the solution to managing this complexity and supporting the evolution of web applications. Researchers have identified the need to reverse-engineer already existing web applications into abstract design models.
Web applications must cope with an extremely short development and evolution life cycle: a high level of flexibility, maintainability, and adaptability is necessary to compete and survive amid market inflation. Unfortunately, to meet tight schedules for delivering web services, web applications are usually implemented directly without producing any useful documentation for their maintenance and evolution, so these requirements are never satisfied. In order to satisfy a growing market request for Web applications and to deal with their increased technological complexity, we require specific methods and techniques able to support a disciplined and more effective development process. However, high time pressure often forces developers to implement the application code directly, without a disciplined development process, and this may have negative effects on the delivered quality and documentation of the Web application. This situation is the same as the one occurring for traditional software produced in a short time, without respecting software engineering principles and without a disciplined development process. Poor quality and poor documentation must be considered the main factors behind unproductive and expensive maintenance and the impossibility of applying more structured, documentation-based approaches. Reverse engineering methods, techniques, and tools have proved useful in supporting the post-delivery life-cycle activities of traditional software systems, such as maintenance and evolution. This paper has five sections. Section 2 presents background
information, including a conceptual overview of the main features of a Web application and related work on analysing existing Web applications. Section 3 describes how the re-engineering paradigm and approaches can be used to define and implement Web application re-engineering processes, while Section 4 presents the V-model of the re-engineering process, proposed to achieve comprehension of existing web applications. Section 5, finally, provides concluding remarks.
II. Reengineering
Reengineering is the analysis of an existing software system and its modification to constitute a new form. Chikofsky and Cross define reengineering as ‘the examination and alteration of a subject system to reconstitute it in a new form and the subsequent implementation of that form’ [3]. According to IEEE Std. 1998, it is ‘a system changing activity that results in creating a new system that either retains or does not retain the individuality of the initial system’ [4].
2.1 Nature and Scope of Reengineering
When the cost of maintaining a software system is no longer feasible, we opt to reengineer it. Reengineering renews the software system. It has the following three stages:
1. Reverse engineering
2. Transformations or Transfiguration
3. Forward engineering
2.1.1 Reverse engineering
Reverse engineering is the process of analysing a system to recover its design and specification. Reverse engineering is different from re-engineering. It is a process of analysis that determines the relationships among the components of a system and recreates those components in another form or at a higher level of abstraction. The program itself is unchanged by the reverse engineering process. The objective of reverse engineering is to obtain the design or specification of a system from its source code, whereas the objective of re-engineering is to produce a more maintainable system. Reverse engineering is used to produce a better system and is part of the re-engineering process: during re-engineering it recovers the program design specification, which engineers use to understand the program before re-constructing its structure. A reverse engineering process for the web encompasses the following phases:
1. Static Analysis
2. Dynamic Analysis
2.1.2 Static Analysis
The number and importance of Web applications have been increasing rapidly year by year. At the same time, the quantity and impact of security vulnerabilities in such applications have grown as well [5]. Web application security is pursued through static analysis and runtime analysis; it has been a great challenge, and static analysis tools such as ASPWC are used to detect attacks and vulnerabilities based on taint analysis [6]. Static analysis does not require the execution of the application. It recovers the web application's architectural components and the static relations among them. HTML files, the directory structure, scripting language sources, as well as any other static information (e.g., database connections, use of applets/servlets) are processed. HTML pages and the page sub-elements (frames, forms, widgets) composing a given page are localized, classified, and recorded in an intermediate representation. Central to the reverse engineering process is the mapping between web application elements and object-oriented entities, according to Conallen's proposals [7][8][9].
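As a rough sketch of this step, the following minimal Python example (not from the paper; the page content and the shape of the intermediate representation are illustrative assumptions) localizes links, forms, and frames in an HTML page and records them as a simple intermediate representation:

```python
from html.parser import HTMLParser

class PageAnalyzer(HTMLParser):
    """Collects static page elements (links, forms, frames) into a
    simple intermediate representation: one dict per element."""
    def __init__(self):
        super().__init__()
        self.elements = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.elements.append({"kind": "link", "target": attrs["href"]})
        elif tag == "form":
            self.elements.append({"kind": "form",
                                  "action": attrs.get("action", ""),
                                  "method": attrs.get("method", "get")})
        elif tag in ("frame", "iframe"):
            self.elements.append({"kind": "frame", "src": attrs.get("src", "")})

# Hypothetical page fragment for illustration.
page = ('<html><body><a href="cart.php">Cart</a>'
        '<form action="login.php" method="post"></form></body></html>')
analyzer = PageAnalyzer()
analyzer.feed(page)
for element in analyzer.elements:
    print(element)
```

A real static analyzer would additionally walk the directory structure and record database connections and applet/servlet usage, as described above.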
2.1.3 Dynamic analysis
The dynamic analysis phase relies on the static analysis results. The web application is executed, and the dynamic interactions among its components are recorded. The analysis is performed by observing the execution of the web application and tracing every event back to the source code. Traced events are those observed by the user or those related to components external to the web application (e.g., third-party databases or web sites). Events include the visualization of HTML pages/frames/forms, the submission of forms, the processing of data, a link traversal, a database query, etc. All elements responsible for these actions (typically links, scripts, applets) are localized. The sequences of actions fired by an event, deriving either from web application control flow (e.g., access to a database following a user form submission) or from user actions (e.g., clicking on a link or submitting a form), are associated with sequences of messages exchanged between the objects of the web application.
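The event-to-message mapping described above can be sketched as follows (a hypothetical Python illustration; the page, script, and database names are invented for the example):

```python
# Minimal event trace for the dynamic analysis phase: each observed
# event is mapped to the sequence of messages it fires between objects.
trace = {}  # event -> list of (sender, receiver, message)

def record(event, sender, receiver, message):
    """Append one object-to-object message to the trace of an event."""
    trace.setdefault(event, []).append((sender, receiver, message))

# Hypothetical interaction: submitting a login form triggers a
# server-side script, which in turn queries a database.
record("submit login form", "login.html", "auth.php", "POST credentials")
record("submit login form", "auth.php", "users_db", "SELECT user record")

for sender, receiver, message in trace["submit login form"]:
    print(f"{sender} -> {receiver}: {message}")
```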
2.2 Transformations and Transfiguration
This phase involves the transition, alteration, modification, reformation, reconstruction, and remodelling of the web application system. The web architecture is altered: it is modified and improved to cope with the new technology and the new environment. This is the architecture design stage. On the web, these transformations can be accomplished by changes in transaction design, adaptation of the code itself to the new computing platform, and redesign of the UI to better suit the constraints of the target platform.
2.3 Forward engineering
In this stage, we move from a higher level of abstraction to a lower one: the web application is integrated according to the new design. Forward engineering is the traditional process of moving from high-level abstraction and logical, implementation-independent design to the physical implementation of a system. It follows a sequence from requirements through design to implementation.
III. Approach to Reengineering for the web
Reengineering generally includes some form of reverse engineering to attain a more abstract description, followed by some form of forward engineering or restructuring. This may include modification with respect to new requirements not met by the original system. With the advancement of technology, continuous changes are being introduced in the web industry, and hence web applications need to cope with the latest technologies to stay competitive in the market. The need to fulfil market and additional requirements may lead to web reengineering, in which the web application system is transformed from one state to another.
3.1 Reengineering of web pages
Reengineering of web pages can be accomplished by detecting and analysing the interactions of objects, transforming these objects to adapt them to the new platform, and generating the source code in a new language. There are several presentation models that can be transformed into one another for different contexts, supporting a flexible reverse engineering process. The detection and transformation phases of the reengineering process can be governed as follows:
• Percolating the objects, tags, and elements of the web pages: selecting any HTML item with given properties that is required to keep all control mechanisms, and discarding unwanted tags and elements from the web pages.
• Transformation of the layout options and relationships, including alignment, balance (horizontal or vertical balancing), and centring, which depend upon the position of the objects on the page.
• Content updating of the web pages according to requirement changes, market evolution, usage, and the owner of the website.
• Clustering of web pages: the information obtained by the (static and dynamic) analysis of the reengineering process can be used to produce a graph whose nodes represent the set of Web application objects and whose links specify the interactions between these objects. In [10], this kind of graph is called a WAG (Web Application connection Graph). WAG analysis may support the comprehension of the application. However, since this graph may be large (in terms of the number of nodes and edges) even for small-size Web applications, some kind of automatic clustering [11] can be used to decompose it into smaller cohesive parts and so simplify the analysis. In the third step of the reverse engineering process, the automatic clustering approach proposed in [10] is applied in order to group software items of a WAG into meaningful (i.e., highly cohesive) and independent (i.e., loosely coupled) clusters. This clustering approach evaluates the degree of coupling between entities of the application (such as server pages, client pages, and client modules) that are interconnected by Submit, Build, Link, Load in Frame, Redirect, and Include relationships.
Among the several clusterings obtained by the clustering algorithm, we choose the most suitable one by evaluating the degrees of intra-connectivity and inter-connectivity (maximizing intra-connectivity and minimizing inter-connectivity). The ‘optimal’ configuration is considered the most suitable for including clusters that implement functions at higher levels of abstraction than those of the cluster's single items. Validation of the clusters, based on a Concept Assignment Process [12], has to be carried out.
• Association and dissociation: objects that are close to each other because they are semantically related are grouped together (association), while isolated objects without any connection are ungrouped when they are unrelated (dissociation).
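The intra-/inter-connectivity trade-off used to rank candidate clusterings can be illustrated with a small sketch (Python, with an invented six-edge WAG; the actual approach in [10] uses a more elaborate quality measure):

```python
# Score a candidate clustering of a WAG: good clusters have many edges
# inside a cluster (intra-connectivity) and few edges crossing cluster
# boundaries (inter-connectivity). Graph and partitions are illustrative.
edges = [("home", "login"), ("login", "auth"), ("auth", "users_db"),
         ("home", "catalog"), ("catalog", "cart"), ("cart", "checkout")]

def score(clusters):
    """Return intra-connectivity minus inter-connectivity (higher is better)."""
    member = {node: i for i, cluster in enumerate(clusters) for node in cluster}
    intra = sum(1 for a, b in edges if member[a] == member[b])
    inter = len(edges) - intra
    return intra - inter

split_a = [{"home", "login", "auth", "users_db"},
           {"catalog", "cart", "checkout"}]
split_b = [{"home", "catalog"}, {"login", "auth"},
           {"users_db", "cart", "checkout"}]
print(score(split_a), score(split_b))  # split_a keeps more edges internal
```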
3.2 Transaction Reengineering
In a transaction-oriented Web site, the user executes a series of activities in order to carry out a specific task. One of the reasons for the success of e-commerce business today is the transactional behaviour that the Web offers. Business processes are realized by means of transactions, which in this context can be interpreted as high-level workflows corresponding to user tasks (e.g., purchasing an airplane ticket). The formalism underlying the process is a revised version of the UWA Transaction Design Model [13], the portion of the UWA framework that focuses specifically on the design of Web application transactions.
The UWA design framework provides a complete design methodology for ubiquitous Web applications that are multi-channel, multi-user, and context-aware. It organizes the process of designing a Web application into four main activities [14]: (1) requirements elicitation [15]; (2) hypermedia and operation design [16]; (3) transaction design [17]; and (4) customization design [18]. Using the UWA methodology, the transaction design process produces two conceptual models: the Organization Model and the Execution Model. The Organization Model describes a transaction from a static point of view. It uses a particular UML class diagram [19] in which the Activities involved in the transaction are represented by class stereotypes arranged to form a tree. The Activity represented by the root of the tree corresponds to the entire transaction; component Activities and sub-Activities are intermediate nodes and leaves of the tree, representing sub-transactions and elementary activities, respectively. The Execution Model of a transaction defines the possible execution flow among its component activities and sub-activities. It is a customized version of the UML Activity Diagram [6]. The sequence of activities is described by UML finite state machines, in which activities and sub-activities are represented by states (ovals) and the execution flow between them is represented by state transitions (arcs).
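The Activity tree of the Organization Model can be sketched as follows (a minimal Python illustration; the activity names are invented, and the real model in [13] is a stereotyped UML class diagram, not code):

```python
# An Organization Model as a tree of Activities: the root is the whole
# transaction, intermediate nodes are sub-transactions, and leaves are
# elementary activities.
class Activity:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def leaves(self):
        """Elementary activities are the leaves of the Activity tree."""
        if not self.children:
            return [self.name]
        return [leaf for child in self.children for leaf in child.leaves()]

# Hypothetical "purchase an airplane ticket" transaction.
buy_ticket = Activity("Purchase ticket", [
    Activity("Select flight", [Activity("Search flights"),
                               Activity("Choose fare")]),
    Activity("Pay", [Activity("Enter card"),
                     Activity("Confirm payment")]),
])
print(buy_ticket.leaves())
```

The Execution Model would then be a separate state machine over these same activities, defining which orderings of the leaves are permitted.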
3.4 Application Migration Reengineering
Migrating applications to newer technologies can give a business a leading edge by removing inefficient workflows and processes while preserving the original objectives, model, and investment, and it helps enterprises move legacy systems from old technologies to present-day platforms. Reengineering must be strategically designed to overcome cross-platform compatibility challenges.
Due to advancing technology and growing business needs, there is a need to migrate legacy software systems to new technologies and environments. Legacy system re-engineering services include language and database migration, platform-to-platform porting, and system redevelopment.
A web application must follow the enterprise standards and rules implemented in the legacy application while transforming them to meet new business and architecture requirements, in order to produce a flexible, tested, and validated modified system. The benefits of re-engineering and migration are savings in time and effort, enhancements in operational efficiency, and access to the latest technologies and platforms.
Web application migration can include the following services:
• Legacy application and reusable component analysis
• New technology and platform inspection
• Platform, language, database and architecture migration
• Design, development and integration
• Version rendering
• Functionality enhancement
• Application and process organising

Language Migration: VB to VB.NET; C or C++ to .NET; ASP to ASP.NET
Data Migration: SQL Server 6.5/2000 to SQL Server 2005/2008; MS SQL Server to ORACLE
Architecture Migration: Client Server to N-TIER; Legacy to Web Services; Client Server to SOA (Service Oriented Architecture); Legacy to Web Enablement
Table 1: web migration services
3.5 Graphic design re-engineering
Re-engineering transforms a final user interface into a logical representation that can be modified to drive forward engineering, allowing a user interface to be ported from one computing platform to another with minimum effort. Re-engineering is used to adapt a UI to another format; to change a user interface, it is not mandatory to develop it from scratch. Some transcoding tools [20][21][22][23] automatically transform UI code from the original platform to a target platform. Portability and transcoding exhibit some limitations, as they do not consider the constraints imposed by the target platform, such as the operating system, programming language, screen resolution, and interaction capabilities. To overcome these shortcomings, a UI reverse engineering process can be combined with a UI forward engineering process not only to produce more usable UIs in a logical way, but also to benefit from reverse engineering to port a UI to any other target platform.
The Cameleon Reference Framework [24] locates the UI development steps for context-sensitive interactive applications. A context is defined as an element of the set of environments considered for the interactive system, an element of the set of platforms considered for the interactive system, and an element of the set of users of the interactive system. A simplified version (Fig. 1) structures development for two contexts of use, here two platforms: the one on the left represents the source and the one on the right represents the target. The development process can be decomposed into four steps:
Task and concepts describe the various tasks to be performed and the application-oriented notions related to those tasks. The Logical UI provides basic symbols and notation for manipulating the application concepts and routines in a way that does not depend on the interactors present on the targets; the elements used in the logical UI are abstractions of existing widgets. The Physical UI maps the logical UI onto real interaction objects to define the product layout and interface navigation scheme; this interface is now composed of existing UI widgets. The Final UI is produced in the last step of this concretization, supported by a multi-target development environment, and is represented as source code.
Figure 1: UI development steps
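The logical-to-physical step can be illustrated with a small sketch (Python; the platform names and widget mappings below are assumptions for illustration, not part of the Cameleon framework itself):

```python
# Reifying a logical UI into a physical UI: the same abstract interactor
# is mapped to a different concrete widget depending on the target platform.
PHYSICAL = {
    ("choice", "desktop_web"): "drop-down list",
    ("choice", "mobile_web"): "radio button group",
    ("trigger", "desktop_web"): "button",
    ("trigger", "mobile_web"): "tap target",
}

def reify(logical_ui, platform):
    """Map each (name, abstract kind) pair to its concrete widget."""
    return [(name, PHYSICAL[(kind, platform)]) for name, kind in logical_ui]

# Hypothetical logical UI for a checkout page.
logical_ui = [("payment method", "choice"), ("submit order", "trigger")]
print(reify(logical_ui, "mobile_web"))
print(reify(logical_ui, "desktop_web"))
```

Porting the UI to another platform then amounts to re-running the reification with a different platform key, rather than rewriting the interface from scratch.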
IV. V-model for web reengineering
The re-engineered product goes through a complete web development life cycle, and therefore it must pass through a complete testing cycle. The legacy system or product is transformed into a new form by various means. The figure below illustrates the V-model for the re-engineering process, proposed here for designing the testing strategies for this category. As in the traditional V-model, the left side of the re-engineering V-model describes the stages of design and coding, and the right side defines the corresponding stages of the validation process. It has the following phases:
a) Requirement gathering for the new web application
The first step involves the collection of the new requirements. This lists the key reasons why reengineering is required for the software under consideration. The client starts discussing with the web development team the newly generated requirements arising from market evolution, technology changes, and product improvement for better performance. The system is re-engineered in order to incorporate the new business requirements, which involve functional and non-functional requirements. In this phase the developer may make a checklist of the various reasons why re-engineering is required:
-Better performance
-Code restructuring
-New platform support
-Data migration
-New business requirement
-Requirement change
This phase requires interviews with the client, emails and supporting documents provided by the client, discussion notes, online chat, telephone conversations, model sites/applications, etc.
Requirement analysis is carried out against the new objective for re-engineering, the cost involved in the changes, the supporting documents, and the approval.
Moreover, the analysis should cover all aspects of how the web application is going to join the existing system. The analysis should be done within a short time span, should provide descriptive information, and should be cost effective. To achieve this, the analyst should involve the designers, developers, and testers to come up with an optimised work plan.
Figure 2: Web re-engineering V Model
b) Analysis of existing legacy system/specification building
The second stage is the study of the legacy system functionality and underlying design and come out
with the difference with new functions. The nature of re-engineering is to improve or transform existing system
so it can be understood, controlled & reused as new system. Web re-engineering is vital to restore & reuse the
things inherent in the existing system, put the cost of system maintenance to the lowest in the control &
establish a basis for the development of system in future.
Web application broadly classify into two forms static application and dynamic application. static
application implemented in HTML, & dynamic application provide client server interaction and consist of
DHTML pages, JAVA server pages, java servlet, PHP, CGI, ODBC, JDBC etc. So in this phase we carefully
analyse both perspective of web application (static & dynamic) and generate WAG graph for reusable
components and objects for re-engineering. This phase may consist of three major steps
-Identification of reused Components
-Encapsulation of Identified Components to new system
-Analyse interfaces of the recovered components and define specification
Identification of legacy components aims to identify reusable web components from the legacy system. The
identified components should conform to specific user requirements: they should relate to new functionality,
access and manipulate data, be free of side effects, and comply with specific pre- and post-conditions. We then
encapsulate the recovered legacy components into collections of object classes, each encapsulating specific
functionality of the legacy system. Once legacy functionality is encapsulated in objects, flexible systems can be
easily designed and implemented using the re-engineering paradigm. Furthermore, we analyse the interfaces of the
recovered components and define a specification to represent them. The service specification provides
standard, enriched, and well-understood information about the interface and functionality of the offered services.
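The encapsulation step described above can be sketched as wrapping a recovered legacy routine in an object class whose interface enforces the identified pre- and post-conditions. The `legacy_update_stock` routine and its conditions below are hypothetical illustrations, not taken from any real system:

```python
def legacy_update_stock(inventory, item, delta):
    # Hypothetical recovered legacy routine: mutates a shared dict in place.
    inventory[item] = inventory.get(item, 0) + delta
    return inventory[item]

class StockService:
    """Object class encapsulating the legacy routine behind a checked interface."""
    def __init__(self):
        self._inventory = {}  # the shared state is now private to the component

    def update(self, item, delta):
        # Pre-condition from the recovered specification: item must be named.
        if not item:
            raise ValueError("pre-condition failed: item name required")
        level = legacy_update_stock(self._inventory, item, delta)
        # Post-condition: stock level never goes negative.
        if level < 0:
            raise ValueError("post-condition failed: negative stock level")
        return level

svc = StockService()
print(svc.update("widget", 5))   # 5
print(svc.update("widget", -2))  # 3
```

The class boundary is then the natural place to attach the service specification: the conditions checked in `update` are exactly what the interface documentation promises.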
c) Migration planning and architectural transformation
7. Unified V- Model Approach of Re-Engineering to reinforce Web Application Development
www.iosrjournals.org 15 | Page
This phase addresses migration planning: how to move from the start architecture to the target
architecture by finalizing a detailed Implementation and Migration Plan. The objectives of the migration planning
phase are to:
- Finalize the Architecture Roadmap and the supporting Implementation and Migration Plan.
- Ensure that the Implementation and Migration Plan is coordinated with the business's overall approach to
managing and implementing change.
- Ensure that the Transition Architectures are understood by the key stakeholders.
- Estimate the resource requirements, project timings, and cost of introducing the change.
d) Re-engineering of application migration
When existing systems become redundant, businesses switch from legacy systems to modern systems
built on the latest technologies and platforms. This switch is usually time-consuming and expensive. A cost-effective
alternative in such scenarios is to re-engineer, migrate, or port the legacy systems onto the latest technologies and
platforms.
Application re-engineering: re-architecting the product using new platforms and technologies such as Web 2.0.
Technology migration: migrating applications to corporate standards and migrating products from older legacy
technologies to newer technologies to ensure integration with other tools.
Application server migration: taking care of all cross-platform compatibility challenges while the client stays
focused on product innovation.
Database migration: migrating non-relational databases to industry-standard relational databases such as DB2,
MS SQL Server, Oracle, or MySQL, thus increasing business agility.
Migration of middleware technologies: migrating from legacy systems to new industry-standard systems by
implementing middleware technologies (server-side and client-side), web services, and others.
Code restructuring: improving the coding paradigms provides an easier way of working. Code
restructuring pays particular attention to adding new features and refactoring. It deals with continuous refactoring
of the web application, which yields more flexible and maintainable code (Joshua Kerievsky, Refactoring to Patterns
[2]).
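A minimal before/after sketch in the spirit of that refactoring catalogue is Replace Conditional with Polymorphism; the shipping-cost example below is hypothetical and only illustrates the restructuring, not any specific legacy system:

```python
# Before: a conditional that must be edited every time a carrier is added.
def shipping_cost_legacy(carrier, weight):
    if carrier == "standard":
        return 5.0 + 0.5 * weight
    elif carrier == "express":
        return 12.0 + 1.0 * weight
    raise ValueError(carrier)

# After: Replace Conditional with Polymorphism -- each carrier becomes a
# class, so new carriers are added without touching existing code.
class Carrier:
    base = 0.0
    per_kg = 0.0
    def cost(self, weight):
        return self.base + self.per_kg * weight

class Standard(Carrier):
    base, per_kg = 5.0, 0.5

class Express(Carrier):
    base, per_kg = 12.0, 1.0

print(Standard().cost(10))  # 10.0
print(Express().cost(10))   # 22.0
```

The behaviour is unchanged (restructuring, not re-specification), which is exactly what the regression tests of stage g) later verify.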
e) Test planning & strategizing
This stage includes test planning and, if required by the new requirements, test-case preparation, along
with strategizing the test execution for functional and non-functional areas. The test strategy plays an
important role in carrying out the entire test execution programme; the involvement of high business risk, large
investments, and mission-critical systems makes it important to strategize the test phase. The best approach is to
identify the risky areas and their failure rates and then develop a test strategy accordingly.
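The risk-based strategy described above can be sketched as ranking test areas by likelihood of failure times business impact; the areas and 1-5 scores below are hypothetical placeholders:

```python
def prioritise(areas):
    """Rank test areas by risk = failure likelihood x business impact (1-5 scales)."""
    return sorted(areas, key=lambda a: a["likelihood"] * a["impact"], reverse=True)

areas = [
    {"name": "checkout", "likelihood": 4, "impact": 5},
    {"name": "search",   "likelihood": 3, "impact": 3},
    {"name": "help",     "likelihood": 2, "impact": 1},
]
for area in prioritise(areas):
    print(area["name"], area["likelihood"] * area["impact"])
# checkout 20 / search 9 / help 2
```

Test effort is then allocated from the top of the ranking downwards, which concentrates execution time on the mission-critical, high-risk areas first.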
f) Test execution
This stage carries out the functional test execution as per the plan defined in the previous stage. Test
execution also covers performance testing if there are major design changes or if the new requirements are
related to performance improvement. It tests all the links in the web pages, the database connections, the forms
used in the web pages for submitting information or getting it from the user, cookies, navigation, and content,
and it includes interface testing of the web server/application server interface and the application
server/database server interface. Web testing includes functional testing, usability testing, interface testing,
compatibility testing, and performance and security testing. Performance testing is used to determine the
responsiveness, throughput, reliability, and scalability of a system under a given workload; a web application
should sustain heavy load. Web performance testing should include web load testing and web stress
testing.
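The throughput and responsiveness checks above can be sketched as a small harness that drives a page handler with a batch of simulated requests and fails if the average latency exceeds a budget. The handler, request list, and 1 ms budget are hypothetical; a real load test would issue HTTP requests against the deployed application:

```python
import time

def load_test(handler, requests, max_avg_ms):
    """Run `handler` over simulated requests; report latency and throughput."""
    start = time.perf_counter()
    for req in requests:
        handler(req)
    elapsed = time.perf_counter() - start
    avg_ms = 1000 * elapsed / len(requests)
    return {
        "avg_ms": avg_ms,
        "throughput_rps": len(requests) / elapsed,
        "passed": avg_ms <= max_avg_ms,
    }

# Hypothetical page handler standing in for a real HTTP request.
def render_page(req):
    return f"<html>{req}</html>"

result = load_test(render_page, [f"/page/{i}" for i in range(1000)], max_avg_ms=1.0)
print(result["passed"])
```

Stress testing uses the same loop but raises the request volume until the `passed` flag flips, locating the point where the application stops sustaining the load.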
g) Regression testing
Regression testing means retesting the effect of a change on the other parts of the web application. Test cases are
executed repeatedly to check that the previous functionality of the application still works correctly and that the
changes made have not introduced any new bugs or errors. This test can be performed on the newly reconstructed
system whenever new functionality is added to it. A regression testing plan covers the updated functionalities.
Many automated regression testing tools are available for web applications.
h) User acceptance testing
The purpose of user acceptance testing is to make sure that the application meets the user's
expectations. It ensures that the application is ready to deploy its services and that the change has been carried out effectively.
The activities for user acceptance testing ensure browser compatibility, make sure that the mandatory fields in
forms are given data, check for timeouts and field widths, and make sure that the proper controls are used to input data.
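The form-level acceptance checks above (mandatory fields, field widths) can be sketched as a small validator; the field names and width limits below are hypothetical:

```python
def validate_form(fields, rules):
    """Acceptance-style check: mandatory fields filled and field widths respected.

    `rules` maps field name -> (mandatory, max_width).
    """
    errors = []
    for name, (mandatory, max_width) in rules.items():
        value = fields.get(name, "")
        if mandatory and not value:
            errors.append(f"{name}: mandatory field is empty")
        elif len(value) > max_width:
            errors.append(f"{name}: exceeds width {max_width}")
    return errors

rules = {"username": (True, 20), "zip": (False, 10)}
print(validate_form({"username": "amit", "zip": "110001"}, rules))  # []
print(validate_form({"username": "", "zip": "x" * 12}, rules))      # two errors
```

An empty error list means the form passes acceptance for these criteria; browser compatibility and timeout checks would be layered on top with end-to-end tooling.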
Figure 3: Description of V-Model stages
V. Conclusion
The web application development process is complex and faces many challenging requirements. It
focuses on planning, web architecture, system design, testing, evolution, and the continuous/frequent update
and maintenance of the system as requirements change. The problem becomes more complex when the system
must be maintained by adding new functionality, adapting it to a new platform, or improving its performance.
Web application re-engineering therefore exists, with the objective of expediting the maintenance process. Web
re-engineering is used to develop an associated and communal web application that is related to the existing
legacy components and also partially replaces them. In this paper we presented web re-engineering approaches,
defining the different re-engineering processes that facilitate legacy evolution in the web environment. These
processes, such as re-engineering of web pages, transaction re-engineering, application migration re-engineering,
and graphic design re-engineering, give direction for reconstructing, refactoring, and re-engineering a legacy web
system. We also proposed a V-model for web re-engineering. The V-model provides an effective and easy way of
reconstructing and re-testing a web application, and it offers flexibility and reusability while taking all user and
business standards into consideration. The proposed re-engineering V-model defines the testing protocol for web
application development. Introducing the V-model into the web re-engineering process increases the
maintainability and effectiveness of the website, because the V-model leads to better validation and verification
and proceeds in accordance with the web development life cycle. The structure of the V-model lets the testing
process run from unit testing through to acceptance testing. The re-engineering V-model thus saves time when
reconstructing or refactoring the web application, thanks to its strong testing and validation.
References:
[1] P. Fraternali, Tools and Approaches for Developing Data-Intensive Web Applications: a Survey, ACM Computing Surveys, 1999.
[2] S. Selmi, N. Kraiem, and H. Ben Ghezala, Toward a comprehension view of web engineering, 2005.
[3] E. Chikofsky and J.H.Cross, Reverse Engineering and Design Recovery: A Taxonomy, IEEE Software Engineering journal, (Jan.
1990), pp 13-17.
[4] IEEE Std 1219-1998, In IEEE Standards Software Engineering, 1999 Edition, Volume Two, Process Standards, IEEE Press.
[5] N. Jovanovic, C. Kruegel, and E. Kirda, A static analysis tool for detecting Web application vulnerabilities, IEEE Symposium on
Security and Privacy, 2006.
[6] X. Zhang and Z. Wang, 2nd International Conference on e-Business and Information System Security (EBISS), 22-23 May 2010.
[7] J. Conallen, Building Web Applications with UML, Addison- Wesley Publishing Company, Reading, MA, 1999.
[8] J. Conallen, Modelling web application architectures with uml, Communications of the Association for Computing Machinery,
42(10), October 1999.
[9] J. Conallen, Modelling web application with uml, White paper, Conallen Inc.
http://www.conallen.com/whitepapers/webapps/Modellingwebapplication.htm, March 1999.
[10] G.A. Di Lucca, A.R. Fasolino, U. De Carlini, F. Pace, P. Tramontana, Comprehending Web applications by a clustering based
approach, Proceedings 10th Workshop on Program Comprehension, IEEE Computer Society Press: Los Alamitos CA, 2002; 261-270.
[11] N. Anquetil, T. Lethbridge, Experiments with clustering as a software remodularisation method,Proceedings 6th Working
Conference on Reverse Engineering. IEEE Computer Society Press: Los Alamitos CA, 1999; 235–255.
[12] T. Biggerstaff, B. Mitbander, D. Webster, Program understanding and the concept assignment problem. Communications of the
ACM 1993; 37(5):72–83.
[13] D. Distante, Reengineering Legacy Applications and Web Transactions: An extended version of the UWA Transaction Design
Model, Ph.D. Dissertation, University of Lecce, Italy. June 2004.
[14] UWA (Ubiquitous Web Applications) Project, Deliverable D3: Requirements Investigation for Bank121 pilot application,
http://www.uwaproject.org, 2001.
[15] UWA (Ubiquitous Web Applications) Project, Deliverable D6: Requirements Elicitation: Model, Notation and Tool Architecture,
2001. www.uwaproject.org.
[16] UWA (Ubiquitous Web Applications) Project, Deliverable D7: Hypermedia and Operation design: model and tool architecture,
www.uwaproject.org, 2001.
[17] UWA (Ubiquitous Web Applications) Project, Deliverable D8: Transaction design, www.uwaproject.org, 2001.
[18] UWA (Ubiquitous Web Applications) Project, Deliverable D9: Customization Design Model, Notation and Tool Architecture,
www.uwaproject.org, 2001.
[19] G. Booch, J. Rumbaugh, I. Jacobson, The Unified Modelling Language User Guide, Rational Software Corporation, Addison-
Wesley.
[20] M. Moore, Representation Issues for Reengineering Interactive Systems, ACM Computing Surveys Special issue: position
statements on strategic directions in computing research, Vol. 28, No. 4, Dec 1996, article # 199, ACM Press, New York, NY, USA.
[21] M. Moore and S. Rugaber, Using Knowledge Representation to Understand Interactive Systems, in Proc. of the Fifth
International Workshop on Program Comprehension IWPC'97 (Dearborn, 28-30 May 1997), IEEE Computer Society Press, Los
Alamitos, 1997.
[22] G. Mori, F. Paternò, C. Santoro, Tool support for designing nomadic applications, Proc. of the 2003 international conference
on Intelligent user interfaces, Jan 2003, (Miami, USA), ACM Press, New York, USA, pp141-148
[23] L. Paganelli, F. Paterno, Automatic reconstruction of the underlying interaction design of web applications, in Proc.Of the
14th international conference on Software engineering and knowledge engineering, (July 2002, Ischia, Italy), ACM Press, New
York, USA, pp 439 – 445.
[24] G. Calvary, J. Coutaz, D. Thevenin, Q. Limbourg, N. Souchon, L. Bouillon, J. Vanderdonckt, Plasticity of User Interfaces: A
Revised Reference Framework, in Proc. of 1st International Workshop on Task Models and Diagrams for user interface
design Tamodia'2002 (Bucharest, 18-19 Jul 2002), INFOREC Publishing House, Bucharest, Romania, 2002.