This document surveys the factors that make legacy programs complex: old code that is hard to understand, high maintenance and replacement costs, large size, poor design, difficulty integrating with new technologies, missing documentation, inflexibility, long processing times, unavailability of the original developers, reliability problems, and bugs. It examines each factor in detail and argues that legacy complexity arises from the interaction of these factors, notably large size, intricate designs with many interconnected parts, and the difficulty of integrating old code and platforms with new technologies, rather than from any one factor alone.
THE REMOVAL OF NUMERICAL DRIFT FROM SCIENTIFIC MODELS – IJSEA
Computer programs often behave differently under different compilers or in different computing
environments. Relative debugging is a collection of techniques by which these differences are analysed.
Differences may arise because of different interpretations of errors in the code, because of bugs in the
compilers or because of numerical drift, and all of these were observed in the present study. Numerical
drift arises when small and acceptable differences in values computed by different systems are
integrated, so that the results drift apart. This is well understood and need not degrade the validity of the
program results. Coding errors and compiler bugs may degrade the results and should be removed. This
paper describes a technique for the comparison of two program runs which removes numerical drift and
therefore exposes coding and compiler errors. The procedure is highly automated and requires very little
intervention by the user. The technique is applied to the Weather Research and Forecasting model, the
most widely used weather and climate modelling code.
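To make the idea concrete, here is a minimal, hypothetical sketch of a drift-tolerant comparison of two runs of the same model. The per-timestep field arrays and the geometric tolerance envelope standing in for benign drift are illustrative assumptions; the WRF study's actual comparison machinery is more elaborate.

```python
import numpy as np

def drift_tolerant_compare(run_a, run_b, rel_tol=1e-6, growth=1.05):
    """Compare two runs timestep by timestep.

    run_a, run_b: arrays of shape (timesteps, npoints) holding the same
    diagnostic field from two builds of the model.  A tolerance envelope
    that widens geometrically with the step count stands in for benign
    numerical drift; differences that escape the envelope are flagged as
    suspected coding or compiler errors.
    """
    flagged = []
    tol = rel_tol
    for step, (a, b) in enumerate(zip(run_a, run_b)):
        scale = np.maximum(np.abs(a), np.abs(b)) + 1e-30  # avoid division by zero
        rel_diff = np.abs(a - b) / scale
        if rel_diff.max() > tol:
            flagged.append((step, rel_diff.max(), tol))
        tol *= growth  # the drift envelope widens as differences integrate
    return flagged

# Two synthetic runs that agree to round-off level until step 60,
# where an injected "bug" makes them diverge faster than drift allows.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 50))
a = base.copy()
b = base * (1 + 1e-9)          # benign round-off-level difference
b[60:] += 0.1                  # injected error
for step, diff, tol in drift_tolerant_compare(a, b):
    print(f"step {step}: max rel diff {diff:.2e} exceeds envelope {tol:.2e}")
```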
This project concerns an “E-Filing System” whose purpose is the faster movement of files and documents through the various layers of government and non-government offices. The system stores and tracks electronic documents and files. It is effective because it provides a centralised source of information, improved security, cost-effectiveness, improved workflow, greater customer satisfaction, easy retrieval and flexible search. The aim of the electronic file system is to implement paperless communication throughout the organisation, which helps the system run smoothly and ensures the quick disposal of proposals. The goal is not to be a slave to paper – searching for it, filing it, approving it, storing it and losing it at inconvenient times – but to handle documents electronically and so lower their intrinsic administrative cost.
A methodology to evaluate object oriented software systems using change requi... – ijseajournal
It is well known that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change-requirement-traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues have been related to change impact algorithms and inheritance of functionality.
A SURVEY ON ACCURACY OF REQUIREMENT TRACEABILITY LINKS DURING SOFTWARE DEVELO... – ijiert bestjournal
A number of routing protocols have been proposed for data transmission in WSNs. Initially, single-path routing schemes with many variations were proposed, but single-path routing still had drawbacks: it could not provide reliability and high throughput, and security was not considered during routing. Recently, to remove the drawbacks of single-path routing, a new technique called multipath routing has been proposed. In this paper we discuss the different multipath routing protocols and their variants. Multipath routing was initially proposed to guarantee delivery of packets to the sink in case of link or node failure; other protocols target reliability, energy saving, security and high throughput, and some address load balancing and security during packet transmission.
A Methodology To Manage Victim Components Using CBO Measure – ijseajournal
Current software industry practice demands that software be developed within time and budget while remaining highly productive. The traditional approach of developing software from scratch requires a considerable amount of effort; to overcome this drawback, a reuse-driven software development approach is adopted. However, there is a dire need to realise effective software reuse. This paper presents several measures of reusability and a methodology for reconfiguring the victim components, where the CBO measure helps identify the component to be reconfigured. The proposed strategy is simulated using an HR-portal domain-specific component system.
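As a rough illustration of how CBO (Coupling Between Objects) can shortlist "victim" components, here is a small sketch. The dependency map and the threshold are invented for the example; the paper applies the metric to an HR-portal component system.

```python
# CBO of a class = number of distinct other classes it is coupled to,
# counting both directions (classes it uses and classes that use it).
from typing import Dict, Set

def cbo(deps: Dict[str, Set[str]]) -> Dict[str, int]:
    coupled: Dict[str, Set[str]] = {c: set(d) - {c} for c, d in deps.items()}
    for c, d in deps.items():            # add reverse (is-used-by) edges
        for other in d:
            if other != c:
                coupled.setdefault(other, set()).add(c)
    return {c: len(s) for c, s in coupled.items()}

# Hypothetical component dependency map.
components = {
    "PayrollService": {"Employee", "TaxTable", "AuditLog", "Leave"},
    "Employee":       {"Department"},
    "LeaveService":   {"Employee", "AuditLog"},
}
scores = cbo(components)
THRESHOLD = 3  # hypothetical cut-off above which reuse suffers
victims = [c for c, s in scores.items() if s > THRESHOLD]
print(scores, "victims:", victims)
```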
A LOG-BASED TRACE AND REPLAY TOOL INTEGRATING SOFTWARE AND INFRASTRUCTURE – ijseajournal
We propose a log-based analysis tool for evaluating web application systems. A distinguishing feature of the tool is that it integrates the software log with the infrastructure log, so software engineers alone can resolve system faults even when the faults involve both software and infrastructure problems. The tool consists of five steps: preparing the software, preparing the infrastructure, collecting logs, replaying the log data, and tracing the log data. The tool was applied to a simple web application system in a small-scale local area network, and we confirmed its usefulness when a software engineer diagnoses failures such as “404” and “no response” errors. In addition, the tool was partially applied to a real large-scale computer system with many web applications and a large network environment. Using the replay and trace facilities, we found the causes of a real authentication error, which combined an infrastructure problem with a software problem. We confirmed that, even when a failure is caused by both a software problem and an infrastructure problem, software engineers can distinguish between the two using the tool.
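A toy sketch of the tool's core idea as we read it: interleave the application log with the infrastructure log on a common timeline so one engineer can trace a failure across both layers. The field names and formats below are assumptions, not the paper's actual schema.

```python
from datetime import datetime
import heapq

software_log = [
    ("2024-05-01 10:00:01", "app", "GET /login -> 200"),
    ("2024-05-01 10:00:05", "app", "GET /report -> 404"),
]
infra_log = [
    ("2024-05-01 10:00:04", "web", "worker pool exhausted"),
    ("2024-05-01 10:00:05", "db",  "connection timeout"),
]

def parse(entry):
    ts, source, msg = entry
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S"), source, msg

# heapq.merge keeps both (already sorted) streams ordered by timestamp
# without loading everything into memory, which matters for large logs.
for ts, source, msg in heapq.merge(
        map(parse, software_log), map(parse, infra_log)):
    print(f"{ts}  [{source:>4}] {msg}")
```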
Contributors to Reduce Maintainability Cost at the Software Implementation Phase – Waqas Tariq
Software maintenance is important and difficult to measure, and maintenance cost is the highest of all the phases of software development. One of the most critical processes in software development is reducing maintainability cost through the quality of the source code at the design step; however, there is a lack of quality models and measures that can help assess the quality attributes of the software maintainability process. Software maintainability suffers from a number of challenges, such as poor source code understanding, low code quality, and weak adherence to programming standards during maintenance. This work describes model-based factors for assessing software maintenance and explains the steps followed to obtain and validate them. Such a method can be used to reduce software maintenance cost. The results will enhance source code quality, increase software understandability, reduce maintenance time and cost, and give confidence in software reusability.
Secret keys and the packets transportation for privacy data forwarding method... – eSAT Journals
Abstract: Cloud computing stores data on remote servers, a process that by itself says little about the confidentiality and robustness of the data. To improve security and confidentiality, the file uploaded by a data owner is split into multiple packets stored on multiple cloud servers. The packets are encrypted, and the different keys are in turn distributed across multiple key servers; a user id is appended for verification. When the data owner forwards the file, the keys are verified before data access is granted. We further propose sending the secret key by SMS to the shared or forwarded nodes for proper security. This technique integrates the concepts of encryption, encoding and forwarding. Keywords: cloud computing, encryption, storage system
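A hedged sketch of the storage side of such a scheme: split the owner's file into packets, encrypt each packet with its own key, and imagine scattering packets and keys across different servers. Fernet (from the third-party `cryptography` package) is our stand-in cipher; the paper does not name a specific algorithm, and the per-packet keying is our reading of the abstract.

```python
from cryptography.fernet import Fernet

def split_and_encrypt(data: bytes, packet_size: int = 1024):
    packets = [data[i:i + packet_size] for i in range(0, len(data), packet_size)]
    keys = [Fernet.generate_key() for _ in packets]            # one key per packet
    ciphertexts = [Fernet(k).encrypt(p) for k, p in zip(keys, packets)]
    return ciphertexts, keys   # ciphertexts -> cloud servers, keys -> key servers

def reassemble(ciphertexts, keys) -> bytes:
    return b"".join(Fernet(k).decrypt(c) for k, c in zip(keys, ciphertexts))

blob = b"confidential report " * 100
cts, keys = split_and_encrypt(blob)
assert reassemble(cts, keys) == blob
```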
Development of improved PID controller for single effect evaporator – eSAT Journals
Abstract
Kraft pulping is the pulping process most commonly used in paper mills, because the cooking chemicals are recovered and environmental pollution is minimised: the chemicals are recycled rather than discarded into the environment. They are recovered by burning the black liquor that leaves the digester in a reactor called a recovery boiler. Feeding the black liquor directly from the digester to the recovery boiler would cause a water/smelt explosion, which happens when the percentage of water in the liquor is high; the solids content of the black liquor leaving the digester is usually between 13 and 18%. To avoid such a disaster, the concentration of the black liquor has to be raised to the range of 60-70%, which is done in a single or multiple effect evaporator. In this paper, a control system for a single effect evaporator is designed using a Proportional Integral Derivative (PID) controller. The first design used a simple PID controller with feedback control; then, taking into consideration the load disturbance in the process, feed-forward control was introduced into the controller design. Subsequently, the concept of boiling point rise (BPR) was used to measure concentration in the controller design of the single effect evaporator.
Keywords: Boiling Point Rise, Concentration, Evaporators, Feed Forward Control, Paper Mill, PID Controller, Pulping Process, Single-Effect Evaporator
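A minimal discrete-time sketch of the controller structure described above: a PID feedback term plus a feed-forward term that compensates a measured load disturbance before it reaches the concentration output. The gains and the first-order evaporator model are illustrative guesses, not the paper's tuned design.

```python
def simulate(kp=2.0, ki=0.5, kd=0.1, kff=0.8, dt=1.0, steps=200):
    setpoint = 0.65            # target solids fraction (60-70% range)
    conc, integral, prev_err = 0.15, 0.0, 0.0
    for t in range(steps):
        disturbance = 0.05 if t > 100 else 0.0     # feed load change
        err = setpoint - conc
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integral + kd * deriv  # PID feedback
        u -= kff * disturbance                     # feed-forward correction
        # toy first-order plant: concentration relaxes toward the input
        conc += dt * (0.1 * (u - conc) + 0.1 * disturbance)
        if t % 50 == 0:
            print(f"t={t:3d}  concentration={conc:.3f}")
    return conc

simulate()
```

With `kff = 1` the feed-forward term would cancel the disturbance exactly in this toy model; the partial value shows the feedback loop absorbing the remainder.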
Abstract: Ocean currents are an enormous source of green energy, which can be extracted from marine currents by means of tidal turbines. This paper explains the different types of tidal current turbine, discusses tidal energy and the site selection criteria for tidal current turbines in general, and gives an overview of turbine design methods such as blade element momentum theory and computational fluid dynamics. Keywords: Tidal energy, Tidal current turbines, Site selection, BEMT, CFD
Stone texture classification and discrimination by edge direction movement – eSAT Journals
Abstract
Texture discrimination is a rich field within pattern recognition and pattern analysis, and texture classification is one of its major sub-fields. This paper derives an approach for texture group classification based on direction movement: the edge movements are identified in each 3×3 window of the texture image, and the texture images are categorised based on these edge direction movements. Two texture groups are used in this paper: texture group 1 consists of the Bark, Sand, Raffia and Pigskin images, while the Straw, Bsand, Wgrain and Grass images are treated as texture group 2. Horizontal and vertical direction movements, as well as right and left diagonal edge direction movements, are identified.
Key Words: Edge direction movements, texture classification, pattern recognition, texture group
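An illustrative sketch in the spirit of the approach above: describe a texture by a histogram of edge directions quantised to the four named movements. We approximate the movements with gradient orientations; the paper's exact 3×3 window rule is not reproduced here.

```python
import numpy as np

def edge_direction_histogram(img):
    """Histogram of gradient orientations quantised to four movements:
    horizontal, right diagonal, vertical, left diagonal (0/45/90/135 deg)."""
    gy, gx = np.gradient(img.astype(float))
    angle = np.degrees(np.arctan2(gy, gx)) % 180       # orientation in [0, 180)
    bins = np.array([0.0, 45.0, 90.0, 135.0])
    diff = np.abs(angle[..., None] - bins)
    diff = np.minimum(diff, 180.0 - diff)              # circular distance
    nearest = np.argmin(diff, axis=-1)
    hist = np.bincount(nearest.ravel(), minlength=4).astype(float)
    return hist / hist.sum()

# A texture whose intensity varies down the rows: the gradient points
# along axis 0, so the 90-degree bin dominates.
stripes = np.tile(np.arange(32) % 4, (32, 1)).T
print(edge_direction_histogram(stripes))
# Texture groups could then be separated by comparing such histograms,
# e.g. with a nearest-centroid rule per group.
```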
Numerical investigation of the influence of winglet angles on vortex shedding – eSAT Journals
Abstract: Wingtip devices are usually intended to improve the efficiency of fixed-wing aircraft. There are several types of wingtip device, and although they function in different ways, the intended effect is always to reduce the aircraft's drag by partial recovery of the tip vortex energy. Wingtip devices can also improve aircraft handling characteristics and enhance safety for following aircraft, and they increase the effective aspect ratio of a wing without materially increasing the wingspan. The purpose of this project is to analyse different types of winglet using AutoCAD, GAMBIT and FLUENT, to find out what happens when winglets are attached to wingtips, and to study the aerodynamic properties of spiroid winglets, which are still under research. One wing model without a winglet and three wing models with winglets were drawn in AutoCAD and meshed in GAMBIT, using geometry data gathered from research papers. The models were then read into FLUENT, where flow boundary conditions were applied.
The performance of the wing without a winglet, the wings with spiroid winglets and the wings with other kinds of winglet was analysed at several angles of attack, and the drag coefficients were compared for take-off conditions. Wingtip vortices from the models were examined, studied and compared. The best winglet model, which most reduces the drag coefficient and wingtip vortices relative to the wing without a winglet, is identified at the end of the project.
Review of various adaptive modulation and coding techniques in wireless networks – eSAT Journals
Abstract: Adaptive modulation and coding (AMC) adapts transmission parameters to the channel state and is used in various modern wireless communication systems to maximise spectrum efficiency while minimising the error rate. One of the driving strengths of AMC is Signal-to-Noise Ratio (SNR) estimation with a feedback channel for adaptation. Degrading effects caused by sudden channel variation over time may require the transmission link to react appropriately to minimise the Bit Error Rate (BER). There are various adaptive methods, such as variable rate, variable error probability, variable coding, and hybrid techniques. This article focuses on variable power techniques, describes several of them, and compares two in particular: channel inversion and water-filling. Keywords: Adaptive Modulation and Coding (AMC), Bit Error Rate (BER), channel inversion, water-filling
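A small sketch contrasting the two power-adaptation schemes the review compares: channel inversion spends power equalising every subchannel, while water-filling pours power into the best subchannels first. The gains and power budget are made-up numbers.

```python
import numpy as np

def channel_inversion(gains, total_power):
    w = 1.0 / gains
    return total_power * w / w.sum()          # equal received SNR everywhere

def water_filling(gains, total_power):
    # bisect on the water level mu so that sum(max(mu - 1/g, 0)) = budget
    lo, hi = 0.0, total_power + (1.0 / gains).max()
    for _ in range(60):
        mu = (lo + hi) / 2
        used = np.maximum(mu - 1.0 / gains, 0.0).sum()
        lo, hi = (mu, hi) if used < total_power else (lo, mu)
    return np.maximum(mu - 1.0 / gains, 0.0)

gains = np.array([2.0, 1.0, 0.25, 0.05])      # subchannel SNR gains
for name, p in [("inversion", channel_inversion(gains, 4.0)),
                ("water-filling", water_filling(gains, 4.0))]:
    rates = np.log2(1 + gains * p)            # Shannon rate per subchannel
    print(f"{name:>13}: power={np.round(p, 2)} sum rate={rates.sum():.2f}")
```

Running this shows the usual trade-off: inversion wastes most of the budget on the worst subchannel, while water-filling gives a much higher sum rate by skipping it.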
Optimization of machining parameters on tool tip temperature by using vegetab... – eSAT Journals
Abstract
In this research, experiments were performed using vegetable oil to determine the effect of different machining parameters on tool tip temperature. The quality of the workpiece material is a main contributing factor nowadays and may be influenced by the heat generated at the tool tip. In this experimental work, different machining parameters (depth of cut, speed and feed rate) and the workpiece were studied for their effect on tool tip temperature during the machining of EN8 material. To study the influence of each parameter, a total of 9 experiments were performed to find the minimum tool tip temperature, varying one parameter while keeping the other two constant. The main objective of this study is to investigate the effect of the cutting parameters and to find the optimal conditions of spindle speed, feed, depth of cut and workpiece material for achieving minimum temperature, thereby increasing workpiece quality by reducing temperature using vegetable oil. A tool tip temperature measurement instrument was used to find the optimum parametric conditions of spindle speed, feed and depth of cut on EN8 material. The analysis reveals that spindle speed has the major effect on tool tip temperature when Soya oil is used as the lubricant.
Key Words: Tool tip temperature, spindle speed, feed rate, depth of cut.
Effect of potato powder supplementation and spices addition on physical and s... – eSAT Journals
Abstract
The objective of the study was to determine the acceptability of cookies supplemented with potato powder and spices, based on their sensory and physical properties. Sensory evaluation was done on a hedonic scale. Potato powder supplementation ranged from 10-50% (wt. of flour/wt. of potato powder). Cardamom and clove were used as flavouring agents, ranging from 0.5% to 1% (wt. of flour/wt. of spice). Increasing potato powder supplementation led to a substantial decrease in physical properties and sensory scores (P<0.05).
Keywords: cookies, potato powder supplementation, spices addition, sensory & physical analysis, ANOVA analysis
Determination of safe grade elevation by using HEC-RAS: case study, Mutha River – eSAT Journals
Abstract
A flood is a naturally occurring disastrous event causing damage, loss and destruction to property, life and the environment, and hundreds of millions are spent every year on flood control and flood forecasting. For the construction of any structure near or within a water body, and for the determination of safe construction levels that protect the structure from flood water, a safe grade elevation is required.
In order to estimate, mitigate and handle floods, this paper presents a methodology for assessing the flood line to produce a safe grade elevation using the Hydrologic Engineering Center's River Analysis System (HEC-RAS), software predominantly used in hydraulic analysis for floodplain delineation. The general parameters affecting flood are gauged runoff, discharge, rainfall, and land use as spatial data. The paper explains the use of HEC-RAS to produce the safe grade elevation for the Mutha River from its origin downstream of Khadakwasla dam to Mhatre Bridge, and describes the methodology for constructing a table model and validating it. The methodology can be applied to other regions, provided the predominant factors affecting flood in the region are considered, to decide the most economical safe grade elevation for buildings or structures near or on a river, and it would help in setting planning priorities and prerequisites for managing floods efficiently.
Keywords: Safe grade elevation, Parameters, Mutha River, Flood, spatial data, Zoning.
Semantic approach utilizing data mining and case based reasoning for it suppo... – eSAT Journals
Abstract: Information Technology (IT) plays a very important role in all organizations, and IT executives are constantly faced with problems that are difficult to tackle; a failure in an IT service can interrupt the functioning of an organization. Case-Based Reasoning (CBR) is a problem-solving methodology in which experience, in the form of past cases, is used to solve new problems, thereby assisting the automation of problem solving and experience management. Furthermore, the performance, quality and efficiency of CBR systems can be enhanced through data mining. To support the IT team in faster and more efficient problem resolution, a case-based reasoning approach integrated with data mining techniques can be utilized. This paper explains a study of various CBR systems and data mining techniques for problem and experience management, and proposes a system for IT experience and problem management with semantic retrieval, in order to increase the efficiency and quality of the IT support service. Keywords: Case-based reasoning, Data Mining, Experience management, IT problem management, IT support.
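A toy sketch of the case-retrieval step of CBR for IT support: represent past incident tickets as term vectors and fetch the most similar solved case for a new problem. The ticket texts and the TF-IDF/cosine pipeline are our illustrative assumptions, not the paper's semantic-retrieval design.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Case base of (problem description, recorded resolution) pairs.
case_base = [
    ("mail server disk full, queue stalled", "extend /var partition, restart postfix"),
    ("VPN drops every 10 minutes",           "raise DPD timeout on the gateway"),
    ("web app returns 502 after deploy",     "roll back release, fix gunicorn workers"),
]
problems = [p for p, _ in case_base]

vec = TfidfVectorizer().fit(problems)
new_ticket = "users report 502 errors since last deployment"
sims = cosine_similarity(vec.transform([new_ticket]), vec.transform(problems))[0]
best = sims.argmax()
print(f"closest case: {problems[best]!r} -> suggested fix: {case_base[best][1]!r}")
```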
Forest type mapping of bidar forest division, karnataka using geoinformatics ... – eSAT Journals
Abstract
The study demonstrates the potential of satellite remote sensing for generating baseline information on forest types, including tree plantation details, in the Bidar forest division, Karnataka, covering an area of 5814.60 sq. km. Analysis of the satellite data in the study area reveals that about 84% of the total area is covered by crop land, 1.778% by dry deciduous forest, and 1.38% by mixed plantation, which is very threatening to the environmental stability of the forest; future plantation sites have been mapped. With the latest geoinformatics technology, the proper and exact condition of the trees can be observed, and the necessary precautions can be taken for future plantation work in an appropriate manner.
Keywords: RS, GIS, GPS, Forest Type, Tree Plantation
Investigations on the performance of diesel in an air gap ceramic coated dies... – eSAT Journals
Abstract: The world's rapidly diminishing petroleum supplies, their rising costs and the growing danger of environmental pollution have led to an intensive search for alternative fuels and for ways of increasing the efficiency of available diesel engines. It is a known fact that about 30% of the energy supplied to a diesel engine is lost through the coolant and another 30% is wasted through friction, exhaust and other losses, leaving only 40% of the energy for useful purposes. If this heat rejection is reduced, the thermal efficiency can be improved. By insulating the combustion chamber walls with ceramics, heat transfer can be restricted and used instead to heat the incoming fresh charge, with a similar effect on the exhaust gases; this increases combustion efficiency and reduces emissions. Hence, in the present work, a ceramic coated engine is developed by incorporating an air gap between the piston skirt and crown and between the cylinder liner and jacket, with ceramic coating on the cylinder head and valves. A serious attempt is made in this research to investigate the performance and emission characteristics of the engine with diesel as fuel. The performance of the engine further depends on the heat in the combustion chamber, which in turn depends on the piston material and the turbulence generated in the engine; therefore, a brass piston insert, and a brass insert with six grooves, were tried in place of the aluminium piston. Among all the pistons tested, the brass insert with six grooves proved the best in terms of performance and emissions. With higher temperatures in the chamber, however, a drop in volumetric efficiency and lubricating oil deterioration are the main problems; these can be overcome by turbo-compounding and the development of new lubricants. Keywords: Ceramic engines, Air gap, New Lubricants, Piston inserts, Low heat rejection engines
The solution of problem of parameterization of the proximity function in ace ... – eSAT Journals
Abstract
This work presents a new approach for defining the value of the proximity function computed in the second step of the Algorithms for Calculating Estimates (ACE) in pattern recognition. The value of the proximity function is defined as the share of corresponding features of two objects, with the main attention paid to the essential features of the polytypic objects in a given training set. One of the important problems of ACE is comparing the values of fuzzy attributes; the main idea of this approach is to consider the proximity of the corresponding quantitative and qualitative features together. The complexity of comparing qualitative features, which include features with fuzzy values, and an approach for overcoming this complexity are considered. The membership function of fuzzy set theory is used to determine the membership degrees of feature values described by linguistic values, in order to improve the quality of ACE, and the steps of the algorithm for transferring the results of comparing two fuzzy feature values, via the membership function, to the proximity function are given. A membership function with two parameters (b and c) is used. To define optimal values of these parameters, evolutionary optimization algorithms are used, among them the genetic algorithm: initial parameter values of the membership function are generated and passed to the proximity function, ACE is run, and the value of the quality functional is computed during training on the given training set. If the value of the quality functional is not sufficiently high, the values are regenerated using the special operators (selection, crossover, mutation) of the genetic algorithm. The algorithm for selecting optimal values of the membership function parameters using the genetic algorithm is given.
Key Words: ACE, proximity function, Genetic algorithm, membership function, parameters, operators.
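A compact sketch of the tuning loop described above: a two-parameter membership function maps a fuzzy feature difference into [0, 1], and a genetic algorithm searches for the (b, c) pair that maximises a stand-in quality functional. The bell shape, the quality function and all constants are assumptions for illustration, not the paper's definitions.

```python
import random

def membership(x, b, c):
    # generalised bell-shaped function with width b and centre c
    return 1.0 / (1.0 + abs((x - c) / b) ** 2)

def quality(b, c, samples):
    # stand-in for ACE's training-set quality functional: reward high
    # membership for same-class pairs and low membership otherwise
    return sum(m if same else 1 - m
               for x, same in samples
               for m in [membership(x, b, c)]) / len(samples)

samples = [(0.1, True), (0.2, True), (0.15, True), (0.9, False), (1.2, False)]
pop = [(random.uniform(0.05, 2), random.uniform(-1, 2)) for _ in range(20)]
for gen in range(40):
    pop.sort(key=lambda p: -quality(*p, samples))
    parents = pop[:6]                                        # selection
    children = []
    while len(children) < len(pop) - len(parents):
        (b1, c1), (b2, c2) = random.sample(parents, 2)
        b, c = (b1 + b2) / 2, (c1 + c2) / 2                  # crossover
        b += random.gauss(0, 0.05); c += random.gauss(0, 0.05)  # mutation
        children.append((max(b, 1e-3), c))
    pop = parents + children
best_b, best_c = pop[0]
print(f"tuned parameters: b={best_b:.3f}, c={best_c:.3f}")
```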
Necessity of integrated transport system to namma metro at byapanahalli – a s... – eSAT Journals
Abstract
Mass Rapid Transit is one of the major transportation systems proposed in metropolitan cities like Bangalore to reduce various traffic problems and cut travel time. The efficiency of such a system can be increased by attracting more trip makers through a suitable integrated transport system. A feeder system is one such technique proposed for Namma Metro in Bangalore, with feeder buses (minibuses) operating throughout the radial areas around the metro stations. The present study examines the necessity of these buses, on par with the public transport buses currently operating in these areas, with respect to commuter willingness, frequency and travel time.
Designing multi agent based linked state machine – eSAT Journals
Abstract: Every industrial process control algorithm can be represented as an ASM chart, and every ASM chart can be implemented using the state-agent-based approach. This paper describes a simplified approach to implementing complex process control systems using the agent-based paradigm. It shows that when a system is complex, it can be broken up into a series of ASM charts, and each ASM chart can be implemented with an agent-based structure comprising one process agent and a number of state agents. Together, these agent-based systems perform the complex industrial process control, and the structures can be interconnected to form a linked state machine, realizing a multi-agent based linked state structure. This is a 'divide and conquer' approach that is much simpler than tackling the control system implementation as one very complex effort. Index Terms: Process control, ASM chart, State Transition Table (STT), Multi-agent, Link state machines, Beverage blending
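A minimal sketch of the decomposition as we understand it: one process agent drives a set of state agents, each owning a single ASM state, and a chart can hand control to the next chart at its exit. The class names and the blending example are illustrative.

```python
class StateAgent:
    def __init__(self, name, action, transition):
        self.name = name
        self.action = action          # work done while in this state
        self.transition = transition  # inputs -> name of the next state

    def step(self, inputs):
        self.action(inputs)
        return self.transition(inputs)

class ProcessAgent:
    """Runs one ASM chart; a transition returning None ends the chart
    (the natural link point to the next machine in a linked structure)."""
    def __init__(self, states, start):
        self.states = {s.name: s for s in states}
        self.current = start

    def run(self, inputs):
        while self.current is not None:
            self.current = self.states[self.current].step(inputs)

fill = StateAgent("FILL",
                  lambda io: io.__setitem__("level", io["level"] + 10),
                  lambda io: "MIX" if io["level"] >= 30 else "FILL")
mix = StateAgent("MIX",
                 lambda io: print("blending at level", io["level"]),
                 lambda io: None)
ProcessAgent([fill, mix], "FILL").run({"level": 0})
```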
Rna secondary structure prediction, a cuckoo search approach – eSAT Journals
Abstract
RNA secondary structure is determined experimentally with techniques such as crystallography and NMR spectroscopy, while computation-based techniques estimate the possible base pairs that could form in the RNA. Soft computing techniques generally select random pairs or pair sequences and then check them against some parameters; the candidate structure closest to the required fitness is selected as the final structure. The cuckoo search approach is good at finding feasible locations in the search space, and it provides workable results for detecting base pairs in RNA and predicting the RNA secondary structure.
Keywords: DNA, RNA, base pairs, pseudo-knots, structure, soft computing, techniques
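A bare-bones sketch of the cuckoo-search metaheuristic itself (not the RNA energy model, which the paper would supply): nests hold candidate solutions, Lévy-flight steps propose new ones, and a fraction of the worst nests is abandoned each generation. The sphere function stands in for the RNA structure fitness, and all constants are conventional defaults.

```python
import random, math

def levy_step(scale=0.01, beta=1.5):
    # Mantegna's algorithm for a Levy-stable step length
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u, v = random.gauss(0, sigma), random.gauss(0, 1)
    return scale * u / abs(v) ** (1 / beta)

def fitness(x):                      # stand-in for structure free energy
    return sum(xi * xi for xi in x)

dim, n_nests, pa = 5, 15, 0.25
nests = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(n_nests)]
for _ in range(200):
    for i in range(n_nests):
        trial = [xi + levy_step() for xi in nests[i]]   # Levy-flight move
        j = random.randrange(n_nests)                   # random nest to beat
        if fitness(trial) < fitness(nests[j]):
            nests[j] = trial
    nests.sort(key=fitness)
    for k in range(int(pa * n_nests)):                  # abandon worst nests
        nests[-1 - k] = [random.uniform(-2, 2) for _ in range(dim)]
print("best energy:", round(fitness(nests[0]), 4))
```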
Steady state stability analysis and enhancement of three machine nine bus pow... – eSAT Journals
Abstract
Power system stability study is an important component of economic, reliable and secure system planning and operation. Such studies matter during the planning and conceptual design stages of a project, as well as periodically during the operating life of the plant. This paper presents a steady-state stability analysis of the IEEE 9-bus test system and examines the influence of a TCPS FACTS-device-based controller on the test system. It is assumed that the system under study has been perturbed from the steady-state equilibrium that prevailed prior to the disturbance; if the system is stable, we expect that after a temporary or permanent disturbance it will reach its initial or a new operating state after a transient period. Stability is assessed using Lyapunov's first method, and the effectiveness of the damping controller in enhancing steady-state stability is investigated subject to the available constraints. MATLAB is employed for the analysis. The conclusions are drawn from theoretical and mathematical analysis, so as to provide insight into, and a better understanding of, the steady-state stability of the multi-machine power system considered.
Key Words: Lyapunov's first method, Steady-state stability, Phase portrait, FACTS device, supplementary modulation controller, eigenvalue, synchronizing power coefficient, IEEE 9-Bus Test System, Load Flow Study, Differential algebraic equation.
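A tiny sketch of Lyapunov's first (indirect) method as invoked above: linearise the system about the operating point and inspect the eigenvalues of the state matrix. The 3×3 matrix here is invented; the paper derives its state matrix from the 9-bus system's differential-algebraic equations.

```python
import numpy as np

A = np.array([[-0.5,  10.0,  0.0],
              [-0.1,  -0.2,  0.3],
              [ 0.0,  -0.4, -1.0]])   # hypothetical linearised state matrix

eigs = np.linalg.eigvals(A)
print("eigenvalues:", np.round(eigs, 3))
if np.all(eigs.real < 0):
    print("all eigenvalues in the left half-plane: small-signal stable")
else:
    print("an eigenvalue with non-negative real part: unstable")
```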
Abstract: Convolutional encoding is a Forward Error Correction (FEC) technique used in continuous one-way and real-time communication links. Wireless devices such as mobile phones and broadband modems rely heavily on forward error correction for proper functioning, sending and receiving information with minimal or no error while utilising the available bandwidth. Major requirements for modern digital wireless communication systems include high throughput, low power consumption and small physical size. This paper presents the simulation of a convolutional encoder using MATLAB; performance is analysed by changing the rate of the convolutional encoder and the error probability of a binary symmetric channel. Keywords: Constraint Length, Convolutional Encoder, Data rates, Generator polynomials, Poly2trellis structure.
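For orientation, here is a short sketch of a rate-1/2, constraint-length-3 convolutional encoder with the classic generator polynomials (7, 5) in octal, written in Python rather than the paper's MATLAB. Changing the generators or the number of output streams varies the code rate, which is the kind of experiment the abstract describes.

```python
def conv_encode(bits, generators=((1, 1, 1), (1, 0, 1))):
    state = [0, 0]                       # shift register, K - 1 = 2 cells
    out = []
    for b in bits:
        window = [b] + state             # current bit plus register contents
        for g in generators:             # one parity bit per generator
            out.append(sum(gi * wi for gi, wi in zip(g, window)) % 2)
        state = [b] + state[:-1]         # shift the register
    return out

message = [1, 0, 1, 1, 0]
print(conv_encode(message))              # 2 output bits per input bit
```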
Developing reusable software components for distributed embedded systems – eSAT Publishing House
A Comparative Study of Forward and Reverse Engineering – ijsrd.com
With software development booming compared with 20 years ago, software developed in the past may or may not have well-supported documentation from its evolution, which can widen the specification gap between the documents and the legacy code when further evolutions and updates are made. Understanding the legacy code and the underlying decisions made during development is the prime motive, and this is well supported by reverse engineering. In this paper, we compare transformational forward engineering, where a stepwise abstraction is obtained, with the transformational reverse methodology. Because the forward transformation process produces overlapping decisions, performance is affected; hence the transformational method of reverse engineering, which is a backwards forward-engineering process, is suitable. Moreover, the design recognition obtained yields domain knowledge which can be used in future by forward engineers.
Improved Strategy for Distributed Processing and Network Application Developm... – Editor IJCATR
The complexity of software development abstraction and the advent of multi-core computers have shifted the burden of distributed software performance from network and chip designers to software architects and developers. We need software development strategies that integrate parallelization of code, concurrency factors, multithreading, distributed resource allocation and distributed processing. In this paper, a new software development strategy that integrates these factors is further experimented on for parallelism. The strategy is multidimensional and aligns distributed conceptualization along a path: it mandates that application developers reason about usability, simplicity, resource distribution, parallelization of code where necessary, processing time and cost factors, as well as security and concurrency issues, in a balanced path from the originating point of the network application to its retirement.
Reducing Technical Debt: Using Persuasive Technology for Encouraging Software... – Hayim Makabee
Technical debt is a metaphor for the gap between the current state of a software system and its hypothesized 'ideal' state. One of the significant and under-investigated elements of technical debt is documentation debt, which may occur when code is created without supporting internal documentation, such as code comments. Studies have shown that outdated or lacking documentation is a considerable contributor to the increased costs of software systems maintenance. The importance of comments is often overlooked by software developers, resulting in a notably slower growth rate of comments compared to the growth rate of code in software projects. This research aims to explore and better understand developers' reluctance to document code, and accordingly to propose efficient ways of using persuasive technology to encourage programmers to document their code. The results may assist software practitioners and project managers to control and reduce documentation debt.
Implementation of reducing features to improve code change based bug predicti... – eSAT Journals
Abstract: Today we see plenty of bugs in software because of variations in software and hardware technologies. Bugs are software faults that pose a severe challenge to system reliability and dependability, and bug prediction is a convenient approach to identifying them. Machine learning classifiers have recently been developed to flag the presence of a bug in a source code file. Because of the huge number of machine-learned features, current classifier-based bug prediction has two major problems: (i) inadequate precision for practical usage and (ii) considerable prediction time. In this paper we use two techniques: first, a CosTriage algorithm that attempts to enhance the accuracy and lower the cost of bug prediction, and second, feature selection methods that eliminate less significant features. Reducing features improves the quality of the knowledge extracted and also boosts the speed of computation. Keywords: Efficiency, Bug Prediction, Classification, Feature Selection, Accuracy
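A schematic sketch of the second technique: prune low-value features before training the bug classifier so that both accuracy and speed can improve. The features and labels are synthetic; the paper works on code-change features mined from real projects.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.naive_bayes import MultinomialNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_changes, n_features = 400, 300
X = rng.integers(0, 5, size=(n_changes, n_features))     # change features
y = (X[:, 0] + X[:, 1] > 5).astype(int)                  # 2 informative columns

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
selector = SelectKBest(chi2, k=20).fit(X_tr, y_tr)       # keep the top 20 features
clf = MultinomialNB().fit(selector.transform(X_tr), y_tr)
print("accuracy with 20/300 features:",
      round(clf.score(selector.transform(X_te), y_te), 3))
```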
Unsustainable: Regaining Control of Uncontrollable Apps – CAST
The ever-growing cost of maintaining systems continues to crush IT organizations, robbing them of the ability to fund innovation while increasing risks across the organization. There are, however, tactics to reduce application total ownership cost, reduce complexity and improve sustainability across your portfolio.
Mechanical properties of hybrid fiber reinforced concrete for pavements – eSAT Journals
Abstract
The effect of the addition of mono fibers and hybrid fibers on the mechanical properties of a concrete mixture is studied in the present investigation. Steel fibers at 1% and polypropylene fibers at 0.036% were added individually to the concrete mixture as mono fibers, and then together to form a hybrid fiber reinforced concrete. Mechanical properties such as compressive, split tensile and flexural strength were determined. The results show that hybrid fibers improve the compressive strength only marginally compared with mono fibers, whereas hybridization improves split tensile strength and flexural strength noticeably.
Keywords: Hybridization, mono fibers, steel fiber, polypropylene fiber, improvement in mechanical properties.
Material management in construction – a case study – eSAT Journals
Abstract
The objective of the present study is to understand the problems occurring in a company because of the improper application of material management. In construction project operations there is often a project cost variance in terms of material, equipment, manpower, subcontractors, overheads and general conditions. Material is the main component of construction projects; therefore, if material management is not handled properly, it creates a project cost variance. Project cost can be controlled by taking corrective actions against the cost variance. A methodology to diagnose and evaluate the procurement process involved in material management and to launch continuous improvement was therefore developed and applied. A thorough study was carried out, including case studies, surveys and interviews with professionals involved in this area. As a result, a methodology for diagnosis and improvement was proposed and tested on selected projects. The results obtained show that the main problems of procurement are related to schedule delays and a lack of the specified quality for the project; preventing this situation often requires dedicating significant resources (money, personnel, time, etc.) to monitoring and controlling the process. A great potential for improvement was detected when state-of-the-art technologies such as electronic mail, electronic data interchange (EDI) and analysis were applied to the procurement process; these helped eliminate the root causes of many of the problems detected.
Managing drought: short term strategies in semi arid regions – a case study – eSAT Journals
Abstract
Drought management needs multidisciplinary action, and interdisciplinary efforts among experts in the various fields of drought-prone areas help to achieve a tangible and permanent solution to this recurring problem. The Gulbarga district has a total area of around 16,240 sq. km and accounts for 8.45 per cent of the area of Karnataka state. The district is situated at latitude 17º 19' 60" North and longitude 76º 49' 60" East, entirely on the Deccan plateau at a height of 300 to 750 m above MSL. With its sub-tropical, semi-arid climate, it is one of the drought-prone districts of Karnataka state, so drought management is very important for a district like Gulbarga. In this paper, various short-term strategies to mitigate the drought conditions in the district are discussed.
Keywords: Drought, South-West monsoon, Semi-Arid, Rainfall, Strategies etc.
Life cycle cost analysis of overlay for an urban road in Bangalore – eSAT Journals
Abstract
Pavements are subjected to severe stresses and weathering effects from the day they are constructed and opened to traffic, mainly due to fatigue behaviour and environmental effects; pavement rehabilitation is therefore one of the most important components of an entire road system. This paper presents the design of a concrete pavement with added mono fibers (polypropylene, steel) and hybrid fibres for a widened portion of an existing concrete pavement, together with various overlay alternatives for an existing bituminous pavement on an urban road in Bangalore. Life cycle cost analyses of these sections are then carried out by the Net Present Value (NPV) method to identify the most feasible option. The results show that although the initial cost of constructing a concrete overlay is high, over a period of time it proves better than the bituminous overlay when the whole life cycle cost is considered. The economic analysis also indicates that, of the three fibre options, hybrid reinforced concrete would be economical without compromising the performance of the pavement.
Keywords: Fatigue, Life cycle cost analysis, Net Present Value method, Overlay, Rehabilitation
Laboratory studies of dense bituminous mixes II with reclaimed asphalt materials – eSAT Journals
Abstract
Growing demand on our nation's roadways over the past couple of decades, decreasing budgetary funds, and the need to provide a safe, efficient and cost-effective roadway system have led to a dramatic increase in the need to rehabilitate existing pavements and to build sustainable road infrastructure in India. These pressing needs are today's burning issue and form the purpose of this study.
In the present study, samples of existing bituminous layer material were collected from the NH-48 (Devahalli to Hassan) site. The mixtures were designed by the Marshall method as per the Asphalt Institute (MS-II) at 20% and 30% Reclaimed Asphalt Pavement (RAP). The RAP material was blended with virgin aggregate such that all specimens met the Dense Bituminous Macadam-II (DBM-II) gradation as per the Ministry of Road Transport and Highways (MoRT&H), and a cost analysis was carried out to establish the economics. The laboratory results and analysis showed that the use of recycled materials introduced significant variability in Marshall stability, and the variability increased with RAP content. Savings can be realized from the utilization of recycled materials: following the methodology, the reduction in total cost is 19% and 30% compared with the virgin mixes.
Keywords: Reclaimed Asphalt Pavement, Marshall Stability, MS-II, Dense Bituminous Macadam-II
Laboratory investigation of expansive soil stabilized with natural inorganic ... – eSAT Journals
Abstract
Soil stabilization has proven to be one of the oldest techniques for improving soil properties, and the literature review conducted revealed that natural inorganic stabilizers are among the best options for soil stabilization. In this regard, an attempt has been made to evaluate the influence of the RBI-81 stabilizer on the properties of black cotton soil through laboratory investigations. Black cotton soil with varying percentages of RBI-81 (0, 0.5, 1, 1.5, 2 and 2.5 per cent) was studied for moisture-density relationships and strength behaviour. The effect of curing period was also evaluated, as the literature clearly emphasizes the strength gain over time of soils stabilized with RBI-81. The results obtained show that the unconfined compressive strength of specimens treated with RBI-81 increased by approximately 250% for a curing period of 28 days compared with the virgin soil, and the CBR value improved by approximately 400%. The studies indicated an increasing trend in soil strength with increasing percentage of RBI-81, suggesting its potential application in soil stabilization.
Influence of reinforcement on the behavior of hollow concrete block masonry p... – eSAT Journals
Abstract
Reinforced masonry was developed to exploit the strength potential of masonry and to overcome its lack of tensile strength. Experimental and analytical studies have been carried out to investigate the effect of reinforcement on the behavior of hollow concrete block masonry prisms under compression and to predict the ultimate compressive failure strength. In the numerical program, a three-dimensional non-linear finite element (FE) model based on the micro-modeling approach is developed for both unreinforced and reinforced masonry prisms using ANSYS (14.5). The proposed FE model uses multi-linear stress-strain relationships to model the non-linear behavior of the hollow concrete block, mortar and grout, and Willam-Warnke's five-parameter failure theory is adopted to model the failure of the masonry materials. The comparison of numerical and experimental results indicates that the FE models can successfully capture the highly nonlinear behavior of the physical specimens and accurately predict their strength and failure mechanisms.
Keywords: Structural masonry, Hollow concrete block prism, grout, Compression failure, Finite element method, Numerical modeling.
Influence of compaction energy on soil stabilized with chemical stabilizer – eSAT Journals
Abstract
Increasing traffic and heavier wheel loads cause rapid deterioration of pavements, so there is a need to improve the density and strength of the soil subgrade and other pavement layers. In this study an attempt is made to improve the properties of a locally available loamy soil using a twin approach: (i) increasing the compaction of the soil, and (ii) treating the soil with a chemical stabilizer. Laboratory studies are carried out on both untreated and treated soil samples compacted with different compaction efforts. The studies show that increased compaction effort increases the density of the soil; in soil treated with the chemical stabilizer, however, the rate of increase in density is not significant, while the treated soil exhibits improvement in both strength and performance properties.
Keywords: compaction, density, subgrade stabilization, resilient modulus
Geographical information system (gis) for water resources managementeSAT Journals
Abstract
Water resources projects are inherited with overlapping and at times conflicting objectives. These projects are often of varied sizes
ranging from major projects with command areas of millions of hectares to very small projects implemented at the local level. Thus,
in all these projects there is seldom proper coordination which is essential for ensuring collective sustainability.
Integrated watershed development and management is the accepted answer but in turn requires a comprehensive framework that can
enable planning process involving all the stakeholders at different levels and scales is compulsory. Such a unified hydrological
framework is essential to evaluate the cause and effect of all the proposed actions within the drainage basins.
The present paper describes a hydrological framework developed in the form of a Hydrologic Information System (HIS) which is
intended to meet the specific information needs of the various line departments of a typical State connected with water related aspects.
The HIS consist of a hydrologic information database coupled with tools for collating primary and secondary data and tools for
analyzing and visualizing the data and information. The HIS also incorporates hydrological model base for indirect assessment of
various entities of water balance in space and time. The framework would be maintained and updated to reflect fully the most
accurate ground truth data and the infrastructure requirements for planning and management.
Keywords: Hydrological Information System (HIS); WebGIS; Data Model; Web Mapping Services
Forest type mapping of bidar forest division, karnataka using geoinformatics ...eSAT Journals
Abstract
The study demonstrate the potentiality of satellite remote sensing technique for the generation of baseline information on forest types
including tree plantation details in Bidar forest division, Karnataka covering an area of 5814.60Sq.Kms. The Total Area of Bidar
forest division is 5814Sq.Kms analysis of the satellite data in the study area reveals that about 84% of the total area is Covered by
crop land, 1.778% of the area is covered by dry deciduous forest, 1.38 % of mixed plantation, which is very threatening to the
environmental stability of the forest, future plantation site has been mapped. With the use of latest Geo-informatics technology proper
and exact condition of the trees can be observed and necessary precautions can be taken for future plantation works in an appropriate
manner
Keywords:-RS, GIS, GPS, Forest Type, Tree Plantation
Factors influencing compressive strength of geopolymer concreteeSAT Journals
Abstract
To study effects of several factors on the properties of fly ash based geopolymer concrete on the compressive strength and also the
cost comparison with the normal concrete. The test variables were molarities of sodium hydroxide(NaOH) 8M,14M and 16M, ratio of
NaOH to sodium silicate (Na2SiO3) 1, 1.5, 2 and 2.5, alkaline liquid to fly ash ratio 0.35 and 0.40 and replacement of water in
Na2SiO3 solution by 10%, 20% and 30% were used in the present study. The test results indicated that the highest compressive
strength 54 MPa was observed for 16M of NaOH, ratio of NaOH to Na2SiO3 2.5 and alkaline liquid to fly ash ratio of 0.35. Lowest
compressive strength of 27 MPa was observed for 8M of NaOH, ratio of NaOH to Na2SiO3 is 1 and alkaline liquid to fly ash ratio of
0.40. Alkaline liquid to fly ash ratio of 0.35, water replacement of 10% and 30% for 8 and 16 molarity of NaOH and has resulted in
compressive strength of 36 MPa and 20 MPa respectively. Superplasticiser dosage of 2 % by weight of fly ash has given higher
strength in all cases.
Keywords: compressive strength, alkaline liquid, fly ash
Experimental investigation on circular hollow steel columns in filled with li...eSAT Journals
Abstract
Composite Circular hollow Steel tubes with and without GFRP infill for three different grades of Light weight concrete are tested for
ultimate load capacity and axial shortening , under Cyclic loading. Steel tubes are compared for different lengths, cross sections and
thickness. Specimens were tested separately after adopting Taguchi’s L9 (Latin Squares) Orthogonal array in order to save the initial
experimental cost on number of specimens and experimental duration. Analysis was carried out using ANN (Artificial Neural
Network) technique with the assistance of Mini Tab- a statistical soft tool. Comparison for predicted, experimental & ANN output is
obtained from linear regression plots. From this research study, it can be concluded that *Cross sectional area of steel tube has most
significant effect on ultimate load carrying capacity, *as length of steel tube increased- load carrying capacity decreased & *ANN
modeling predicted acceptable results. Thus ANN tool can be utilized for predicting ultimate load carrying capacity for composite
columns.
Keywords: Light weight concrete, GFRP, Artificial Neural Network, Linear Regression, Back propagation, orthogonal
Array, Latin Squares
Experimental behavior of circular hsscfrc filled steel tubular columns under ...eSAT Journals
Abstract
This paper presents an outlook on experimental behavior and a comparison with predicted formula on the behaviour of circular
concentrically loaded self-consolidating fibre reinforced concrete filled steel tube columns (HSSCFRC). Forty-five specimens were
tested. The main parameters varied in the tests are: (1) percentage of fiber (2) tube diameter or width to wall thickness ratio (D/t
from 15 to 25) (3) L/d ratio from 2.97 to 7.04 the results from these predictions were compared with the experimental data. The
experimental results) were also validated in this study.
Keywords: Self-compacting concrete; Concrete-filled steel tube; axial load behavior; Ultimate capacity.
Evaluation of punching shear in flat slabseSAT Journals
Abstract
Flat-slab construction has been widely used in construction today because of many advantages that it offers. The basic philosophy in
the design of flat slab is to consider only gravity forces; this method ignores the effect of punching shear due to unbalanced moments
at the slab column junction which is critical. An attempt has been made to generate generalized design sheets which accounts both
punching shear due to gravity loads and unbalanced moments for cases (a) interior column; (b) edge column (bending perpendicular
to shorter edge); (c) edge column (bending parallel to shorter edge); (d) corner column. These design sheets are prepared as per
codal provisions of IS 456-2000. These design sheets will be helpful in calculating the shear reinforcement to be provided at the
critical section which is ignored in many design offices. Apart from its usefulness in evaluating punching shear and the necessary
shear reinforcement, the design sheets developed will enable the designer to fix the depth of flat slab during the initial phase of the
design.
Keywords: Flat slabs, punching shear, unbalanced moment.
Evaluation of performance of intake tower dam for recent earthquake in indiaeSAT Journals
Abstract
Intake towers are typically tall, hollow, reinforced concrete structures and form entrance to reservoir outlet works. A parametric
study on dynamic behavior of circular cylindrical towers can be carried out to study the effect of depth of submergence, wall thickness
and slenderness ratio, and also effect on tower considering dynamic analysis for time history function of different soil condition and
by Goyal and Chopra accounting interaction effects of added hydrodynamic mass of surrounding and inside water in intake tower of
dam
Key words: Hydrodynamic mass, Depth of submergence, Reservoir, Time history analysis,
Evaluation of operational efficiency of urban road network using travel time ...eSAT Journals
Abstract
Efficiency of the road network system is analyzed by travel time reliability measures. The study overlooks on an important measure of
travel time reliability and prioritizing Tiruchirappalli road network. Traffic volume and travel time were collected using license plate
matching method. Travel time measures were estimated from average travel time and 95th travel time. Effect of non-motorized vehicle
on efficiency of road system was evaluated. Relation between buffer time index and traffic volume was created. Travel time model has
been developed and travel time measure was validated. Then service quality of road sections in network were graded based on
travel time reliability measures.
Keywords: Buffer Time Index (BTI); Average Travel Time (ATT); Travel Time Reliability (TTR); Buffer Time (BT).
Estimation of surface runoff in nallur amanikere watershed using scs cn methodeSAT Journals
Abstract
The development of watershed aims at productive utilization of all the available natural resources in the entire area extending from
ridge line to stream outlet. The per capita availability of land for cultivation has been decreasing over the years. Therefore, water and
the related land resources must be developed, utilized and managed in an integrated and comprehensive manner. Remote sensing and
GIS techniques are being increasingly used for planning, management and development of natural resources. The study area, Nallur
Amanikere watershed geographically lies between 110 38’ and 110 52’ N latitude and 760 30’ and 760 50’ E longitude with an area of
415.68 Sq. km. The thematic layers such as land use/land cover and soil maps were derived from remotely sensed data and overlayed
through ArcGIS software to assign the curve number on polygon wise. The daily rainfall data of six rain gauge stations in and around
the watershed (2001-2011) was used to estimate the daily runoff from the watershed using Soil Conservation Service - Curve Number
(SCS-CN) method. The runoff estimated from the SCS-CN model was then used to know the variation of runoff potential with different
land use/land cover and with different soil conditions.
Keywords: Watershed, Nallur watershed, Surface runoff, Rainfall-Runoff, SCS-CN, Remote Sensing, GIS.
Estimation of morphometric parameters and runoff using rs & gis techniqueseSAT Journals
Abstract
Land and water are the two vital natural resources, the optimal management of these resources with minimum adverse environmental
impact are essential not only for sustainable development but also for human survival. Satellite remote sensing with geographic
information system has a pragmatic approach to map and generate spatial input layers of predicting response behavior and yield of
watershed. Hence, in the present study an attempt has been made to understand the hydrological process of the catchment at the
watershed level by drawing the inferences from moprhometric analysis and runoff. The study area chosen for the present study is
Yagachi catchment situated in Chickamaglur and Hassan district lies geographically at a longitude 75⁰52’08.77”E and
13⁰10’50.77”N latitude. It covers an area of 559.493 Sq.km. Morphometric analysis is carried out to estimate morphometric
parameters at Micro-watershed to understand the hydrological response of the catchment at the Micro-watershed level. Daily runoff
is estimated using USDA SCS curve number model for a period of 10 years from 2001 to 2010. The rainfall runoff relationship of the
study shows there is a positive correlation.
Keywords: morphometric analysis, runoff, remote sensing and GIS, SCS - method
-
Effect of variation of plastic hinge length on the results of non linear anal...eSAT Journals
Abstract The nonlinear Static procedure also well known as pushover analysis is method where in monotonically increasing loads are applied to the structure till the structure is unable to resist any further load. It is a popular tool for seismic performance evaluation of existing and new structures. In literature lot of research has been carried out on conventional pushover analysis and after knowing deficiency efforts have been made to improve it. But actual test results to verify the analytically obtained pushover results are rarely available. It has been found that some amount of variation is always expected to exist in seismic demand prediction of pushover analysis. Initial study is carried out by considering user defined hinge properties and default hinge length. Attempt is being made to assess the variation of pushover analysis results by considering user defined hinge properties and various hinge length formulations available in literature and results compared with experimentally obtained results based on test carried out on a G+2 storied RCC framed structure. For the present study two geometric models viz bare frame and rigid frame model is considered and it is found that the results of pushover analysis are very sensitive to geometric model and hinge length adopted. Keywords: Pushover analysis, Base shear, Displacement, hinge length, moment curvature analysis
Effect of use of recycled materials on indirect tensile strength of asphalt c...eSAT Journals
Abstract
Depletion of natural resources and aggregate quarries for the road construction is a serious problem to procure materials. Hence
recycling or reuse of material is beneficial. On emphasizing development in sustainable construction in the present era, recycling of
asphalt pavements is one of the effective and proven rehabilitation processes. For the laboratory investigations reclaimed asphalt
pavement (RAP) from NH-4 and crumb rubber modified binder (CRMB-55) was used. Foundry waste was used as a replacement to
conventional filler. Laboratory tests were conducted on asphalt concrete mixes with 30, 40, 50, and 60 percent replacement with RAP.
These test results were compared with conventional mixes and asphalt concrete mixes with complete binder extracted RAP
aggregates. Mix design was carried out by Marshall Method. The Marshall Tests indicated highest stability values for asphalt
concrete (AC) mixes with 60% RAP. The optimum binder content (OBC) decreased with increased in RAP in AC mixes. The Indirect
Tensile Strength (ITS) for AC mixes with RAP also was found to be higher when compared to conventional AC mixes at 300C.
Keywords: Reclaimed asphalt pavement, Foundry waste, Recycling, Marshall Stability, Indirect tensile strength.
IJRET: International Journal of Research in Engineering and Technology, eISSN: 2319-1163 | pISSN: 2321-7308, Volume 04, Issue 03, Mar-2015. Available @ http://www.ijret.org
ELEMENTS OF LEGACY PROGRAM COMPLEXITY
Harmeet Kaur¹, Shahanawaj Ahamad², Gurvinder N. Verma³
¹Ph.D. (Computer Applications) Research Scholar, Punjab Technical University, Jalandhar, Punjab, India
²Assistant Professor, Dept. of CS & SWE, College of Computer Sc. & Engg., University of Ha’il, Ha’il, Saudi Arabia
³Professor and Principal, Shri Sukhmani Institute of Engineering & Technology, Derabassi, Punjab, India
Abstract
Researchers in the field of software engineering, business process improvement and information engineering all want to
drastically modernize software life-cycle processes and technologies to correct the problems and to improve the quality of
software. Research goals have included ancillary issues, such as improving user services through conversion to new platforms
and facilitating software processes by adopting automated tools. Automated tools for software development, understanding,
maintenance, and documentation add to process maturity, leading to better quality and reliability of computer services and
greater customer satisfaction. This paper focuses on critical issues of legacy program improvement. Program improvement requires assessing the program from various perspectives. The paper highlights various elements of legacy program complexity, which can then be taken into account in further program development.
Keywords: Legacy, Program, Software complexity, Code, Integration.
--------------------------------------------------------------------***----------------------------------------------------------------------
1. PREFACE
This section explains the motivation for and contribution of this research.
1.1 Motivation
Earlier research indicates that the greatest future challenge for organizations lies in managing complexity. In the present scenario, markets are more volatile, more uncertain and more complex, forcing organizations to respond to market changes quickly. Company leaders counter these threats by enabling their organizations to respond rapidly to change. Moreover, the organization's software systems must also be able to keep pace with these fast changes.
1.2 Contribution of the Paper
This paper introduces an approach to identifying the elements of legacy program complexity for effective reengineering of legacy programs. The literature has been explored to map the elements responsible for increasing software complexity, and research has been carried out to identify the elements that increase the complexity of legacy programs. The paper is divided into four sections. Section I contains the introduction and literature survey. Section II describes how to map the elements that increase the complexity of a legacy program. Section III introduces the various elements and provides a description of each. Section IV summarizes the paper and provides an outlook on open research issues.
2. INTRODUCTION
The legacy program is an essential component of legacy systems. Legacy systems are those systems which were developed before the advent of modern software engineering techniques. These systems are still workable and usable but need improvement to accommodate changing computing needs. This work considers not only programs written in early languages as legacy, but also programs developed in VB and Java in the early 1990s. Legacy software may be characterized as software we do not know what to do with, even though it is performing a valuable job. The inference is that the ideal solution is to abandon the software entirely and begin again with new software. This may not be suitable in all cases, for instance:
A. The software embodies years of cumulative experience that is represented nowhere else, so discarding the software would also discard this knowledge, however clumsily it is expressed.
B. The manual system that the software replaced no longer exists, so system analysis must be undertaken on the software itself.
C. The software may genuinely work well and its behavior may be well understood, while a new system may perform significantly worse in the beginning. Consequently it may be worth recovering some of the features of the legacy system.
D. A typical legacy software system has numerous users who have usually exploited undocumented features and side effects of the software. It may not be acceptable to ask these users to undertake significant rework for no discernible benefit. It may therefore be critical to retain the interfaces and detailed functionality of the legacy code, both explicit and implied.
E. Users may favor an evolutionary rather than a revolutionary approach.
Notwithstanding, moving from present software practice, dominated by extensive, complex legacy applications with development and maintenance backlogs, to a future based on software reuse and automatic program generation has proven exceptionally difficult. One issue is that the advocated transition techniques require dedicated professionals with extensive knowledge of both the application domain and the software technology.
These professionals are charged with continuously building a computerized knowledge base of potential application requirements. Such a knowledge base would be supported by a library of versatile, reusable software components and a set of decision rules and automated methods for selecting and assembling the components into the desired software application. This methodology, which creates a domain and software reuse environment from the very beginning, has so far proven expensive and risky, with long-delayed payback periods for recovering the initial investments.
3. METHODOLOGY
To study the elements responsible for legacy program complexity, we went through a large body of literature, available both online and in libraries, and also consulted books on the subject. Through this in-depth study of the literature and books, we were able to identify the elements responsible for increasing the complexity of legacy programs. It took us a few months to find and explore these elements.
4. ELEMENTS OF LEGACY PROGRAM COMPLEXITY
The following are the elements responsible for legacy program complexity.
4.1 Difficulty in Understanding the Code
Difficulty in understanding the code of a legacy system makes the system complex. Sneed [6] stated that legacy code is difficult to understand and maintain, which leads to low software quality: the difficulty of understanding the code increases complexity and thus lowers the quality of the software or system. Bernard [8] showed that legacy code is typically written in assembly language or in one of the third-generation languages; although the system may be doing a beneficial job for the organization, the code is hard to understand. In many cases, constant changes cause the code to become convoluted, without organization or structure, adhering to no standards [6], [36], [10]. The resulting code is often called spaghetti code and is virtually impossible to unravel.
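As an illustration of spaghetti code, the C fragment below is a minimal sketch of our own (the function and array names are invented, not taken from the cited works): the first version expresses a simple loop through unstructured jumps, the second expresses the same logic with structured control flow.

/* Illustrative sketch only: a goto-based "spaghetti" version of a loop
   that sums the positive entries of an array, followed by a structured
   rewrite. The names are hypothetical. */
#include <stdio.h>

int sum_positive_spaghetti(const int *a, int n) {
    int i = 0, sum = 0;
loop:
    if (i >= n) goto done;          /* exit test, far from the loop body */
    if (a[i] <= 0) goto skip;       /* jump over the accumulation */
    sum += a[i];
skip:
    i++;
    goto loop;
done:
    return sum;
}

int sum_positive_structured(const int *a, int n) {
    int sum = 0;
    for (int i = 0; i < n; i++)     /* one visible loop, one visible test */
        if (a[i] > 0)
            sum += a[i];
    return sum;
}

int main(void) {
    int data[] = {3, -1, 4, -1, 5};
    printf("%d %d\n", sum_positive_spaghetti(data, 5),
           sum_positive_structured(data, 5));   /* both print 12 */
    return 0;
}

Both functions compute the same result, but the spaghetti version must be traced jump by jump, which is exactly what makes such code, at scale, virtually impossible to unravel.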
4.2 Cost
As a legacy system is large and complex, redesigning, replacing or upgrading the system is very costly. If the legacy system runs on obsolete hardware, the expense of maintaining the system may eventually exceed the expense of replacing both the software and the hardware, unless some form of emulation or backward compatibility permits the software to run on new hardware [38]. As a matter of fact, the elevated level of entropy, combined with uncertain documentation of the design and architecture, makes maintenance more complicated, time consuming and expensive. Besides, these systems have significant economic value and many are vital to their owners [3]. Given the high cost of losing previous investment and the business knowledge embedded in these systems, replacing a legacy system with a new system developed using new technology is in some cases not the right decision, as these systems are important assets of the organization.
4.3 Size
The size of the system is also one of the elements that affect the complexity of a legacy system: as the size of the software increases, the complexity of the system also increases. As far as size is concerned, there are two factors, discussed in the next two subsections, which are to be considered while dealing with the complexity of the software.
4.4 Number of Control Structures
The number of control structures also affects the complexity of a system. If a program is large but has few control structures, its complexity will be low; on the other hand, if the program is small but the number of control structures is large, its complexity will be high.
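As a minimal sketch of this point (our own illustration; the function and its parameters are invented), the C function below is only a few lines long yet contains four decision points, so by McCabe's cyclomatic complexity measure it already has decisions + 1 = 5 independent paths:

/* Illustrative only: a small function whose complexity comes from its
   control structures rather than its size. All names are hypothetical. */
int classify(int age, int income, int has_license) {
    int category = 0;
    if (age >= 18) {                /* decision 1 */
        if (has_license)            /* decision 2 */
            category = 1;
        if (income > 50000)         /* decision 3 */
            category += 2;
    }
    while (category > 3)            /* decision 4 */
        category--;
    return category;                /* 4 decisions => cyclomatic complexity 5 */
}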
4.5 Recursive Functions
The size of a new project basically depends on the number of inputs, outputs and interfaces the system has; as their number increases, so does the complexity of the legacy program, and vice versa. Likewise, the attributes of the data structures and procedures in the software, such as recursive functions, make the software complex and hard to understand (Curtis et al., 1979).
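As a hedged illustration of how procedural attributes add complexity (our own sketch; the struct and function names are invented), recursion ties a procedure to both its data structure and the call stack, while the equivalent iteration only requires following one loop:

/* Illustrative sketch: recursive versus iterative traversal of a
   singly linked list. The type and names are hypothetical. */
#include <stddef.h>

struct node { int value; struct node *next; };

int length_recursive(const struct node *head) {
    if (head == NULL)                         /* base case */
        return 0;
    return 1 + length_recursive(head->next);  /* one stack frame per node */
}

int length_iterative(const struct node *head) {
    int n = 0;
    for (const struct node *p = head; p != NULL; p = p->next)
        n++;                                  /* no hidden stack growth */
    return n;
}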
4.6 Design
The poor design of a system makes it more complex, and hence the program is hard to change and expensive to support. The objective behind every system, including a data information system, is to produce the required information from input data and its processing. Long experience in creating information systems has brought the understanding that the number of documents, and the complexity of a given document within a business system, in one way or another establish the complexity of designing as well as creating an information system. By and large, systems with many documents require more time and effort to fully comprehend than those with fewer documents, and correspondingly the project development is more intricate and tedious.
4.7 Integration
Since legacy systems are based on old, legacy platforms, these platforms are difficult to understand, which increases complexity, and integrating these systems with new platforms is very difficult. Integration across techniques is truly common in computing, yet integration between new technologies and substantially older ones is not. There may simply not be sufficient demand for such integration technology to be created, so this "glue" code is only seldom created, by vendors and enthusiasts of specific legacy technologies.
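As a hedged sketch of what such "glue" code looks like (every name here is hypothetical, invented for illustration; no real legacy API is being described), an adapter can wrap an old routine behind the interface a new platform expects:

/* Illustrative adapter ("glue") sketch: a hypothetical legacy routine
   reports temperature as an integer in tenths of a degree; the new
   platform expects a double in degrees. Neither interface is real. */
#include <stdio.h>

int legacy_read_temp_tenths(void) { return 215; }  /* stand-in for the old system */

double read_temperature_celsius(void) {
    /* glue: translate units and types between the old and new worlds */
    return legacy_read_temp_tenths() / 10.0;
}

int main(void) {
    printf("%.1f\n", read_temperature_celsius());  /* prints 21.5 */
    return 0;
}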
4.8 Security
As legacy systems run on older operating systems, they may be susceptible to attack due to the lack of security patches being applied; security issues may also be caused by production setups. These issues can put legacy systems at risk of being compromised by attackers or knowledgeable insiders [4]. As legacy systems are based on obsolete technology, they are complex and liable to be less secure; that is, complexity hinders the security of legacy programs. Complex code tends to be larger than simple code, which means there are more chances for accidents, omissions and manifestations of coding errors. As complexity increases, it becomes less and less feasible to actually make a system secure, and it is common simply to assert that a system or product is secure. Despite an abundance of testing tools that claim to catch bugs, the complexity of software makes security flaws and errors unavoidable and increasingly common.
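As an illustration of the kind of error that larger, older code multiplies (a sketch of our own, not drawn from the cited sources), one of the most common flaws in legacy C code is an unchecked copy into a fixed-size buffer; the bounded variant below shows the safer pattern:

/* Illustrative sketch of a classic legacy flaw: copying untrusted
   input into a fixed buffer with no bounds check. */
#include <stdio.h>
#include <string.h>

void greet_unsafe(const char *name) {
    char buf[16];
    strcpy(buf, name);                       /* overflows buf if name is long */
    printf("Hello, %s\n", buf);
}

void greet_safer(const char *name) {
    char buf[16];
    snprintf(buf, sizeof buf, "%s", name);   /* bounded, always NUL-terminated */
    printf("Hello, %s\n", buf);
}

int main(void) {
    greet_safer("a-name-longer-than-fifteen-characters");  /* safely truncated */
    return 0;
}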
4.9 Performance
Performance degradation is the gradual decline of software performance or responsiveness over time, which will in the long run lead to the software becoming flawed, unusable, or, as it is called, "legacy", and in need of updating. This is not a physical phenomenon: the software does not actually rot, but rather suffers from a lack of responsiveness and of updating with respect to the changing environment in which it operates. Software can deteriorate in performance over time and become legacy as it runs and grows error prone; this is not generally considered software decay, though it may have some of the same outcomes. Usually, such a state can be overcome by completely reinitializing the system, for example by a full reinstallation of all applicable software components, perhaps including the operating system. As time passes the software deteriorates, its responsiveness diminishes, and it is no longer able to meet the expectations of its users; the software should therefore be upgraded, by installing new software or modifying the code, rather than being discarded entirely.
4.10 Lack of Documentation
These systems are difficult to maintain, enhance and extend because of an absence of understanding of the system: the staff who were experts on it have resigned or forgotten what they knew about it, and staff who entered the field after it became "legacy" never learned it in the first place. This can be exacerbated by lack or loss of documentation. The Comair airline company blamed its CEO in 2004 for the failure of an obsolete legacy crew scheduling system that ran into a limitation not known to anybody in the company [3].
4.11 Flexibility
As legacy systems run on old or outdated technology, they are unable to keep up with changing technology and are not flexible. [10] notes that legacy systems use old technology, are lacking in flexibility, are highly complex and may diverge from corporate strategy.
4.12 Time
Less processing time and higher processing speed are the aims of making optimized use of resources. Time is therefore also an important element in the complexity of a legacy program: programs which take more time for processing are more complex compared with programs which take less time to complete execution. The computation time of a legacy program is high, and thus contributes to the complexity of the program.
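As a minimal sketch of how this element can be observed (our own illustration; busy_work is an arbitrary stand-in, not a real legacy routine), the CPU time consumed by a computation can be measured with the standard C library:

/* Illustrative sketch: measuring the CPU time of a routine with clock().
   busy_work() is a hypothetical stand-in for a legacy computation. */
#include <stdio.h>
#include <time.h>

static long busy_work(long n) {
    long acc = 0;
    for (long i = 0; i < n; i++)
        acc += i % 7;                        /* stand-in workload */
    return acc;
}

int main(void) {
    clock_t start = clock();
    long r = busy_work(50000000L);
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("result=%ld, cpu time=%.3f s\n", r, secs);
    return 0;
}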
4.13 Lack of Staff
As legacy programs are built using outdated techniques, it is not easy to understand these programs, so the unavailability of staff increases the complexity. The staff have either left the job, and are not interested in teaching others, or have retired.
4.14 Reliability
Reliability means the ability of a program to perform its intended functions and operations without failure. As legacy programs are complex, these systems tend not to be entirely understood; and if a system is difficult to understand, it is difficult to find the ways by which the system can be compromised by attackers or intruders. It is not easy, and certainly not cheap, to prevent insecure operating modes in legacy systems. It is generally assumed that complex systems are less prone to attack, that is, that they are secure; but the fact is that although a number of tools are available to test legacy programs, the complexity of legacy programs makes security flaws and errors nearly unavoidable and commonplace.
4.15 Bugs or Errors
A software bug is an error, failure, or fault in a computer program or system that causes it to produce a wrong or unexpected result, or to behave in unexpected ways. Most bugs arise from mistakes and errors made by people in either a program's source code or its design, or in the frameworks and operating systems used by such programs, and a few are created by compilers producing wrong code. A program that contains many bugs, or bugs that seriously interfere with its functionality, is said to be buggy. Reports detailing bugs in a system are generally known as bug reports, defect reports, fault reports, problem reports, trouble reports, and so on. The consequences of bugs can be extremely serious. In 1996, the European Space Agency's US$1 billion prototype Ariane 5 rocket had to be destroyed less than a minute after launch because of a bug in the on-board guidance software. In June 1994, a Royal Air Force Chinook crashed into the Mull of Kintyre, killing 29. This was initially dismissed as pilot error, but an investigation by Computer Weekly uncovered sufficient evidence to convince a House of Lords inquiry that it may have been caused by a software bug in the aircraft's engine control computer. In 2002, a study commissioned by the US Department of Commerce's National Institute of Standards and Technology concluded that "software bugs, or errors, are so prevalent and so detrimental that they cost the US economy an estimated $59 billion annually, or about 0.6 percent of the gross domestic product" [37].
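The Ariane 5 failure is widely reported to have stemmed from an unprotected conversion of a 64-bit floating-point value into a 16-bit signed integer. As a hedged sketch of that class of defect (our own C fragment with invented names and values, not the actual flight code, which was written in Ada):

/* Illustrative sketch: an unchecked narrowing conversion of the kind
   reported in the Ariane 5 failure. Names and values are invented. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    double horizontal_bias = 40000.0;          /* exceeds INT16_MAX (32767) */
    int16_t packed = (int16_t)horizontal_bias; /* out-of-range conversion:
                                                  undefined behavior in C */
    printf("stored %d instead of %.0f\n", packed, horizontal_bias);

    /* a guarded conversion reports the problem instead of corrupting data */
    if (horizontal_bias > INT16_MAX || horizontal_bias < INT16_MIN)
        puts("value out of range: handle the error instead of truncating");
    return 0;
}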
4.16 Unpredictability
Unpredictability is the inability to know what will happen. As legacy programs are obsolete and hard to understand, their behavior when modified is difficult to predict, which adds to their complexity.
4.17 Interdependent Parts/Interfaces
The interdependence between the various modules of a program makes the program complex. According to Webster's Encyclopedic Unabridged Dictionary (2001), complexity is characterized by a complicated arrangement of interconnected parts. The more interconnected parts there are, the greater the complexity of the program and the harder the program is to understand. Most legacy programs are complex, involving many interrelationships or interdependencies and changing requirements.
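As a hedged illustration of interdependence (all module and variable names below are invented), two C routines coupled through a shared global can silently break each other, whereas passing the value explicitly makes the dependency visible and local:

/* Illustrative sketch: hidden coupling through a shared global versus
   an explicit interface. Every name is hypothetical. */
#include <stdio.h>

/* tightly coupled: both "modules" read or write the same global */
int g_exchange_rate = 82;                 /* owned by no module in particular */

int billing_total_inr(int usd)  { return usd * g_exchange_rate; }
void rates_update(int r)        { g_exchange_rate = r; }  /* silently changes billing */

/* decoupled: the dependency is an explicit parameter */
int billing_total_inr_explicit(int usd, int rate) { return usd * rate; }

int main(void) {
    rates_update(80);
    printf("%d\n", billing_total_inr(10));              /* 800, via hidden state */
    printf("%d\n", billing_total_inr_explicit(10, 80)); /* 800, dependency visible */
    return 0;
}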
5. CONCLUSION
From the study it was observed that difficulty in understanding the code, the size of the legacy program, and cost are factors responsible for the complexity of a legacy program. The lack of documentation and the interdependence of parts or interfaces also contribute to the complexity of the program. Last but not least, errors or bugs in a legacy program increase its complexity, since it is very difficult to find errors or bugs in legacy programs precisely because they are so complex. Lack of staff and unpredictability likewise have an impact on increasing the complexity of a legacy program. This study will help research scholars in predicting the complexity of a program. The study will be further enhanced in the future using mathematical tools applied to these aspects, as demanded by our research.
REFERENCES
[1] The RENAISSANCE Framework, RENAISSANCE project deliverable, 1997.
[2] J. Lamb, "Legacy systems continue to have a place in the enterprise", Computer Weekly, June 2008. Retrieved 27 October 2014.
[3] S. Overby, "Comair's Christmas Disaster: Bound To Fail", CIO.com, 2005-05-01. Retrieved 2012-04-29.
[4] Razermouse, "The Danger of Legacy Systems", Mousesecurity.com, 2011-05-03. Retrieved 2012-04-29.
[5] B. Foote and J. Yoder, "Big Ball of Mud", Pattern Languages of Program Design, Vol. 4, N. Harrison, B. Foote, H. Rohnert (Eds.), Addison-Wesley, 2000.
[6] H. Sneed, "Planning the reengineering of legacy systems", IEEE Software, January 1995.
[7] F. Brooks, The Mythical Man-Month, Addison-Wesley, 1975.
[8] V. L. Bernard, "The Feltham-Ohlson Framework: Implications for Empiricists", 1995.
[9] Contemporary Accounting Research 11: 733-747.
[10] N. Bancroft, H. Seip and A. Sprengel, Implementing SAP/R3, Manning Publications, 1997. Also available: http://www.browsebooks.com/Bancroft/Contents.html
[11] E. Chikofski and J. H. Cross II, "Reverse Engineering and Design Recovery: A Taxonomy", IEEE Software, January 1990, 7(1):13-17.
[12] V. R. Basili and H. D. Mills, "Understanding and Documenting Programs", IEEE Transactions on Software Engineering, May 1982, SE-8(3):270-283.
[13] M. Ward, "Abstracting a Specification from Code", Journal of Software Maintenance, 5(2), 1993.
[14] J. Cordy, I. Carmichael and R. Halliday, The TXL Programming Language (Version 8), Technical report, Legasys Corp., Kingston, Apr. 1995.
[15] R. S. Arnold (Ed.), Software Reengineering, IEEE Computer Society Press, 1992.
[16] K. Cremer, "A Tool Supporting the Re-Design of Legacy Applications", in Proceedings of the 2nd Euromicro Conference on Software Maintenance and Reengineering (CSMR'98), IEEE, Florence, Italy, 1998, pp. 142-149.
[17] H. M. Sneed, "Planning the Reengineering of Legacy Systems", IEEE Software, January 1995, pp. 24-34.
[18] Perspectives on Legacy System Reengineering, Software Engineering Institute, Carnegie Mellon University, 1995.
[19] Assessing the Evolveability of a Legacy System, Software Engineering Institute, Carnegie Mellon University, 1996.
[20] D. L. Parnas, "Software Aging", in 16th International Conference on Software Engineering, Sorrento, Italy, May 1994.
[21] H. A. Simon, "The Architecture of Complexity", The American Philosophical Society, December 1962.
[22] R. M. Henderson and K. B. Clark, "Architectural Innovation: The Reconfiguration of Existing Product Technologies and the Failure of Established Firms", Administrative Science Quarterly, 1990, 35: pp. 9-30.
[23] T. P. Hughes, "The Evolution of Large Technological Systems", in The Social Construction of Technological Systems, W. E. Bijker, T. P. Hughes and T. J. Pinch (Eds.), MIT Press, Cambridge, MA, 1987.
[24] H. A. Simon, "The Architecture of Complexity", The American Philosophical Society, December 1962.
[25] S. Comella-Dorda, K. Wallnau, R. Seacord and J. Robert, "A Survey of Legacy System Modernization Approaches", SEI Technical Note CMU/SEI-00-TN-003, Software Engineering Institute, Carnegie Mellon University, Apr. 2000.
[26] S. Tilley and D. Smith, "Legacy System Reengineering", Software Engineering Institute, Carnegie Mellon University, presented at the International Conference on Software Maintenance, Nov. 4-8, 1996.
[27] L. Raccoon, "The Complexity Gap", SIGSOFT Software Engineering Notes, 20(3), Jul. 1995, pp. 37-44.
[28] N. Weiderman, J. Bergey, D. Smith, B. Dennis and S. Tilley, "Approaches to Legacy System Evolution"
[29] (CMU/SEI-97-TR-014), Software Engineering Institute, Carnegie Mellon University, 1997.
[30] R. Pressman, Software Engineering: A Practitioner's Approach, 4th Edition, McGraw-Hill, New York, NY, 1997.
[31] J. Ransom and I. Warren, "A Method for Assessing Legacy Systems for Evolution", Proceedings, Second Euromicro Conference on Software Maintenance and Reengineering (CSMR'98), 1998.
[32] R. Arnold, "Software Restructuring", Proceedings of the IEEE, April 1989, Vol. 77, No. 4, pp. 607-617.
[33] "Software bugs cost US economy dear", Web.archive.org, June 10, 2009. Retrieved September 24, 2012.
[34] J. Bisbal, D. Lawless, Bing Wu and J. Grimson, "Legacy information systems: issues and directions", IEEE Software, vol. 16, no. 5, pp. 103-111, Sep/Oct 1999.
[35] http://en.wikipedia.org/wiki/Legacy_system