Abstract
One of the main goals of memory management is to allow multiprogramming. Several strategies are used to allocate memory to the processes that need it. These strategies require that the entire process be in memory before execution, and some of them suffer from fragmentation. Virtual memory does not require the entire process to be in memory before execution; it loads only those pages of the process that are needed for execution. This can be achieved using a paging scheme with a number of frames in memory to accommodate the pages. Whenever a page in memory needs to be replaced, a page replacement algorithm is used. The performance of these page replacement algorithms depends on the total number of page faults incurred. In this paper, two new algorithms, called Comparison Counting FIFO (CC-FIFO) and Distribution Counting FIFO (DC-FIFO), are proposed using an input enhancement technique. Our results and calculations show that the proposed algorithms minimize the page fault rate compared to the FIFO, LRU and Optimal page replacement algorithms.
Keywords: Page replacement; Page fault; FIFO; OPT; LRU; Comparison Counting; Distribution Counting
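The CC-FIFO and DC-FIFO variants are not specified in this abstract, but the baseline they improve upon can be made concrete. Below is a minimal sketch of FIFO page replacement that counts page faults; the function name and reference string are illustrative, not taken from the paper.

```python
from collections import deque

def fifo_page_faults(reference_string, num_frames):
    """Count page faults for FIFO replacement over a page reference string."""
    frames = deque()   # pages currently in memory, oldest first
    resident = set()   # fast membership test for the same pages
    faults = 0
    for page in reference_string:
        if page in resident:
            continue   # hit: page already in a frame, no fault
        faults += 1
        if len(frames) == num_frames:
            evicted = frames.popleft()   # evict the page loaded earliest
            resident.remove(evicted)
        frames.append(page)
        resident.add(page)
    return faults

print(fifo_page_faults([7, 0, 1, 2, 0, 3, 0, 4, 2, 3, 0, 3, 2], 3))  # 10 page faults
```

Any replacement policy (LRU, Optimal, or the proposed counting variants) would change only the eviction choice in the middle of the loop; the fault count is what the paper uses to compare them.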
Hybrid branch prediction for pipelined MIPS processor – IJECEIAES
In modern microprocessors designed with pipeline stages, performance suffers when branch instructions are executed, because they cause stalls in the pipeline. In turn, this degrades performance by increasing the cycles per instruction (CPI) of the processor. When executing a branch instruction, the processor needs extra clock cycles to determine whether the branch will happen (Taken) or not (Not Taken), and it must also calculate the target address when the branch is taken. Predicting whether a branch is T/NT is therefore an important step in enhancing processor performance. In this research, more than one branch prediction method is used (a hybrid), and the designed circuit chooses among different prediction algorithms depending on the type of the branch. Some of these methods are static, while others are dynamic. All circuits were built and examined by running different programs on the designed predictor to measure the performance of the processor.
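The exact hybrid predictor design is not given in this abstract; a common dynamic component of such hybrids is the 2-bit saturating counter, sketched below. The state encoding (0–1 predict not taken, 2–3 predict taken) and the training sequence are illustrative assumptions, not the paper's circuit.

```python
def make_two_bit_predictor():
    """2-bit saturating counter: states 0-1 predict not taken, 2-3 predict taken."""
    state = 2  # start in "weakly taken"
    def predict_and_update(taken):
        nonlocal state
        prediction = state >= 2
        # saturate at 0 and 3 so a single anomalous outcome cannot
        # flip a strongly-biased branch's prediction
        state = min(3, state + 1) if taken else max(0, state - 1)
        return prediction
    return predict_and_update

pred = make_two_bit_predictor()
outcomes = [True] * 5 + [False] + [True] * 4   # a loop branch with one exit
correct = sum(pred(t) == t for t in outcomes)
print(correct, "of", len(outcomes), "predicted correctly")  # 9 of 10
```

The saturation is what makes this predictor tolerate the single loop-exit mispredict without also mispredicting the following iteration, which a 1-bit scheme would do.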
IJRET : International Journal of Research in Engineering and Technology is an international peer-reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academicians, Field Engineers, Scholars and Students of related fields of Engineering and Technology.
Managing Statistics for Optimal Query Performance – Karen Morton
Half the battle of writing good SQL is in understanding how the Oracle query optimizer analyzes your code and applies statistics in order to derive the “best” execution plan. The other half of the battle is successfully applying that knowledge to the databases that you manage. The optimizer uses statistics as input to develop query execution plans, and so these statistics are the foundation of good plans. If the statistics supplied aren’t representative of your actual data, you can expect bad plans. However, if the statistics are representative of your data, then the optimizer will probably choose an optimal plan.
Issues in Query Processing and Optimization – Editor IJMTER
The paper identifies the various issues in query processing and optimization involved in choosing the best database plan. Unlike preceding query optimization techniques, which use only a single approach for identifying the best query plan by extracting data from the database, our approach takes into account the various phases of query processing and optimization, heuristic estimation techniques and cost functions for identifying the best execution plan. A review of the various phases of query processing, the goals of the optimizer, the rules for heuristic optimization and the cost components involved is presented in this paper.
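One classic heuristic rule covered in such reviews is pushing selections below joins. The toy cost comparison below illustrates why the heuristic helps; the cardinalities and the simple tuples-scanned cost model are invented for illustration, not taken from the paper.

```python
def cost_join_then_select(n_r, n_s, selectivity):
    """Naive plan: join first (n_r * n_s tuples scanned), then filter the result."""
    join_cost = n_r * n_s
    result_size = n_r * n_s * selectivity
    return join_cost + result_size

def cost_select_then_join(n_r, n_s, selectivity):
    """Heuristic plan: push the selection below the join, shrinking one input."""
    select_cost = n_r                  # one pass over R to filter it
    filtered = n_r * selectivity       # only the surviving tuples are joined
    return select_cost + filtered * n_s

r, s, sel = 10_000, 1_000, 0.01
print(cost_join_then_select(r, s, sel))   # 10,100,000 tuple operations
print(cost_select_then_join(r, s, sel))   # 110,000 tuple operations
```

A cost-based optimizer generalizes this: it enumerates candidate plans and keeps the one with the lowest estimated cost, which is why accurate selectivity estimates matter.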
An unsupervised feature selection algorithm with feature ranking for maximizi... – Asir Singh
Prediction plays a vital role in decision making. Correct prediction leads to right decisions that save lives, energy, effort, money and time. The right decision prevents physical and material losses, and prediction is practiced in all fields, including medicine, finance, environmental studies, engineering and emerging technologies. Prediction is carried out by a model called a classifier. The predictive accuracy of the classifier depends heavily on the training datasets used to train it. Irrelevant and redundant features in the training dataset reduce the accuracy of the classifier; hence, they must be removed from the training dataset through the process known as feature selection. This paper proposes a feature selection algorithm, namely unsupervised learning with ranking-based feature selection (FSULR). It removes redundant features by clustering and eliminates irrelevant features by statistical measures to select the most significant features from the training dataset. The performance of the proposed algorithm is compared with seven other feature selection algorithms using well-known classifiers, namely naive Bayes (NB), instance-based (IB1) and the tree-based J48. Experimental results show that the proposed algorithm yields better prediction accuracy for these classifiers.
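FSULR itself is not detailed in this abstract; the sketch below works in the same spirit, dropping one feature of each highly correlated (redundant) pair and ranking the survivors by variance. The correlation threshold and the toy data are assumptions for illustration, not the paper's method.

```python
import statistics

def select_features(columns, corr_threshold=0.95):
    """columns: dict of feature name -> list of values.
    Drop features nearly identical (|corr| > threshold) to an already-kept one,
    then rank the survivors by variance (near-constant features rank last)."""
    def corr(x, y):
        mx, my = statistics.fmean(x), statistics.fmean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy) if sx and sy else 0.0
    kept = []
    for name, values in columns.items():
        if all(abs(corr(values, columns[k])) <= corr_threshold for k in kept):
            kept.append(name)   # not redundant with anything already kept
    return sorted(kept, key=lambda k: statistics.pvariance(columns[k]), reverse=True)

data = {
    "f1": [1, 2, 3, 4, 5],
    "f2": [2, 4, 6, 8, 10],   # perfectly correlated with f1 -> redundant
    "f3": [5, 5, 5, 5, 6],    # nearly constant -> ranked last
}
print(select_features(data))  # ['f1', 'f3']
```

A real pipeline would feed the selected columns to the downstream classifier (NB, IB1, J48) and compare accuracy, as the paper does.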
SCHEDULING DIFFERENT CUSTOMER ACTIVITIES WITH SENSING DEVICE – ijait
Most periodic tasks are assigned to processors using a partition scheduling policy after checking feasibility conditions. A new approach is proposed for scheduling different activities together with one periodic task within the system. In this paper, control strategies are identified for allocating different types of tasks (activities) to individual computing elements such as smartphones or microphones. In our simulation model, the aperiodic tasks generated by each periodic task are taken into consideration, and different sets of periodic and aperiodic tasks are scheduled together. This new approach shows that scheduling all the different activities with one periodic task leads to better performance.
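The feasibility conditions mentioned above are not spelled out here; one widely used sufficient test for periodic task sets is the Liu and Layland rate-monotonic utilization bound, sketched below with an invented task set.

```python
def rm_feasible(tasks):
    """tasks: list of (computation_time, period) pairs.
    Sufficient (not necessary) rate-monotonic schedulability test:
    total utilization U must satisfy U <= n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization <= bound

# Three periodic tasks given as (C_i, T_i)
print(rm_feasible([(1, 4), (1, 5), (2, 10)]))  # U = 0.65 <= 0.7798 -> True
print(rm_feasible([(2, 4), (2, 5), (2, 10)]))  # U = 1.10 -> False
```

A partitioned scheduler would run this kind of check per processor before admitting a task to that processor's partition.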
In a computer operating system that uses paging for virtual memory management, page replacement algorithms decide which memory pages to page out (sometimes called swapping out, or writing to disk) when a page of memory needs to be allocated.
The role of Dataset in training ANFIS System for Course Advisor – AM Publications
An Adaptive Network-based Fuzzy Inference System (ANFIS) is used in the field of decision making to help students choose the best course according to their requirements. The structure of the ANFIS system and the datasets used to train it play a vital role in the performance of the system. This paper is based on the design of a Sugeno-type ANFIS with grid partitioning and the use of different datasets to train the system using MATLAB. Results demonstrate that a proper dataset is needed for training the ANFIS model.
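ANFIS internals are not reproduced in this abstract; at its core, a first-order Sugeno system computes a firing-strength-weighted average of linear rule outputs, as sketched below. The Gaussian membership functions and the two rules are invented for illustration.

```python
from math import exp

def gaussmf(x, mean, sigma):
    """Gaussian membership function, typical for ANFIS grid partitioning."""
    return exp(-((x - mean) ** 2) / (2 * sigma ** 2))

def sugeno_eval(x, rules):
    """rules: list of ((mean, sigma), (p, q)) where the rule output is p*x + q,
    weighted by the Gaussian firing strength of its antecedent."""
    weights = [gaussmf(x, m, s) for (m, s), _ in rules]
    outputs = [p * x + q for _, (p, q) in rules]
    return sum(w * o for w, o in zip(weights, outputs)) / sum(weights)

# Two rules: "x is LOW -> y = 0.1x + 1" and "x is HIGH -> y = 0.5x"
rules = [((0.0, 1.0), (0.1, 1.0)), ((5.0, 1.0), (0.5, 0.0))]
print(round(sugeno_eval(2.5, rules), 3))
```

Training an ANFIS amounts to tuning the membership parameters (mean, sigma) and the consequent coefficients (p, q) against the dataset, which is why the dataset's quality dominates the system's performance.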
Mechanical properties of hybrid fiber reinforced concrete for pavements – eSAT Journals
Abstract
The effect of the addition of mono fibers and hybrid fibers on the mechanical properties of a concrete mixture is studied in the present investigation. Steel fibers at 1% and polypropylene fibers at 0.036% were added individually to the concrete mixture as mono fibers, and then together to form a hybrid fiber reinforced concrete. Mechanical properties such as compressive, split tensile and flexural strength were determined. The results show that hybrid fibers improve the compressive strength only marginally compared to mono fibers, whereas hybridization improves split tensile strength and flexural strength noticeably.
Keywords: Hybridization, mono fibers, steel fiber, polypropylene fiber, improvement in mechanical properties
Material management in construction – a case study – eSAT Journals
Abstract
The objective of the present study is to understand the problems occurring in a company because of improper application of material management. In construction project operation there is often a project cost variance in terms of materials, equipment, manpower, subcontractors, overhead cost and general conditions. Material is the main component of construction projects; therefore, if materials are not properly managed, a project cost variance will result. Project cost can be controlled by taking corrective actions against the cost variance. Therefore, a methodology to diagnose and evaluate the procurement process involved in material management and to launch continuous improvement was developed and applied. A thorough study was carried out, including case studies, surveys and interviews with professionals involved in this area. As a result, a methodology for diagnosis and improvement was proposed and tested in selected projects. The results obtained show that the main problems in procurement are related to schedule delays and a lack of the specified quality for the project. To prevent this situation it is often necessary to dedicate substantial resources, such as money, personnel and time, to monitoring and controlling the process. A great potential for improvement was detected if state-of-the-art technologies such as electronic mail, electronic data interchange (EDI) and analysis were applied to the procurement process. These helped to eliminate the root causes of many types of problems that were detected.
Similar to An input enhancement technique to maximize the performance of page replacement algorithms
Managing drought short term strategies in semi arid regions a case study – eSAT Journals
Abstract
Drought management needs multidisciplinary action. Interdisciplinary efforts among experts in the various fields of drought-prone areas are helpful in achieving a tangible and permanent solution to this recurring problem. The Gulbarga district has a total area of around 16,240 sq. km and accounts for 8.45 per cent of the area of Karnataka state. The district is situated at latitude 17º 19' 60" North and longitude 76º 49' 60" East, entirely on the Deccan plateau, at a height of 300 to 750 m above MSL. Its sub-tropical, semi-arid climate makes it one of the drought-prone districts of Karnataka state, so drought management is very important for a district like Gulbarga. In this paper, various short-term strategies are discussed to mitigate drought conditions in the district.
Keywords: Drought, South-West monsoon, Semi-Arid, Rainfall, Strategies etc.
Life cycle cost analysis of overlay for an urban road in Bangalore – eSAT Journals
Abstract
Pavements are subjected to severe stresses and weathering effects from the day they are constructed and opened to traffic, mainly due to fatigue behavior and environmental effects. Therefore, pavement rehabilitation is one of the most important components of entire road systems. This paper highlights the design of a concrete pavement with added mono fibers such as polypropylene and steel, and with hybrid fibres, for a widened portion of an existing concrete pavement, as well as various overlay alternatives for an existing bituminous pavement on an urban road in Bangalore. Along with this, life cycle cost analyses of these sections are carried out by the Net Present Value (NPV) method to identify the most feasible option. The results show that although the initial cost of constructing a concrete overlay is high, over a period of time it proves to be better than the bituminous overlay considering the whole life cycle cost. The economic analysis also indicates that, of the three fibre options, hybrid reinforced concrete would be economical without compromising the performance of the pavement.
Keywords: Fatigue, Life cycle cost analysis, Net Present Value method, Overlay, Rehabilitation
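The Net Present Value comparison described above can be sketched as follows; the discount rate and cash flows are invented for illustration and do not reproduce the paper's data.

```python
def npv(rate, cash_flows):
    """Net Present Value of a cost stream: cash_flows[t] is the cost
    incurred in year t (year 0 is the initial construction cost)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative whole-life costs over a 20-year analysis period
concrete   = [100.0] + [1.0] * 20   # high initial cost, low upkeep
bituminous = [60.0] + [6.0] * 20    # low initial cost, frequent maintenance

rate = 0.08   # assumed annual discount rate
print(round(npv(rate, concrete), 1))
print(round(npv(rate, bituminous), 1))
```

With these assumed numbers the cheaper-to-build bituminous option ends up costlier in present-value terms, which is the shape of the trade-off the abstract reports.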
Laboratory studies of dense bituminous mixes II with reclaimed asphalt materials – eSAT Journals
Abstract
The growing demand on our nation's roadways over the past couple of decades, decreasing budgetary funds, and the need to provide a safe, efficient and cost-effective roadway system have led to a dramatic increase in the need to rehabilitate our existing pavements and to the issue of building sustainable road infrastructure in India. These needs are today's burning issues and have become the purpose of this study.
In the present study, samples of existing bituminous layer material were collected from the NH-48 (Devahalli to Hassan) site. The mixtures were designed by the Marshall method as per the Asphalt Institute (MS-II) at 20% and 30% Reclaimed Asphalt Pavement (RAP). RAP material was blended with virgin aggregate such that all specimens were tested for the Dense Bituminous Macadam-II (DBM-II) gradation as per the Ministry of Road Transport and Highways (MoRT&H), and cost analysis was carried out to establish the economics. Laboratory results and analysis showed that the use of recycled materials introduced significant variability in Marshall stability, and the variability increased with the RAP content. Savings can be realized from the utilization of recycled materials: following the methodology, the reduction in total cost is 19% and 30% compared with the virgin mixes.
Keywords: Reclaimed Asphalt Pavement, Marshall Stability, MS-II, Dense Bituminous Macadam-II
Laboratory investigation of expansive soil stabilized with natural inorganic ... – eSAT Journals
Abstract
Soil stabilization has proven to be one of the oldest techniques for improving soil properties. The literature review conducted revealed that natural inorganic stabilizers are among the best options for soil stabilization. In this regard an attempt has been made to evaluate the influence of the RBI-81 stabilizer on the properties of black cotton soil through laboratory investigations. Black cotton soil with varying percentages of RBI-81, viz. 0, 0.5, 1, 1.5, 2 and 2.5 per cent, was studied for moisture-density relationships and strength behaviour. The effect of curing period was also evaluated, as the literature clearly emphasizes the strength gain over time of soils stabilized with RBI-81. The results obtained show that the unconfined compressive strength of specimens treated with RBI-81 increased by approximately 250% for a curing period of 28 days as compared to the virgin soil. Further, the CBR value improved by approximately 400%. The studies indicated an increasing trend in soil strength with increasing percentage of RBI-81, suggesting its potential applications in soil stabilization.
Influence of reinforcement on the behavior of hollow concrete block masonry p... – eSAT Journals
Abstract
Reinforced masonry was developed to exploit the strength potential of masonry and to overcome its lack of tensile strength. Experimental and analytical studies have been carried out to investigate the effect of reinforcement on the behavior of hollow concrete block masonry prisms under compression and to predict ultimate failure compressive strength. In the numerical program, a three-dimensional non-linear finite element (FE) model based on the micro-modeling approach is developed for both unreinforced and reinforced masonry prisms using ANSYS (14.5). The proposed FE model uses multi-linear stress-strain relationships to model the non-linear behavior of hollow concrete block, mortar and grout. Willam-Warnke's five-parameter failure theory has been adopted to model the failure of masonry materials. The comparison of the numerical and experimental results indicates that the FE models can successfully capture the highly nonlinear behavior of the physical specimens and accurately predict their strength and failure mechanisms.
Keywords: Structural masonry, Hollow concrete block prism, grout, Compression failure, Finite element method,
Numerical modeling.
Influence of compaction energy on soil stabilized with chemical stabilizer – eSAT Journals
Abstract
Increases in traffic, along with heavier wheel loads, cause rapid deterioration of pavements. There is a need to improve the density and strength of the soil subgrade and other pavement layers. In this study an attempt is made to improve the properties of locally available loamy soil using twin approaches, viz. i) increasing the compaction of the soil and ii) treating the soil with a chemical stabilizer. Laboratory studies are carried out on both untreated and treated soil samples compacted with different compaction efforts. The studies show that an increase in compaction effort results in an increase in soil density; however, in soil treated with the chemical stabilizer, the rate of increase in density is not significant. The soil treated with the chemical stabilizer exhibits improvement in both strength and performance properties.
Keywords: compaction, density, subgrade stabilization, resilient modulus
Geographical information system (GIS) for water resources management – eSAT Journals
Abstract
Water resources projects inherently have overlapping and at times conflicting objectives. These projects are of varied sizes, ranging from major projects with command areas of millions of hectares to very small projects implemented at the local level. Thus, across these projects there is seldom the proper coordination that is essential for ensuring collective sustainability.
Integrated watershed development and management is the accepted answer, but it in turn requires a comprehensive framework that enables a planning process involving all the stakeholders at different levels and scales. Such a unified hydrological framework is essential to evaluate the cause and effect of all the proposed actions within the drainage basins.
The present paper describes a hydrological framework developed in the form of a Hydrologic Information System (HIS), which is intended to meet the specific information needs of the various line departments of a typical state connected with water-related aspects. The HIS consists of a hydrologic information database coupled with tools for collating primary and secondary data and tools for analyzing and visualizing the data and information. The HIS also incorporates a hydrological model base for indirect assessment of various entities of the water balance in space and time. The framework would be maintained and updated to reflect fully the most accurate ground truth data and the infrastructure requirements for planning and management.
Keywords: Hydrological Information System (HIS); WebGIS; Data Model; Web Mapping Services
Forest type mapping of Bidar forest division, Karnataka using geoinformatics ... – eSAT Journals
Abstract
The study demonstrates the potential of satellite remote sensing techniques for generating baseline information on forest types, including tree plantation details, in the Bidar forest division, Karnataka, covering an area of 5,814.60 sq. km. Analysis of the satellite data in the study area reveals that about 84% of the total area is covered by crop land, 1.778% by dry deciduous forest and 1.38% by mixed plantation, which is very threatening to the environmental stability of the forest; future plantation sites have also been mapped. With the use of the latest geo-informatics technology, the proper and exact condition of the trees can be observed and necessary precautions can be taken for future plantation work in an appropriate manner.
Keywords: RS, GIS, GPS, Forest Type, Tree Plantation
Factors influencing compressive strength of geopolymer concrete – eSAT Journals
Abstract
This study examines the effects of several factors on the compressive strength of fly ash based geopolymer concrete, along with a cost comparison with normal concrete. The test variables were the molarity of sodium hydroxide (NaOH): 8 M, 14 M and 16 M; the ratio of NaOH to sodium silicate (Na2SiO3): 1, 1.5, 2 and 2.5; the alkaline liquid to fly ash ratio: 0.35 and 0.40; and replacement of the water in the Na2SiO3 solution by 10%, 20% and 30%. The test results indicated that the highest compressive strength, 54 MPa, was observed for 16 M NaOH, a NaOH to Na2SiO3 ratio of 2.5 and an alkaline liquid to fly ash ratio of 0.35. The lowest compressive strength, 27 MPa, was observed for 8 M NaOH, a ratio of 1 and an alkaline liquid to fly ash ratio of 0.40. An alkaline liquid to fly ash ratio of 0.35 with water replacement of 10% and 30% at 8 and 16 molarity of NaOH resulted in compressive strengths of 36 MPa and 20 MPa respectively. A superplasticiser dosage of 2% by weight of fly ash gave higher strength in all cases.
Keywords: compressive strength, alkaline liquid, fly ash
Experimental investigation on circular hollow steel columns in filled with li... – eSAT Journals
Abstract
Composite circular hollow steel tubes, with and without GFRP infill, for three different grades of lightweight concrete were tested for ultimate load capacity and axial shortening under cyclic loading. Steel tubes of different lengths, cross sections and thicknesses were compared. Specimens were tested after adopting Taguchi's L9 (Latin squares) orthogonal array in order to save on the number of specimens and the experimental duration. Analysis was carried out using the Artificial Neural Network (ANN) technique with the assistance of Minitab, a statistical software tool. The comparison of predicted, experimental and ANN outputs is obtained from linear regression plots. From this research study it can be concluded that the cross-sectional area of the steel tube has the most significant effect on ultimate load carrying capacity, that the load carrying capacity decreased as the length of the steel tube increased, and that the ANN model predicted acceptable results. Thus the ANN tool can be utilized for predicting the ultimate load carrying capacity of composite columns.
Keywords: Light weight concrete, GFRP, Artificial Neural Network, Linear Regression, Back propagation, Orthogonal Array, Latin Squares
Experimental behavior of circular HSSCFRC filled steel tubular columns under ... – eSAT Journals
Abstract
This paper presents an outlook on experimental behavior, and a comparison with a predictive formula, for circular concentrically loaded self-consolidating fibre reinforced concrete filled steel tube columns (HSSCFRC). Forty-five specimens were tested. The main parameters varied in the tests were: (1) the percentage of fiber; (2) the tube diameter (or width) to wall thickness ratio (D/t from 15 to 25); and (3) the L/d ratio, from 2.97 to 7.04. The results from these predictions were compared with the experimental data, and the experimental results were also validated in this study.
Keywords: Self-compacting concrete; Concrete-filled steel tube; axial load behavior; Ultimate capacity.
Evaluation of punching shear in flat slabs – eSAT Journals
Abstract
Flat-slab construction is widely used today because of the many advantages it offers. The basic philosophy in the design of flat slabs is to consider only gravity forces; this approach ignores the effect of punching shear due to unbalanced moments at the slab-column junction, which is critical. An attempt has been made to generate generalized design sheets which account for both punching shear due to gravity loads and unbalanced moments for the cases of (a) an interior column; (b) an edge column (bending perpendicular to the shorter edge); (c) an edge column (bending parallel to the shorter edge); and (d) a corner column. These design sheets are prepared as per the codal provisions of IS 456-2000. They will be helpful in calculating the shear reinforcement to be provided at the critical section, which is ignored in many design offices. Apart from their usefulness in evaluating punching shear and the necessary shear reinforcement, the design sheets developed will enable the designer to fix the depth of the flat slab during the initial phase of the design.
Keywords: Flat slabs, punching shear, unbalanced moment.
Evaluation of performance of intake tower dam for recent earthquake in indiaeSAT Journals
Abstract
Intake towers are typically tall, hollow, reinforced concrete structures that form the entrance to reservoir outlet works. A parametric
study on the dynamic behaviour of circular cylindrical towers is carried out to examine the effects of depth of submergence, wall
thickness and slenderness ratio, as well as the response of the tower under dynamic (time history) analysis for different soil
conditions, with the added hydrodynamic mass of the surrounding and inside water accounted for using the approach of Goyal and
Chopra.
Key words: Hydrodynamic mass, Depth of submergence, Reservoir, Time history analysis
Evaluation of operational efficiency of urban road network using travel time ...eSAT Journals
Abstract
The efficiency of a road network system is analyzed using travel time reliability measures. This study examines important measures of
travel time reliability and prioritizes the Tiruchirappalli road network. Traffic volume and travel time were collected using the
license plate matching method. Travel time measures were estimated from the average travel time and the 95th percentile travel time.
The effect of non-motorized vehicles on the efficiency of the road system was evaluated, and the relation between buffer time index
and traffic volume was established. A travel time model was developed and the travel time measure validated. The service quality of
road sections in the network was then graded based on travel time reliability measures.
Keywords: Buffer Time Index (BTI); Average Travel Time (ATT); Travel Time Reliability (TTR); Buffer Time (BT).
Estimation of surface runoff in nallur amanikere watershed using scs cn methodeSAT Journals
Abstract
The development of watershed aims at productive utilization of all the available natural resources in the entire area extending from
ridge line to stream outlet. The per capita availability of land for cultivation has been decreasing over the years. Therefore, water and
the related land resources must be developed, utilized and managed in an integrated and comprehensive manner. Remote sensing and
GIS techniques are being increasingly used for planning, management and development of natural resources. The study area, Nallur
Amanikere watershed geographically lies between 11° 38’ and 11° 52’ N latitude and 76° 30’ and 76° 50’ E longitude with an area of
415.68 sq. km. Thematic layers such as land use/land cover and soil maps were derived from remotely sensed data and overlaid
through ArcGIS software to assign curve numbers polygon-wise. The daily rainfall data of six rain gauge stations in and around
the watershed (2001-2011) were used to estimate the daily runoff from the watershed using the Soil Conservation Service Curve
Number (SCS-CN) method. The runoff estimated from the SCS-CN model was then used to study the variation of runoff potential with
different land use/land cover and different soil conditions.
Keywords: Watershed, Nallur watershed, Surface runoff, Rainfall-Runoff, SCS-CN, Remote Sensing, GIS.
Estimation of morphometric parameters and runoff using rs & gis techniqueseSAT Journals
Abstract
Land and water are the two vital natural resources, the optimal management of these resources with minimum adverse environmental
impact are essential not only for sustainable development but also for human survival. Satellite remote sensing with geographic
information system has a pragmatic approach to map and generate spatial input layers of predicting response behavior and yield of
watershed. Hence, in the present study an attempt has been made to understand the hydrological process of the catchment at the
watershed level by drawing inferences from morphometric analysis and runoff. The study area chosen for the present study is the
Yagachi catchment, situated in Chikmagalur and Hassan districts, lying geographically at longitude 75°52’08.77”E and latitude
13°10’50.77”N. It covers an area of 559.493 sq. km. Morphometric analysis is carried out to estimate morphometric parameters at
the micro-watershed level in order to understand the hydrological response of the catchment. Daily runoff
is estimated using USDA SCS curve number model for a period of 10 years from 2001 to 2010. The rainfall runoff relationship of the
study shows there is a positive correlation.
Keywords: morphometric analysis, runoff, remote sensing and GIS, SCS - method
Effect of variation of plastic hinge length on the results of non linear anal...eSAT Journals
Abstract The nonlinear static procedure, also well known as pushover analysis, is a method wherein monotonically increasing loads are applied to the structure until the structure is unable to resist any further load. It is a popular tool for seismic performance evaluation of existing and new structures. In the literature a lot of research has been carried out on conventional pushover analysis, and after identifying its deficiencies, efforts have been made to improve it. However, actual test results to verify the analytically obtained pushover results are rarely available, and some amount of variation is always expected in the seismic demand prediction of pushover analysis. An initial study is carried out considering user-defined hinge properties and the default hinge length. An attempt is then made to assess the variation of pushover analysis results by considering user-defined hinge properties and the various hinge length formulations available in the literature, and the results are compared with experimental results from tests carried out on a G+2 storied RCC framed structure. For the present study two geometric models, a bare frame and a rigid frame model, are considered, and it is found that the results of pushover analysis are very sensitive to the geometric model and hinge length adopted. Keywords: Pushover analysis, Base shear, Displacement, hinge length, moment curvature analysis
Effect of use of recycled materials on indirect tensile strength of asphalt c...eSAT Journals
Abstract
Depletion of natural resources and aggregate quarries for the road construction is a serious problem to procure materials. Hence
recycling or reuse of material is beneficial. On emphasizing development in sustainable construction in the present era, recycling of
asphalt pavements is one of the effective and proven rehabilitation processes. For the laboratory investigations reclaimed asphalt
pavement (RAP) from NH-4 and crumb rubber modified binder (CRMB-55) was used. Foundry waste was used as a replacement to
conventional filler. Laboratory tests were conducted on asphalt concrete mixes with 30, 40, 50, and 60 percent replacement with RAP.
These test results were compared with conventional mixes and asphalt concrete mixes with complete binder extracted RAP
aggregates. Mix design was carried out by Marshall Method. The Marshall Tests indicated highest stability values for asphalt
concrete (AC) mixes with 60% RAP. The optimum binder content (OBC) decreased with an increase of RAP in the AC mixes. The
Indirect Tensile Strength (ITS) of AC mixes with RAP was also found to be higher than that of conventional AC mixes at 30°C.
Keywords: Reclaimed asphalt pavement, Foundry waste, Recycling, Marshall Stability, Indirect tensile strength.
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Dr.Costas Sachpazis
Terzaghi's soil bearing capacity theory, developed by Karl Terzaghi, is a fundamental principle in geotechnical engineering used to determine the bearing capacity of shallow foundations. This theory provides a method to calculate the ultimate bearing capacity of soil, which is the maximum load per unit area that the soil can support without undergoing shear failure. The Calculation HTML Code included.
6th International Conference on Machine Learning & Applications (CMLA 2024)ClaraZara1
6th International Conference on Machine Learning & Applications (CMLA 2024) will provide an excellent international forum for sharing knowledge and results in the theory, methodology and applications of Machine Learning.
Saudi Arabia stands as a titan in the global energy landscape, renowned for its abundant oil and gas resources. It's the largest exporter of petroleum and holds some of the world's most significant reserves. Let's delve into the top 10 oil and gas projects shaping Saudi Arabia's energy future in 2024.
Hierarchical Digital Twin of a Naval Power SystemKerry Sado
A hierarchical digital twin of a Naval DC power system has been developed and experimentally verified. Similar to other state-of-the-art digital twins, this technology creates a digital replica of the physical system executed in real-time or faster, which can modify hardware controls. However, its advantage stems from distributing computational efforts by utilizing a hierarchical structure composed of lower-level digital twin blocks and a higher-level system digital twin. Each digital twin block is associated with a physical subsystem of the hardware and communicates with a singular system digital twin, which creates a system-level response. By extracting information from each level of the hierarchy, power system controls of the hardware were reconfigured autonomously. This hierarchical digital twin development offers several advantages over other digital twins, particularly in the field of naval power systems. The hierarchical structure allows for greater computational efficiency and scalability while the ability to autonomously reconfigure hardware controls offers increased flexibility and responsiveness. The hierarchical decomposition and models utilized were well aligned with the physical twin, as indicated by the maximum deviations between the developed digital twin hierarchy and the hardware.
Forklift Classes Overview by Intella PartsIntella Parts
Discover the different forklift classes and their specific applications. Learn how to choose the right forklift for your needs to ensure safety, efficiency, and compliance in your operations.
For more technical information, visit our website https://intellaparts.com
We have compiled the most important slides from each speaker's presentation. This year’s compilation, available for free, captures the key insights and contributions shared during the DfMAy 2024 conference.
Final project report on grocery store management system..pdfKamal Acharya
In today’s fast-changing business environment, it is extremely important to be able to respond to client needs in the most effective and timely manner; customers increasingly wish to see a business online and have instant access to its products or services.
Online Grocery Store is an e-commerce website which retails various grocery products. This project allows viewing of the various products available and enables registered users to purchase desired products instantly using the Paytm or UPI payment processors (Instant Pay), or to place an order using the Cash on Delivery (Pay Later) option. It also provides easy access for Administrators and Managers to view orders placed using the Pay Later and Instant Pay options.
In order to develop an e-commerce website, a number of Technologies must be studied and understood. These include multi-tiered architecture, server and client-side scripting techniques, implementation technologies, programming language (such as PHP, HTML, CSS, JavaScript) and MySQL relational databases. This is a project with the objective to develop a basic website where a consumer is provided with a shopping cart website and also to know about the technologies used to develop such a website.
This document will discuss each of the underlying technologies to create and implement an e- commerce website.
Overview of the fundamental roles in Hydropower generation and the components involved in wider Electrical Engineering.
This paper presents the design and construction of hydroelectric dams from the hydrologist’s survey of the valley before construction, all aspects and involved disciplines, fluid dynamics, structural engineering, generation and mains frequency regulation to the very transmission of power through the network in the United Kingdom.
Author: Robbie Edward Sayers
Collaborators and co editors: Charlie Sims and Connor Healey.
(C) 2024 Robbie E. Sayers
Cosmetic shop management system project report.pdfKamal Acharya
Buying new cosmetic products is difficult. It can even be scary for those who have sensitive skin and are prone to skin trouble. The information needed to alleviate this problem is on the back of each product, but it is tough to interpret those ingredient lists unless you have a background in chemistry.
Instead of buying and hoping for the best, we can use data science to help us predict which products may be good fits for us. It includes various function programs to do the above mentioned tasks.
Data file handling has been effectively used in the program.
The automated cosmetic shop management system should deal with the automation of the general workflow and administration processes of the shop. The main processes of the system focus on customer requests, where the system is able to search for the most appropriate products and deliver them to the customers. It should help employees quickly identify the cosmetic products that have reached their minimum quantity, keep track of the expiry date of each cosmetic product, and find the rack number in which a product is placed. It is also a faster and more efficient way of working.
An input enhancement technique to maximize the performance of page replacement algorithms
1. IJRET: International Journal of Research in Engineering and Technology eISSN: 2319-1163 | pISSN: 2321-7308
_______________________________________________________________________________________
Volume: 04 Issue: 06 | June-2015, Available @ http://www.ijret.org 302
AN INPUT ENHANCEMENT TECHNIQUE TO MAXIMIZE THE
PERFORMANCE OF PAGE REPLACEMENT ALGORITHMS
Mahesh Kumar M R¹, Renuka Rajendra B²
¹Department of CSE, JSSATE, Bengaluru, India
²Department of CSE, JSSATE, Bengaluru, India
Abstract
One of the main goals of memory management is to allow multiprogramming. Several strategies are used to allocate memory to the
processes that need it. These strategies require that the entire process be in memory before execution, and some of them suffer from
fragmentation. Virtual memory does not require the entire process to be in memory before execution; it loads into memory only those
pages of the process that are needed for execution. This is achieved using a paging scheme, with a number of frames in memory to
accommodate the pages. Whenever a page in memory needs to be replaced, a page replacement algorithm is used. The performance of
these page replacement algorithms depends on the total number of page faults incurred. In this paper, two new algorithms, called
Comparison Counting FIFO (CC-FIFO) and Distribution Counting FIFO (DC-FIFO), are proposed using an input enhancement
technique. Our results and calculations show that the proposed algorithms minimize the page fault rate compared to the FIFO, LRU
and Optimal page replacement algorithms.
Keywords-Page replacement; Page fault; FIFO; OPT; LRU; Comparison Counting; Distribution Counting;
-------------------------------------------------------------------***-------------------------------------------------------------------
1. INTRODUCTION
Memory management allows multiprogramming, so operating systems keep several processes in memory. We need to allocate memory
in the most efficient way possible. Memory can be allocated using multiple-partition schemes or the variable-partition scheme (first
fit, best fit and worst fit) [4], but some of these strategies suffer from the fragmentation problem. This problem can be avoided using
the paging and segmentation memory management schemes. All these schemes require the entire process to be in physical memory
before it can execute.
Virtual memory does not require the entire process to be in memory before execution, because a user often does not need the entire
process: at any point of time the user may be interested in only a few pages of the process rather than the whole of it. In such a case,
only those pages that are demanded during execution are brought from the disk into memory. Pages of the process that are never
demanded are never loaded into memory. This technique is called demand paging [4].
If the process tries to access a page that was not brought from the disk into memory, a page fault occurs. To identify page faults we
use a valid/invalid bit scheme: if the bit for a page is set to valid, that page is in memory; otherwise the page is not in memory and
currently resides on the disk [4].
A page fault may occur when the desired page is not in memory, or when the desired page is on the disk and ready to be brought in
but memory is full. At this point, to allow execution to continue, either the page is brought from the disk into a free frame, or one of
the pages already in memory is replaced using a page replacement algorithm. Among all the page replacement algorithms, optimal
page replacement is the most efficient, as it has the lowest page fault rate.
In an attempt to reduce page faults compared to the existing algorithms, we propose two new algorithms, CC-FIFO and DC-FIFO,
using input enhancement techniques. Our algorithms incur fewer page faults than the FIFO, OPT and LRU replacement algorithms.
The remainder of the paper is organised as follows. Section II describes the different page replacement algorithms. Section III
describes related work in the area of page replacement algorithms. Section IV describes the input enhancement techniques of
comparison counting and distribution counting. Section V describes the proposed algorithms. Section VI presents results and
observations for a few test cases. Section VII concludes with a summary.
2. PAGE REPLACEMENT ALGORITHMS
There are several page replacement algorithms that determine which page to replace in order to accommodate a desired page in
memory.
2.1 First In First Out (FIFO)
This algorithm replaces the page that has been in memory for the longest period of time. To do this, the system must keep track of the
order in which pages enter memory [5]. This algorithm can result in more page faults even when the number of frames is increased, a
phenomenon called Belady’s anomaly.
2.2 Optimal or OPT
This algorithm replaces the page that will not be referenced again for the longest period of time in the future. It never suffers from
Belady’s anomaly.
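A minimal sketch of the optimal policy, evicting the resident page whose next use lies farthest in the future (or that is never used again); the function name is ours, as the paper provides no code:

```python
def opt_faults(ref_string, num_frames):
    """Count page faults under the Optimal (farthest next use) policy."""
    frames = []
    faults = 0
    for i, page in enumerate(ref_string):
        if page in frames:
            continue
        faults += 1
        if len(frames) < num_frames:
            frames.append(page)
            continue
        future = ref_string[i + 1:]
        # Evict the page whose next reference is farthest away (or never).
        victim = max(frames,
                     key=lambda p: future.index(p) if p in future else len(future))
        frames[frames.index(victim)] = page
    return faults
```

For the two reference strings of Section VI with three frames, this yields 9 and 11 faults, agreeing with Figs. 2 and 5.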
2.3 Least Recently Used (LRU)
This algorithm replaces a page by considering the past behaviour of the pages in memory: it replaces the page that has gone unused
for the longest period of time.
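The LRU policy just described can be sketched by keeping the resident pages ordered by recency (again an illustrative helper, not from the paper):

```python
def lru_faults(ref_string, num_frames):
    """Count page faults under LRU replacement."""
    frames = []                    # least recently used page at the front
    faults = 0
    for page in ref_string:
        if page in frames:
            frames.remove(page)    # refresh this page's recency
        else:
            faults += 1
            if len(frames) == num_frames:
                frames.pop(0)      # evict the least recently used page
        frames.append(page)        # most recently used page at the back
    return faults
```

On the two reference strings of Section VI with three frames, this counts 12 and 15 faults, matching Figs. 3 and 6.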
2.4 Second Chance Algorithm
This algorithm is a variation of FIFO. It checks the reference bit of the oldest page: if the bit is off, the algorithm immediately selects
the page for replacement; otherwise it turns the bit off and moves the page to the end of the FIFO queue [5].
2.5 Least Frequently Used (LFU)
This algorithm replaces the page that is least frequently used. To do this, the number of references made to each page is counted, and
the page with the smallest count is replaced.
2.6 Most Frequently Used (MFU)
This algorithm replaces the page that is most frequently used, i.e. the one with the largest reference count.
3. RELATED WORK DONE
In recent years many researchers have come forward with their work and ideas in the area of page replacement algorithms.
Wang Hong compared and analysed the page faults and page fault frequency of different algorithms [1].
Pooja Khulbe and Shruti Pant suggested an advanced version of the least recently used algorithm that reduces the overall page fault
rate [8].
Yogesh Niranjan and Shailendra Tiwari proposed a new page replacement algorithm for proxy servers [6].
Anupam Bhattacharjee and Biplob Kumar Debnath proposed a new randomized algorithm approximating the random replacement
and least recently used schemes for page replacement in web caches [3].
Ali Khosrozadeh and Sanaz Pashmforoush proposed a page replacement algorithm based on LRU, considering certain parameters
that can offer better improvement [7].
Guangxia Xu, Lingling Ren and Yanbing Liu proposed an efficient page replacement algorithm called FAPRA for NAND flash
memory based storage devices [2].
4. INPUT ENHANCEMENT TECHNIQUES
The idea is to pre-process the problem’s input and store the additional information obtained in order to accelerate solving the
problem afterward [9]. The following two input enhancement methods are used in our proposed algorithms.
4.1 Comparison Counting
In this sorting method, for each element of the list to be sorted, the total number of elements smaller than it is counted and the results
are stored in a table. These numbers indicate the positions of the elements in the sorted list [9].
Consider the page reference string {7, 0, 1, 2, 0, 3, 0, 4, 2, 3, 0, 3, 2, 1, 2, 0, 1, 7, 0, 1}. The number of occurrences of each page is
{7-2₁, 0-6₂, 1-4₃, 2-4₄, 3-3₅, 4-1₆}; the subscripts are tags used for future reference.
The occurrence counts obtained in the above step, {2₁, 6₂, 4₃, 4₄, 3₅, 1₆}, are the input to the comparison counting algorithm.
Once the algorithm is applied, the occurrence counts are sorted in increasing order: {1₆, 2₁, 3₅, 4₄, 4₃, 6₂}.
Each entry of the sorted list is then replaced by its page, repeated as many times as the page occurs, as determined in the first step.
The new page reference string is {4, 7, 7, 3, 3, 3, 2, 2, 2, 2, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0}. This is used as the new page reference string
in our proposed algorithm.
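The preprocessing above can be sketched in Python. The paper does not specify how ties between equal counts are ordered (pages 1 and 2 both occur four times), so this sketch breaks ties by descending page number purely to reproduce the worked example; the helper name is ours:

```python
from collections import Counter

def comparison_count_reorder(ref_string):
    """Build the new reference string of Section 4.1: each page's
    references grouped together, pages in increasing order of count."""
    counts = Counter(ref_string)          # e.g. {7: 2, 0: 6, 1: 4, ...}
    # Sort pages by ascending occurrence count; ties broken by
    # descending page number (an assumption, to match the example).
    ordered = sorted(counts.items(), key=lambda kv: (kv[1], -kv[0]))
    new_string = []
    for page, count in ordered:
        new_string.extend([page] * count)  # repeat each page count times
    return new_string
```

Any tie order gives the same CC-FIFO fault count, since each page's references remain contiguous either way.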
4.2 Distribution Counting
This algorithm sorts the n elements of an array. It is recommended when the input array consists of duplicate, random elements that
are known to come from a fixed set. Consider the array A with elements {1, 2, 3, 4, 2, 1, 5, 6, 2, 1, 2, 3, 7, 6, 3, 2, 1, 2, 3, 6}.
The elements of the array are known to come from the set S = {1, 2, 3, 4, 5, 6, 7}, whose lower bound is 1 and upper bound is 7. If
one of the elements of the set, say 5, did not appear in the input array, the input would not be considered to be derived from the set
and this algorithm could not be applied.
After analysing the input, the frequency of each element of the set in the input array and the distribution values (cumulative
frequencies) for the elements of the set are calculated. For the above example the calculations are shown below.
Table 1
Set S                1    2    3    4    5    6    7
Frequency            4    6    4    1    1    3    1
Distribution D       4   10   14   15   16   19   20
Now process the input array elements from right to left. Consider the last element of the array: in the above example A[19] = 6.
Subtract the lower bound from it: 6 - 1 = 5. This value is used to access D[5] = 19, which is then decremented by 1: 19 - 1 = 18.
Element A[19] is therefore inserted at position B[18] of the new array B; this is its position in the sorted array. This step is repeated
for each element of the given array, from right to left.
After this, all the elements of the new array B are in sorted order. For the above array, the final sorted order after applying the
algorithm is {1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 4, 5, 6, 6, 6, 7}. This is used as the new page reference string in our proposed
algorithm.
Since the set of values is fixed, the time efficiency of this algorithm is linear.
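The steps above can be sketched as a standard counting sort over the known value range (function name ours; `lo` and `hi` are the set's bounds):

```python
def distribution_counting_sort(arr, lo, hi):
    """Distribution counting sort for values known to lie in [lo, hi]."""
    freq = [0] * (hi - lo + 1)
    for v in arr:                     # frequency of each set element
        freq[v - lo] += 1
    dist = freq[:]                    # distribution values D
    for i in range(1, len(dist)):
        dist[i] += dist[i - 1]        # D[i] = count of elements <= lo + i
    out = [None] * len(arr)
    for v in reversed(arr):           # process the input right to left
        dist[v - lo] -= 1             # e.g. D[5] = 19 -> 18 for A[19] = 6
        out[dist[v - lo]] = v         # place v at its sorted position
    return out
```

Applying it to the example array A with lo = 1 and hi = 7 reproduces the sorted string given above.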
5. PROPOSED ALGORITHM
The main goal of our proposed algorithms is to minimize the total number of page faults, compared to the FIFO, OPT and LRU page
replacement algorithms, by obtaining a new page reference string using the algorithms above.
5.1 Comparison Counting FIFO (CC-FIFO)
Algorithm
If the reference string contains random, duplicate elements and the elements are not known to be derived from a set, we apply the
algorithm below.
1) Read an array containing the page reference string and the number of frames in memory from the user.
2) Analyse the input carefully.
3) Apply the comparison counting algorithm to sort the list (shown in Section IV).
4) The elements are now in sorted order.
5) Apply the FIFO page replacement algorithm to the new page reference string obtained in the above step.
6) The number of page faults obtained using our algorithm is lower than that of the other page replacement algorithms.
5.2 Distribution Counting FIFO (DC-FIFO)
Algorithm
If the page reference string contains random, duplicate elements and the elements are known to come from a set, we apply the
algorithm below.
1) Read an array containing the page reference string and the number of frames in memory from the user.
2) Analyse the input carefully.
3) Apply the distribution counting algorithm to sort the entire array (shown in Section IV).
4) The elements are now in sorted order.
5) Apply the FIFO page replacement algorithm to the new page reference string obtained in the above step.
6) The number of page faults obtained using our algorithm is lower than that of the other page replacement algorithms.
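Analogously, the DC-FIFO steps can be sketched by chaining distribution counting into a FIFO simulation (function name and the `lo`/`hi` set bounds are our illustrative choices):

```python
from collections import deque

def dc_fifo_faults(ref_string, lo, hi, num_frames):
    """DC-FIFO sketch: sort the reference string with distribution
    counting over the known set [lo, hi], then run plain FIFO on it."""
    freq = [0] * (hi - lo + 1)
    for p in ref_string:
        freq[p - lo] += 1
    dist = freq[:]
    for i in range(1, len(dist)):
        dist[i] += dist[i - 1]            # distribution values D
    new_string = [None] * len(ref_string)
    for p in reversed(ref_string):        # right-to-left placement
        dist[p - lo] -= 1
        new_string[dist[p - lo]] = p
    frames, faults = deque(), 0
    for p in new_string:                  # standard FIFO on sorted string
        if p not in frames:
            faults += 1
            if len(frames) == num_frames:
                frames.popleft()
            frames.append(p)
    return faults
```

For case 2 of Section VI (set {1, ..., 7}, three frames) this counts 7 faults, one per distinct page, since the sorted string groups each page's references together.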
6. RESULTS AND OBSERVATIONS
Before applying the steps of the proposed method, we consider two different cases, one for each of the comparison counting and
distribution counting FIFO algorithms. For each case we run the traditional algorithms and calculate their page faults. In the next
section we apply the proposed algorithms and compare the results of these two cases with our new algorithms to show the improved
performance.
Case 1: Consider the page reference string
{7, 0, 1, 2, 0, 3, 0, 4, 2, 3, 0, 3, 2, 1, 2, 0, 1, 7, 0, 1} and assume three frames in memory. Fig. 1 to Fig. 3 show the behaviour of
the FIFO, OPT and LRU page replacement algorithms.
Fig 1: FIFO Page Replacement algorithm
Total number of page faults in FIFO = 15
Fig 2: Optimal Page Replacement algorithm
Total number of page faults in Optimal = 9
Fig 3: LRU Page Replacement algorithm
Total number of page faults in LRU = 12
Case 2: Consider the page reference string
{1, 2, 3, 4, 2, 1, 5, 6, 2, 1, 2, 3, 7, 6, 3, 2, 1, 2, 3, 6} and assume three frames in memory. Fig. 4 to Fig. 6 show the behaviour of
the FIFO, OPT and LRU page replacement algorithms.
Fig 4: FIFO Page Replacement algorithm
Total number of page faults in FIFO = 16
Fig 5: Optimal Page Replacement algorithm
Total number of page faults in Optimal = 11
Fig 6: LRU Page Replacement algorithm
Total number of page faults in LRU = 15
6.1 Proposed Method
In this section, we apply the proposed algorithms to the two cases to show how the pages can be accessed faster.
Case 1: Consider the page reference string
{7, 0, 1, 2, 0, 3, 0, 4, 2, 3, 0, 3, 2, 1, 2, 0, 1, 7, 0, 1} and assume three frames in memory.
The page references are in random order, contain duplicate elements and are not known to come from a set, so we apply the
CC-FIFO algorithm of Section V to obtain the new page reference string. The new page reference string after sorting is {4, 7, 7, 3, 3,
3, 2, 2, 2, 2, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0}. The remaining steps of the algorithm are then applied to complete the task. Fig. 7 shows the
behaviour of the CC-FIFO algorithm on the new page reference string, and Table II compares all the algorithms with our proposed
algorithm for case 1.
Fig 7: CC-FIFO algorithm
Total number of page faults in CC-FIFO = 6
Table 2
Algorithm     Total number of page faults
FIFO          15
Optimal        9
LRU           12
CC-FIFO        6
Case 2: Consider the page reference string
{1, 2, 3, 4, 2, 1, 5, 6, 2, 1, 2, 3, 7, 6, 3, 2, 1, 2, 3, 6} and assume three frames in memory.
The page references are in random order, contain duplicate elements and are known to come from the set {1, 2, 3, 4, 5, 6, 7}, so we
apply the DC-FIFO algorithm of Section V to obtain the new page reference string. The new page reference string after applying the
algorithm is {1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 4, 5, 6, 6, 6, 7}. The remaining steps of the algorithm are then applied to complete
the task. Fig. 8 shows the behaviour of the DC-FIFO algorithm on the new page reference string, and Table III compares all the
algorithms with our proposed algorithm for case 2.
Fig 8: DC-FIFO algorithm
Total number of page faults in DC-FIFO = 7
Table 3
Algorithm     Total number of page faults
FIFO          16
Optimal       11
LRU           15
DC-FIFO        7
6.2 Comparison
Fig. 9 compares the total number of page faults of FIFO, Optimal and LRU with our CC-FIFO replacement algorithm for case 1. The
figure shows that our new algorithm CC-FIFO results in fewer page faults than the other replacement algorithms.
Fig 9: Comparison of page faults for case1
Fig. 10 compares the total number of page faults of FIFO, Optimal and LRU with our DC-FIFO replacement algorithm for case 2.
The figure shows that our new algorithm DC-FIFO results in fewer page faults than the other replacement algorithms.
Fig 10: Comparison of page faults for case2
7. CONCLUSION
The performance of page replacement algorithms such as FIFO, OPT, LRU, Second Chance, LFU and MFU is measured in terms of
the number of page faults, and the algorithm with the lowest page fault rate is preferred. Among these algorithms, optimal page
replacement is the most efficient, since it results in the lowest page fault rate of all. In an attempt to improve on this, we have
proposed two new algorithms, called Comparison Counting FIFO (CC-FIFO) and Distribution Counting FIFO (DC-FIFO), using an
input enhancement technique, and implemented them for two different reference strings. Our results and calculations show that the
proposed algorithms reduce the total number of page faults compared to the general page replacement algorithms.
REFERENCES
[1] Wang Hong, “Study of Page Replacement Algorithms based on Experiment”, International Conference on Mechanical
Engineering and Automation, Advances in Biomedical Engineering, volume 10, 2012.
[2] Guangxia Xu, Lingling Ren and Yanbing Liu, “Flash Aware Page Replacement Algorithm”, Mathematical Problems in
Engineering, Hindawi Publishing Corporation, volume 2014.
[3] Anupam Bhattacharjee and Biplob Kumar Debnath, “A New Web Cache Replacement Algorithm”, IEEE Conference on
Communications, Computers and Signal Processing, August 2005.
[4] Silberschatz, Galvin and Gagne, “Operating System Concepts”, 8th edition, John Wiley and Sons, 2012.
[5] Deitel, Deitel and Choffnes, “Operating Systems”, 3rd edition, Pearson Education, 2009.
[6] Yogesh Niranjan and Shailendra Tiwari, “Design and Implementation of Page Replacement Algorithm for Web Proxy
Caching”, IJCTA, volume 4, April 2013.
[7] Ali Khosrozadeh and Sanaz Pashmforoush, “Presenting a novel Page Replacement Algorithm based on LRU”, Journal of Basic
and Applied Scientific Research, 2012.
[8] Pooja Khulbe and Shruti Pant, “Hybrid (LRU) Page Replacement Algorithm”, IJCA, volume 91, April 2014.
[9] Anany Levitin, “Introduction to the Design and Analysis of Algorithms”, 2nd edition, Pearson, 2008.