Abstract: Spatio-temporal data is any information that refers to both space and time. Such data is frequently updated, often at rates on the order of 1 TB/hr, which greatly challenges our ability to digest it; as a result, it is hard to extract precise information from the data. This research therefore offers an innovative approach to discovering trends in multi-dimensional spatio-temporal datasets. It briefly describes the scope and relevance of spatio-temporal data and, from that, builds an in-depth view of recent spatio-temporal research on trend discovery. Keywords: Spatio-temporal Data, Applications of Spatio-temporal Data, Problem Definition, Contributions
A Semantic Approach to Retrieving, Linking, and Integrating Heterogeneous Geospatial Data (Craig Knoblock)
Ying Zhang, Yao-Yi Chang, Pedro Szekely, and Craig A. Knoblock, A Semantic Approach to Retrieving, Linking, and Integrating Heterogeneous Geospatial Data, Proceedings of the 2013 IJCAI Workshop on Semantic Cities, 2013
Gravitational Search Algorithm with Chaotic Map (GSA-CM) for Solving Optimization Problems (eSAT Journals)
Abstract
Gravitational Search Algorithm (GSA) is a recent nature-inspired heuristic algorithm that utilizes Newton's law of gravity and mass interactions. It has attracted much attention because it delivers high performance on various optimization problems. This study hybridizes GSA with chaotic equations: ten chaotic-map-based GSA (GSA-CM) variants, each defining its random selections through a different chaotic map, have been developed. The proposed methods have been applied to the minimization of benchmark problems and the results compared. The numerical results show that most of the proposed algorithms improve the performance of GSA and the quality of its solutions.
Keywords: Computational Intelligence, Evolutionary Computation, Heuristic Algorithms, Chaotic Maps, Optimization Methods.
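The chaotic maps that drive GSA-CM's random selections can be illustrated with the logistic map, one commonly used choice; the abstract does not name the ten maps used, so this particular map and the helper name are assumptions:

```python
def logistic_map(x0, n, r=4.0):
    """Generate n values in [0, 1] from the chaotic logistic map x <- r*x*(1-x).

    At r = 4 the map is fully chaotic, so the sequence can stand in for
    uniform random draws inside a metaheuristic such as GSA.
    """
    values = []
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        values.append(x)
    return values
```

A GSA-CM-style variant would then consume `logistic_map(...)` values wherever the base algorithm calls its random number generator.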
A Hybrid Approach for Analysis of Dynamic Changes in Spatial Data (ijdms)
Any geographic location undergoes changes over a period of time. These changes can be observed by the naked eye only if they are large in number and spread over a small area. However, when the changes are small and spread over a large area, it is very difficult to observe or extract them. Presently, few methods are available for tackling these types of problems, such as GRID and DBSCAN. However, these existing mechanisms are not adequate for finding accurate changes or observations, which is essential with respect to the most important geometrical changes such as deforestation and land grabbing. This paper proposes a new mechanism to solve the above problem. In the proposed method, spatial image changes captured by satellite are compared over a period of time. Partitioning the satellite image into grids, as employed in the proposed hybrid method, provides finer details of the image, which improves the precision of clustering compared to manipulating the whole image at a time, as done in DBSCAN. The simplicity of DBSCAN is exploited while processing each partitioned grid cell.
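The grid-partitioning step of the hybrid method can be sketched as follows; the cell size and function name are assumptions, and a density-based pass such as DBSCAN would then run within each cell:

```python
from collections import defaultdict

def partition_into_grid(points, cell_size):
    """Assign each changed pixel (x, y) to a grid cell keyed by cell indices.

    Clustering each cell independently keeps the density scan local,
    which is the precision benefit the hybrid method claims over
    running DBSCAN on the whole image at once.
    """
    cells = defaultdict(list)
    for x, y in points:
        cells[(x // cell_size, y // cell_size)].append((x, y))
    return cells
```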
Principle Component Analysis Based on Optimal Centroid Selection Model for SubSpace Clustering Model (ijtsrd)
Clustering large, sparse, large-scale data is an open research problem in data mining. Discovering significant information through clustering algorithms is often inadequate because most of the data turns out to be non-actionable, and existing clustering techniques are not feasible for time-varying data in high-dimensional space. Hence, subspace clustering addresses these problems through the incorporation of domain knowledge and parameter-sensitive prediction. The sensitiveness of the data is also predicted through a thresholding mechanism. Usability and usefulness are important issues in 3D subspace clustering. The solution is highly beneficial for police departments and law-enforcement organisations to better understand stock issues, providing insights that enable them to track activities and predict their likelihood. Determining the correct dimension is another inconsistent and challenging issue in subspace clustering. In this work, a centroid-based subspace forecasting framework with constraints (must-link and must-not-link, derived from domain knowledge) is proposed. In the unsupervised subspace clustering algorithm, inconsistent constraints correlating to dimensions are resolved through singular value decomposition. Principal component analysis is used to estimate which attributes are actionable, while domain knowledge is used to refine and validate the optimal centroids dynamically. Experimental results show that the proposed framework outperforms competing subspace clustering techniques in terms of efficiency, F-measure, parameter insensitiveness, and accuracy.
G. Raj Kamal, A. Deepika, D. Pavithra, J. Mohammed Nadeem, V. Prasath Kumar, "Principle Component Analysis Based on Optimal Centroid Selection Model for SubSpace Clustering Model", International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN 2456-6470, Volume 4, Issue 4, June 2020. URL: https://www.ijtsrd.com/papers/ijtsrd31374.pdf Paper URL: https://www.ijtsrd.com/computer-science/data-miining/31374/principle-component-analysis-based-on-optimal-centroid-selection-model-for-subspace-clustering-model/g-raj-kamal
Defining Homogenous Climate Zones of Bangladesh Using Cluster Analysis (Premier Publishers)
Climate zones of Bangladesh are identified using the mathematical methodology of cluster analysis. Monthly rainfall data from 34 climate stations, covering 1991 to 2013, are used. Five agglomerative hierarchical clustering methods based on six commonly used proximity measures are chosen to perform the regionalization. In addition, three popular techniques (K-means, fuzzy, and density-based clustering) are applied initially to decide the most suitable method for identifying homogeneous regions. The stability of the clusters is also tested using nine validity indices. The Ward method based on Euclidean distance, K-means, and fuzzy clustering are found most likely to yield acceptable results in this particular case. The analysis identifies seven different climate zones in Bangladesh.
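K-means is one of the clustering methods compared above; a minimal pure-Python 1-D version (a sketch only, not the study's implementation, which clusters full monthly rainfall profiles per station) shows the assign-then-update iteration:

```python
def kmeans_1d(values, k, iters=20):
    """Minimal 1-D k-means, e.g. grouping stations by mean monthly rainfall."""
    data = sorted(values)
    step = max(1, len(data) // k)
    centroids = data[::step][:k]          # spread initial centroids over the range
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:                  # assign each value to nearest centroid
            nearest = min(range(k), key=lambda j: abs(v - centroids[j]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]  # recompute means
    return centroids, clusters
```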
Text Documents Clustering Using Modified Multi-Verse Optimizer (IJECE, IAES)
In this study, a multi-verse optimizer (MVO) is utilised for the text document clustering (TDC) problem. TDC is treated as a discrete optimization problem, and an objective function based on Euclidean distance is applied as the similarity measure. TDC is tackled by dividing the documents into clusters; documents belonging to the same cluster are similar, whereas those belonging to different clusters are dissimilar. MVO, a recent metaheuristic optimization algorithm established for continuous optimization problems, can intelligently navigate different areas of the search space and search deeply within each area using a particular learning mechanism. The proposed algorithm, called MVOTDC, adapts the convergence behaviour of MVO operators to deal with discrete, rather than continuous, optimization problems. For evaluating MVOTDC, a comprehensive comparative study is conducted on six text document datasets with various numbers of documents and clusters. The quality of the final results is assessed using precision, recall, F-measure, entropy, accuracy, and purity measures. Experimental results reveal that the proposed method performs competitively in comparison with state-of-the-art algorithms. Statistical analysis also shows that MVOTDC produces significant results in comparison with three well-established methods.
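The objective described above, Euclidean distance to cluster centroids, can be written down directly. This sketch assumes documents are already vectorized (e.g. by TF-IDF), and the function name is an assumption, not the paper's code:

```python
import math

def total_distance(vectors, labels, k):
    """TDC-style objective: sum of Euclidean distances from each document
    vector to the centroid of its assigned cluster (lower is better)."""
    total = 0.0
    for c in range(k):
        members = [v for v, lab in zip(vectors, labels) if lab == c]
        if not members:
            continue  # empty cluster contributes nothing
        centroid = [sum(col) / len(members) for col in zip(*members)]
        total += sum(math.dist(v, centroid) for v in members)
    return total
```

An optimizer such as MVO would search over the discrete `labels` assignment to minimize this value.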
A Novel Multi-Viewpoint Based Similarity Measure for Document Clustering (IJMER)
CDS Based Energy Efficient Topology Control Algorithm in Wireless Sensor Networks (eSAT Journals)
Abstract: Wireless Sensor Networks (WSNs) are self-organized networks consisting of a large number of sensor nodes that collect data in various environments [1, 2]. The sensors run on batteries with limited lifetime, so it is a challenge to create an energy-efficient network that reduces energy consumption and interference in the network graph and thereby extends the network lifetime [2]. Topology control is a well-known technique for saving energy and extending network lifetime in WSNs, and the widely used topology control strategy is the construction of a Connected Dominating Set (CDS) [3, 4]. In this paper, we construct a CDS based energy efficient topology control algorithm, GCDSTC, for WSNs. The performance analysis studies the GCDSTC algorithm in terms of complexity and compares it with the EBTC (Energy Balanced Topology Control) algorithm. The simulation results indicate that the GCDSTC algorithm reduces energy consumption and interference in the network graph, thereby enhancing the network lifetime. Keywords: Wireless Sensor Network (WSN), Connected Dominating Set (CDS), Topology Control (TC).
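The dominating-set idea behind CDS backbones can be illustrated with a plain greedy sketch; note this toy version enforces neither connectivity nor energy awareness, both of which a CDS algorithm like GCDSTC requires, and the function name is an assumption:

```python
def greedy_dominating_set(adj):
    """Greedy dominating set: repeatedly pick the node that covers the most
    still-uncovered nodes (itself plus its neighbors).

    `adj` maps each node to an iterable of its neighbors."""
    uncovered = set(adj)
    dominators = []
    while uncovered:
        best = max(adj, key=lambda n: len(({n} | set(adj[n])) & uncovered))
        dominators.append(best)
        uncovered -= {best} | set(adj[best])
    return dominators
```

In a WSN topology-control setting, the selected dominators would form the routing backbone while the remaining nodes sleep or transmit only to a nearby dominator.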
Simultaneous Real time Graphical Representation of Kinematic Variables Using ... (iosrjce)
IMPROVED NEURAL NETWORK PREDICTION PERFORMANCES OF ELECTRICITY DEMAND: MODIFY... (csandit)
Accurate prediction of electricity demand can bring extensive benefits to any country, as the forecast values help the relevant authorities make decisions regarding electricity generation, transmission, and distribution much more appropriately. The literature reveals that, compared to conventional time series techniques, improved artificial intelligence approaches provide better prediction accuracy. However, the accuracy of predictions using intelligent approaches such as neural networks is strongly influenced by the correct selection of inputs and the number of neuro-forecasters used for prediction. This research shows how a cluster analysis performed to group similar day types can contribute towards selecting a better set of neuro-forecasters in neural networks. Daily total electricity demands over five years were considered for the analysis, and each date was assigned to one of thirteen day types in a Sri Lankan context. As a stochastic trend could be seen over the years, the trend was removed by taking the first difference of the series prior to performing the k-means clustering. Three different clusters were found using Silhouette plots, and thus three neuro-forecasters were used for predictions. This paper illustrates the proposed modified neural network procedure using electricity demand data.
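The detrending step described above, taking the first difference of the demand series before k-means clustering, is simple to state:

```python
def first_difference(series):
    """Remove a stochastic trend by differencing: d[t] = x[t+1] - x[t].

    The differenced series (one element shorter) is what would then be
    fed to the k-means clustering step."""
    return [b - a for a, b in zip(series, series[1:])]
```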
Data Reduction Techniques for High Dimensional Biological Data (eSAT Journals)
Abstract
High dimensional biological datasets have been growing rapidly in recent years. Extracting knowledge from, and analyzing, high-dimensional biological data is one of the key challenges, in which variety and veracity are two distinct characteristics. The questions that arise are: how to perform dimensionality reduction for this heterogeneous data, how to develop a high-performance platform to efficiently analyze high-dimensional biological data, and how to find useful information within it. To discuss this issue in depth, this paper begins with a brief introduction to the data analytics available for biological data, followed by a discussion of big data analytics and then a survey of various data reduction methods for biological data. We propose a dense clustering algorithm for standard high-dimensional biological data.
Keywords: Big Data Analytics, Dimensionality Reduction
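As one concrete and deliberately simple data-reduction baseline in the spirit surveyed above (not the dense clustering algorithm the paper proposes; the function name is an assumption), columns can be ranked by variance and only the top k kept:

```python
def top_variance_features(rows, k):
    """Keep the k columns with the highest variance, a naive reduction baseline.

    Constant (zero-variance) features carry no information for clustering,
    so dropping low-variance columns is a common first pruning step."""
    n = len(rows)
    cols = list(zip(*rows))

    def variance(col):
        mean = sum(col) / n
        return sum((x - mean) ** 2 for x in col) / n

    ranked = sorted(range(len(cols)), key=lambda j: variance(cols[j]),
                    reverse=True)
    keep = sorted(ranked[:k])          # preserve original column order
    return [[row[j] for j in keep] for row in rows], keep
```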
We build a network of locations from geolocalized tweets on Twitter. We then analyze the small-world and scale-free properties of this spatial network. We also use modularity to reveal the structure of neighborhood communities.
PERFORMANCE EVALUATION OF GRID-BASED ROUTING PROTOCOL FOR UNDERWATER WIRELESS... (ijwmn)
In this paper, we have conducted a simulation study to evaluate the Multipath Grid-Based Geographic Routing Protocol (MGGR) under different selected mobility models. The protocol has been evaluated under three mobility models (Random Way Point, Reference Point Group, and the Meandering Current Mobility Model) using the Aqua-Sim NS2-based simulator. An extensive number of experiments have been conducted to assess the packet delivery ratio, the end-to-end delay, and power consumption under different operating conditions. We have noticed that the performance of MGGR is only marginally affected by the mobility behavior of nodes, especially when the speed of node movements is increased. Nevertheless, the evaluation has also shown that MGGR exhibits only average performance when used in coastal areas. This is evident as the MCMM (used primarily to model the movement of nodes in coastal areas) gave the lowest performance compared to the other models.
Study of the Class and Structural Changes Caused By Incorporating the Target ... (ijceronline)
When high dimensional data is processed using various machine learning and pattern recognition techniques, it undergoes several changes. Dimensionality reduction is one such successfully used pre-processing technique for analyzing and representing high dimensional data, and it causes several structural changes in the data. When high-dimensional data is used to extract just the target class from among several spatially scattered classes, the philosophy of dimensionality reduction is to find an optimal subset of features, either from the original space or from a transformed space, using the control set of the target class, and then project the input space onto this optimal feature subspace. This paper is an exploratory analysis carried out to study the class properties and the structural properties that are affected by target-class-guided feature subsetting in particular. K-nearest neighbors and the minimum spanning tree are employed to study the structural properties, and cluster analysis is applied to understand the properties of the target class and the other classes. The experimentation is conducted on the target-class-derived features of selected benchmark data sets, namely IRIS, AVIRIS Indiana Pine, and the ROSIS Pavia University data set. Experimentation is also extended to data represented in the optimal principal components obtained by transforming the subset of features, and the results are compared.
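The structural analysis above relies on k-nearest neighbors; a brute-force k-NN query, sufficient for small benchmark sets such as IRIS, can be sketched as follows (the function name is illustrative):

```python
def k_nearest(points, query, k):
    """Return the k points nearest to `query`, ranked by squared
    Euclidean distance (squaring preserves the ordering, so the
    square root can be skipped)."""
    return sorted(points,
                  key=lambda p: sum((a - b) ** 2
                                    for a, b in zip(p, query)))[:k]
```

Comparing each sample's k-NN set before and after feature subsetting is one way to quantify the structural changes the paper studies.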
A SURVEY ON OPTIMIZATION APPROACHES TO TEXT DOCUMENT CLUSTERING (ijcsa)
Text document clustering is one of the fastest growing research areas because of the availability of a huge amount of information in electronic form. A number of techniques have been introduced for clustering documents in such a way that documents within a cluster have high intra-similarity and low inter-similarity to other clusters. Many document clustering algorithms provide a localized search for effectively navigating, summarizing, and organizing information. A globally optimal solution can be obtained by applying high-speed, high-quality optimization algorithms, which perform a global search over the entire solution space. In this paper, a brief survey of optimization approaches to text document clustering is presented.
Modeling Monthly Average Daily Diffuse Radiation for Dhaka, Bangladesh (eSAT Journals)
Abstract: The diffuse part of solar radiation is one of the elements necessary for the design and evaluation of the energy production of a solar system. However, in most cases where radiometric measurements are made, only global radiation is available. To remedy this situation, this paper presents a model of the scattered radiation measured on a horizontal surface for the capital city of Bangladesh. The correlation established for the chosen site was compared to the work of Liu and Jordan; Page; Collares-Pereira and Rabl; Modi and Sukhatme; and Gupta et al. Keywords: Diffuse Radiation, Clearness Index, Regression Analysis, Horizontal Radiation.
Computational Database for 3D and 2D Materials to Accelerate Discovery (Kamal Choudhary)
The Density functional theory section of JARVIS (JARVIS-DFT) consists of thousands of VASP based calculations for 3D-bulk, single layer (2D), nanowire (1D) and molecular (0D) systems. Most of the calculations are carried out with optB88vDW functional. JARVIS-DFT includes materials data such as: energetics, diffraction pattern, radial distribution function, band-structure, density of states, carrier effective mass, temperature and carrier concentration dependent thermoelectric properties, elastic constants and gamma-point phonons.
MK-Prototypes: A Novel Algorithm for Clustering Mixed Type Data (IJMER)
Clustering mixed-type data is one of the major research topics in data mining. In this paper, a new algorithm for clustering mixed-type data is proposed, in which the concept of a distribution centroid is used to represent the prototype of the categorical variables in a cluster; this is then combined with the mean to represent the prototype of clusters with mixed-type variables. In the method, data is observed from different views and the variables are grouped accordingly. Instances that can be viewed differently from different viewpoints can be defined as multiview data. During the clustering process, the differences among views are usually ignored; here, both view weights and variable weights are computed simultaneously. The view weight is used to determine the closeness or density of a view, and the variable weight is used to identify the significance of each variable. Both weights are used in the distance function to determine the cluster of each object. In the proposed method, k-prototypes is enhanced so that it automatically computes both view and variable weights. The proposed MK-Prototypes algorithm is compared with two other clustering algorithms.
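The distance function described above builds on the classic k-prototypes dissimilarity (a numeric part plus a categorical part). A minimal sketch of that baseline, without the view and variable weights that MK-Prototypes adds, might look like this (names and the `gamma` weight are illustrative):

```python
def mixed_distance(a, b, numeric_idx, gamma=1.0):
    """k-prototypes-style dissimilarity between two mixed-type records:
    squared differences on numeric attributes plus a gamma-weighted
    mismatch count on the remaining categorical attributes."""
    numeric = sum((a[i] - b[i]) ** 2 for i in numeric_idx)
    categorical = sum(1 for i in range(len(a))
                      if i not in numeric_idx and a[i] != b[i])
    return numeric + gamma * categorical
```

MK-Prototypes would additionally scale each term by its learned view and variable weights before summing.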
Energy Saving in Star Marked Refrigerators and Effect of Door Opening Frequency (eSAT Journals)
Abstract
The resurgent economy of India is in need of energy, as it is one of the vital requirements for sustainable development, so it is obvious that India will need a huge increase in energy supply to fulfill its requirements in the coming decades. However, there is a large gap between demand and supply: the all-India average energy shortfall is 9% and the peak demand shortfall is 14% [O'Neill 2009]. Transmission losses in India were around 24% during 2011-12 [Energy Statistics 2013], and energy saved at the point of use is equivalent to 2.5 times the energy produced [Deshpande V.M., 2011]. As such, sincere effort is essential in every aspect where there is a possibility of saving energy. Refrigerators are huge consumers of electrical energy; in India, 13% of residential electricity is used by refrigeration systems [BEE, Energy and Building, 2014]. There is very little study on the effect of energy star marked refrigerators on actual reduction of energy consumption, especially in developing countries. Another important factor contributing to increased electricity consumption is the door opening pattern of refrigerators, yet there is very little study of the effect of door opening frequency on energy consumption, especially under field conditions. In the present work, an experimental study was conducted by keeping the door of a 5-star marked refrigerator open at different frequencies but with the same total opening time. The results showed that frequent opening of the door leads to a substantial increase in energy consumption, and that the increase in frequency is directly related to the increase in consumption. The increase may be more than 100% if the door is opened for even 1 minute after every 3 minutes, or if the door is kept open (by mistake or otherwise) for a very long period (90 minutes).
Keywords: energy, efficiency, refrigerator, star marking, compressor, door opening, frequency
A Review on Implementation of Algorithms for Detection of Diabetic Retinopathy (eSAT Journals)
Abstract: Diabetes is a group of metabolic diseases in which a person has high blood sugar. Diabetic Retinopathy (DR) is caused by abnormalities in the retina due to insufficient insulin in the body, and delayed detection of retinopathy can lead to sudden vision loss. Diabetic patients therefore require regular medical checkups for the effective timing of sight-saving treatment. Automated analysis of Diabetic Retinopathy in diabetic patients is a continuous and stimulating research area: a completely automated screening system for the detection of Diabetic Retinopathy can effectively reduce the burden on the specialist and save cost as well as time. Noise and other disturbances that occur during image acquisition may lead to false detection, and this is overcome by various image processing techniques. Further, different features are extracted, which serve as the guideline to identify and grade the severity of the disease; based on the extracted features, classification of the retinal image as normal or abnormal is carried out. In this paper, we present a detailed study of various screening methods for Diabetic Retinopathy. Many researchers have made attempts to improve accuracy, predictivity, sensitivity, and specificity. Keywords: Color Fundus Images, Diabetic Retinopathy, Exudates, Lesions.
An Approach for Discrimination Prevention in Data Mining (eSAT Journals)
Abstract: In the age of database technologies, a large amount of data is collected and analyzed using data mining techniques. However, the main issues in data mining are potential privacy invasion and potential discrimination. One of the techniques used in data mining for decision making is classification; if the dataset is biased, discriminatory decisions may occur. Therefore, in this paper we review the recent state-of-the-art approaches to antidiscrimination, focusing on discrimination discovery and prevention in data mining. We also study a theoretical proposal for enhancing the quality of the results. Keywords: Antidiscrimination, data mining, direct and indirect discrimination prevention, rule protection, rule generalization, privacy.
Buckling Analysis of Line Continuum with New Matrices of Stiffness and Geometry (eSAT Journals)
Abstract: This research work presents a buckling analysis of a line continuum with new matrices of elastic stiffness and geometric stiffness. The stiffness matrices were developed using the energy variational principle. Deformable nodes were considered at the centre and at the two ends of the continuum, bringing the number of deformable nodes to six. The six-term Taylor-McLaurin shape function was substituted into the strain energy equation and the resulting functional was minimized, yielding the 6 x 6 stiffness matrix used herein. The six-term shape function was also substituted into the geometric work equation and minimized to obtain the 6 x 6 geometric stiffness matrix for buckling analysis. The two matrices, as well as the traditional 4 x 4 matrices, were employed in the classical buckling analysis of four line continua. The results from the new 6 x 6 matrices of stiffness and geometry were very close to the exact results, with an average percentage difference of 2.33%, whereas those from the traditional 4 x 4 and 5 x 5 matrices differed from the exact results by 23.73% and 2.55% on average, respectively. Thus, the newly developed 6 x 6 matrices of stiffness and geometry are suitable for classical buckling analysis of a line continuum. Keywords: 6 x 6 stiffness system; buckling; geometry; line continuum; variational principle; deformable node; shape function; classical; numerical; analysis; beam
Text documents clustering using modified multi-verse optimizer - IJECEIAES
In this study, a multi-verse optimizer (MVO) is utilised for the text document clustering (TDC) problem. TDC is treated as a discrete optimization problem, and an objective function based on the Euclidean distance is applied as the similarity measure. TDC is tackled by dividing the documents into clusters; documents belonging to the same cluster are similar, whereas those belonging to different clusters are dissimilar. MVO, a recent metaheuristic optimization algorithm established for continuous optimization problems, can intelligently navigate different areas of the search space and search deeply within each area using a particular learning mechanism. The proposed algorithm, called MVOTDC, adapts the convergence behaviour of the MVO operators to deal with discrete, rather than continuous, optimization problems. For evaluating MVOTDC, a comprehensive comparative study is conducted on six text document datasets with various numbers of documents and clusters. The quality of the final results is assessed using precision, recall, F-measure, entropy, accuracy, and purity measures. Experimental results reveal that the proposed method performs competitively in comparison with state-of-the-art algorithms. Statistical analysis also shows that MVOTDC can produce significant results in comparison with three well-established methods.
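A minimal sketch of the kind of Euclidean-distance objective such a metaheuristic would minimise; the nearest-centroid formulation, variable names and toy vectors below are illustrative assumptions, not the paper's exact objective:

```python
import numpy as np

def tdc_objective(docs, centroids):
    """Sum of Euclidean distances from each document vector to its
    nearest cluster centroid; a metaheuristic such as MVO would search
    for the centroid set that minimises this value."""
    # pairwise distances: docs (n, d) vs centroids (k, d) -> (n, k)
    dist = np.linalg.norm(docs[:, None, :] - centroids[None, :, :], axis=2)
    labels = dist.argmin(axis=1)                  # nearest-centroid assignment
    cost = dist[np.arange(len(docs)), labels].sum()
    return cost, labels

# toy data: four 2-term "TF-IDF" vectors and two candidate centroids
docs = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
cents = np.array([[1.0, 0.0], [0.0, 1.0]])
cost, labels = tdc_objective(docs, cents)         # labels: [0, 0, 1, 1]
```

A candidate solution (the set of centroids) with a lower `cost` is a better clustering under this measure.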
A Novel Multi-Viewpoint based Similarity Measure for Document Clustering - IJMER
International Journal of Modern Engineering Research (IJMER) is a peer reviewed, online journal. It serves as an international archival forum of scholarly research related to engineering and science education.
International Journal of Modern Engineering Research (IJMER) covers all the fields of engineering and science: Electrical Engineering, Mechanical Engineering, Civil Engineering, Chemical Engineering, Computer Engineering, Agricultural Engineering, Aerospace Engineering, Thermodynamics, Structural Engineering, Control Engineering, Robotics, Mechatronics, Fluid Mechanics, Nanotechnology, Simulators, Web-based Learning, Remote Laboratories, Engineering Design Methods, Education Research, Students' Satisfaction and Motivation, Global Projects, Assessment, and many more.
CDS based energy efficient topology control algorithm in wireless sensor net... - eSAT Journals
Abstract Wireless Sensor Networks (WSNs) are self-organized networks consisting of a large number of sensor nodes that collect data in various environments [1, 2]. The sensors run on batteries with limited lifetime, so it is a challenge to create an energy efficient network that reduces the energy consumption and interference in the network graph and thereby extends the network lifetime [2]. For saving energy and extending network lifetime, topology control is a well-known technique in WSNs, and the widely used topology control strategy is the construction of a Connected Dominating Set (CDS) [3, 4]. In this paper, we construct a CDS based energy efficient topology control algorithm, GCDSTC, for WSNs. The performance analysis includes the study of the GCDSTC algorithm in terms of complexity and compares it with the EBTC (Energy Balanced Topology Control) algorithm. The simulation results indicate that the GCDSTC algorithm reduces the energy consumption and interference in the network graph, thereby enhancing the network lifetime. Keywords: Wireless Sensor Network (WSN), Connected Dominating Set (CDS), Topology Control (TC)
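The CDS idea can be illustrated with a simple, generic construction (not the paper's GCDSTC algorithm): in a connected graph, the internal nodes of any spanning tree form a connected dominating set. A sketch under that assumption, with a hypothetical toy network:

```python
from collections import deque

def spanning_tree_cds(adj):
    """Connected dominating set via a BFS spanning tree: the tree's
    internal (non-leaf) nodes dominate every vertex and stay connected.
    `adj` maps node -> set of neighbours of a connected graph."""
    root = next(iter(adj))
    parent = {root: None}
    q = deque([root])
    while q:                         # breadth-first search
        u = q.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                q.append(v)
    internal = {p for p in parent.values() if p is not None}
    return internal or {root}        # single-node graph edge case

# toy 6-node sensor network
adj = {
    1: {2, 3}, 2: {1, 4}, 3: {1, 5},
    4: {2, 6}, 5: {3}, 6: {4},
}
cds = spanning_tree_cds(adj)         # every node is in the CDS or adjacent to it
```

Energy-aware algorithms like GCDSTC refine this idea by also accounting for residual energy and interference when choosing the dominating nodes.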
Simultaneous Real-time Graphical Representation of Kinematic Variables Using ... - iosrjce
IOSR Journal of Applied Physics (IOSR-JAP) is a double blind peer reviewed International Journal that provides rapid publication (within a month) of articles in all areas of physics and its applications. The journal welcomes publications of high quality papers on theoretical developments and practical applications in applied physics. Original research papers, state-of-the-art reviews, and high quality technical notes are invited for publications.
IMPROVED NEURAL NETWORK PREDICTION PERFORMANCES OF ELECTRICITY DEMAND: MODIFY... - csandit
Accurate prediction of electricity demand can bring extensive benefits to any country, as the forecast values help the relevant authorities to take decisions regarding electricity generation, transmission and distribution much more appropriately. The literature reveals that, when compared to conventional time series techniques, improved artificial intelligence approaches provide better prediction accuracies. However, the accuracy of predictions using intelligent approaches like neural networks is strongly influenced by the correct selection of inputs and the number of neuro-forecasters used for prediction. This research shows how a cluster analysis performed to group similar day types could contribute towards selecting a better set of neuro-forecasters in neural networks. Daily total electricity demands for five years were considered for the analysis and each date was assigned to one of thirteen day-types, in a Sri Lankan context. As a stochastic trend could be seen over the years, prior to performing the k-means clustering the trend was removed by taking the first difference of the series. Three distinct clusters were found using Silhouette plots, and thus three neuro-forecasters were used for predictions. This paper illustrates the proposed modified neural network procedure using electricity demand data.
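The detrend-then-cluster step described above can be sketched as follows; the demand values, the use of a minimal 1-D k-means, and the deterministic initialisation are illustrative assumptions, not the paper's pipeline (which selects the number of clusters from Silhouette plots):

```python
import numpy as np

def detrend(series):
    """Remove a stochastic trend by first-differencing, as described above."""
    return np.diff(series)

def kmeans_1d(x, k=2, iters=20):
    """Minimal Lloyd's k-means on 1-D data, a stand-in for the
    clustering stage that groups similar day-to-day behaviour."""
    cents = np.linspace(x.min(), x.max(), k)     # deterministic init
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        labels = np.abs(x[:, None] - cents[None, :]).argmin(axis=1)
        cents = np.array([x[labels == j].mean() if np.any(labels == j)
                          else cents[j] for j in range(k)])
    return labels, cents

# hypothetical daily demand with an upward trend (illustrative numbers)
demand = np.array([100.0, 103, 101, 106, 104, 109, 107, 112])
diffs = detrend(demand)            # [3, -2, 5, -2, 5, -2, 5]
labels, cents = kmeans_1d(diffs)   # two day-to-day change regimes
```

In the paper's setting, each resulting cluster of day-types would then be served by its own neuro-forecaster.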
Data reduction techniques for high dimensional biological data - eSAT Journals
Abstract
High dimensional biological datasets have been growing rapidly in recent years. Extracting knowledge from and analyzing high-dimensional biological data is one of the key challenges, in which variety and veracity are the two distinct characteristics. The questions that arise now are how to perform dimensionality reduction for this heterogeneous data, how to develop a high performance platform to efficiently analyze high dimensional biological data, and how to find useful information in this data. To discuss this issue in depth, this paper begins with a brief introduction to the data analytics available for biological data, followed by a discussion of big data analytics and then a survey of various data reduction methods for biological data. We propose a dense clustering algorithm for standard high dimensional biological data.
Keywords: Big Data Analytics, Dimensionality Reduction
We build a network of locations from geolocalized Twitter tweets. We then analyze the small-world and scale-free properties of this spatial network. We also use modularity to reveal the structure of neighborhood communities.
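One of the two classic small-world indicators, the clustering coefficient, can be computed directly from an adjacency structure; a sketch on a hypothetical toy network of locations (the other indicator, a short mean path length, is omitted for brevity):

```python
def clustering_coefficient(adj, node):
    """Fraction of pairs of `node`'s neighbours that are themselves
    linked; averaged over all nodes this is one of the two classic
    small-world indicators."""
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    # count edges among the neighbours (each pair once, via u < v)
    links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
    return 2.0 * links / (k * (k - 1))

# toy location network: a triangle (1-2-3) plus a pendant node 4
adj = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2}}
avg = sum(clustering_coefficient(adj, n) for n in adj) / len(adj)
```

A small-world network shows an average clustering coefficient much higher than a random graph with the same degree distribution.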
PERFORMANCE EVALUATION OF GRID-BASED ROUTING PROTOCOL FOR UNDERWATER WIRELESS... - ijwmn
In this paper, we have conducted a simulation study to evaluate the Multipath Grid-Based Geographic Routing Protocol (MGGR) under different selected mobility models. The protocol has been evaluated under three mobility models (Random Way Point, Reference Point Group and the Meandering Current Mobility Model) using the Aqua-Sim NS2-based simulator. An extensive number of experiments have been conducted to assess the packet delivery ratio, end-to-end delay and power consumption under different operating conditions. We have noticed that the performance of MGGR is only marginally affected by the mobility behavior of nodes, especially as the speed of node movement increases. Nevertheless, the evaluation has also shown that MGGR exhibits only average performance when used in coastal areas. This is evident as the MCMM (used primarily to model the movement of nodes in coastal areas) gave the lowest performance compared to the other models.
Study of the Class and Structural Changes Caused By Incorporating the Target ... - ijceronline
High dimensional data undergoes several changes when processed using various machine learning and pattern recognition techniques. Dimensionality reduction is one such successfully used pre-processing technique for analyzing and representing high dimensional data, and it causes several structural changes in the data. When high-dimensional data is used to extract just the target class from among several spatially scattered classes, the philosophy of dimensionality reduction is to find an optimal subset of features, either from the original space or from a transformed space, using the control set of the target class, and then project the input space onto this optimal feature subspace. This paper is an exploratory analysis carried out to study the class properties and the structural properties that are affected by target-class-guided feature subsetting in particular. K-nearest neighbors and the minimum spanning tree are employed to study the structural properties, and cluster analysis is applied to understand the properties of the target class and the other classes. The experimentation is conducted on the target-class-derived features of selected benchmark data sets, namely the IRIS, AVIRIS Indiana Pine and ROSIS Pavia University data sets. Experimentation is also extended to data represented in the optimal principal components obtained by transforming the subset of features, and the results are compared.
A SURVEY ON OPTIMIZATION APPROACHES TO TEXT DOCUMENT CLUSTERING - ijcsa
Text document clustering is one of the fastest growing research areas because of the availability of a huge amount of information in electronic form. Several techniques have been proposed for clustering documents such that documents within a cluster have high intra-similarity and low inter-similarity to other clusters. Many document clustering algorithms provide a localized search for effectively navigating, summarizing, and organizing information. A globally optimal solution can be obtained by applying high-speed, high-quality optimization algorithms, which perform a globalized search in the entire solution space. In this paper, a brief survey of optimization approaches to text document clustering is presented.
IJRET : International Journal of Research in Engineering and Technology is an international peer reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academician, Field Engineers, Scholars and Students of related fields of Engineering and Technology
Modeling monthly average daily diffuse radiation for Dhaka, Bangladesh - eSAT Journals
Abstract The diffuse part of solar radiation is one of the elements necessary for the design and evaluation of the energy production of a solar system. However, in most cases when radiometric measurements are made, only global radiation is available. To remedy this situation, this paper presents a model of the scattered radiation measured on a horizontal surface for the capital city of Bangladesh. The correlation established for the chosen site was compared to the work of Liu and Jordan, Page, Collares-Pereira and Rabl, Modi and Sukhatme, and Gupta et al. Keywords: Diffuse Radiation, Clearness Index, Regression analysis, Horizontal Radiation.
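Correlations of this family typically regress the monthly diffuse fraction Kd = Hd/H on the clearness index KT = H/H0. A sketch with hypothetical, exactly linear values (the paper's measured Dhaka data and fitted coefficients are not reproduced here):

```python
import numpy as np

# Hypothetical monthly pairs of clearness index KT (= H / H0) and
# diffuse fraction Kd (= Hd / H) -- illustrative values only.
KT = np.array([0.35, 0.40, 0.45, 0.50, 0.55, 0.60])
Kd = np.array([0.62, 0.56, 0.50, 0.44, 0.38, 0.32])

# Liu-and-Jordan-style linear correlation: Kd = a + b * KT
# (np.polyfit returns coefficients highest degree first: [b, a])
b, a = np.polyfit(KT, Kd, 1)

def diffuse_fraction(kt):
    """Estimate the diffuse fraction from the clearness index."""
    return a + b * kt
```

With real measurements, the fitted `a` and `b` would then be compared against the published correlations (Liu and Jordan, Page, etc.), as the abstract describes.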
Computational Database for 3D and 2D materials to accelerate discovery - KAMAL CHOUDHARY
The Density functional theory section of JARVIS (JARVIS-DFT) consists of thousands of VASP-based calculations for 3D-bulk, single layer (2D), nanowire (1D) and molecular (0D) systems. Most of the calculations are carried out with the optB88vDW functional. JARVIS-DFT includes materials data such as energetics, diffraction patterns, radial distribution functions, band structures, densities of states, carrier effective masses, temperature- and carrier-concentration-dependent thermoelectric properties, elastic constants and gamma-point phonons.
MK-Prototypes: A Novel Algorithm for Clustering Mixed Type Data IJMER
Clustering mixed type data is one of the major research topics in the area of data mining. In this paper, a new algorithm for clustering mixed type data is proposed in which the concept of a distribution centroid is used to represent the prototype of the categorical variables in a cluster; this is then combined with the mean to represent the prototype of clusters with mixed type variables. In the method, data is observed from different views and the variables are grouped into those views. Instances that can be viewed differently from different viewpoints can be defined as multiview data. In the usual case, the differences among views are ignored during the clustering process. Here, both view weights and variable weights are computed simultaneously. The view weight is used to determine the closeness or density of a view, and the variable weight is used to identify the significance of each variable. Both weights are used in the distance function to determine the cluster of each object. In the proposed method, the k-prototypes algorithm is enhanced so that it automatically computes both view and variable weights. The proposed MK-Prototypes algorithm is compared with two other clustering algorithms.
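The mixed-type distance at the heart of such methods can be sketched as follows; the view and variable weighting of MK-Prototypes is omitted here, and the `gamma` trade-off parameter and toy values are assumptions for illustration:

```python
from collections import Counter

def distribution_centroid(values):
    """Relative-frequency representation of a categorical variable
    within a cluster (the 'distribution centroid' of the abstract)."""
    counts = Counter(values)
    n = len(values)
    return {cat: cnt / n for cat, cnt in counts.items()}

def mixed_distance(obj_num, obj_cat, cent_num, cent_cat, gamma=1.0):
    """Squared Euclidean distance for the numeric variables plus, per
    categorical variable, 1 - frequency of the object's category in
    the cluster's distribution centroid."""
    d_num = sum((x - m) ** 2 for x, m in zip(obj_num, cent_num))
    d_cat = sum(1.0 - cent.get(v, 0.0) for v, cent in zip(obj_cat, cent_cat))
    return d_num + gamma * d_cat

# cluster prototype built from three members: numeric mean = 2.0,
# categorical values ["red", "red", "blue"]
cent_num = [2.0]
cent_cat = [distribution_centroid(["red", "red", "blue"])]
d = mixed_distance([2.5], ["red"], cent_num, cent_cat)
```

In the full algorithm, each term of this distance would additionally be scaled by the learned view and variable weights before assignment.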
Energy saving in star marked refrigerators and effect of door opening frequen... - eSAT Journals
Abstract
The resurgent economy of India is in need of energy, as it is one of the vital requirements for sustainable development. It is therefore obvious that India will need a huge increase in energy to fulfill its requirements in the coming decades, but there is a large gap between demand and supply. The all-India average energy shortfall is 9% and the peak demand shortfall is 14% [O'Neill 2009]. Loss of electricity in India due to transmission was around 24% during 2011-12 [Energy statistics 2013]. Energy saved at the point of use is equivalent to 2.5 times the energy produced [Deshpande V.M., 2011]. As such, sincere effort is essential in every aspect wherever there is a possibility of saving energy. Refrigerators are huge consumers of electrical energy; in India, 13% of residential electricity is used by refrigeration systems [BEE, Energy and Building, 2014]. There is very little study on the effect of using energy star marked refrigerators on the actual reduction of energy consumption, especially in developing countries. Another important factor that contributes to the increase in electrical energy consumption is the door opening pattern of refrigerators, but there is very little study on the effect of door opening frequency on energy consumption, especially under field conditions. In the present work an experimental study was conducted by keeping the door of a 5 star marked refrigerator open at different frequencies but with the same total opening time. The results showed that frequent opening of the door leads to a substantial increase in energy consumption, and that the increase in frequency is directly related to the increase in energy consumption. The increase may be more than 100% if the door is opened for even 1 minute after every 3 minutes, or if the door is kept open (by mistake or otherwise) for a very long period (90 minutes).
Keywords: energy, efficiency, refrigerator, star marking, compressor, door opening, frequency
A review on implementation of algorithms for detection of diabetic retinopathy - eSAT Journals
Abstract Diabetes is a group of metabolic diseases in which a person has high blood sugar. Diabetic Retinopathy (DR) is caused by abnormalities in the retina due to insufficient insulin in the body. It can lead to sudden vision loss if detection of the retinopathy is delayed, so diabetic patients require regular medical checkups for effective timing of sight-saving treatment. The automated analysis of Diabetic Retinopathy in diabetic patients is a continuing and stimulating research area. A completely automated screening system for the detection of Diabetic Retinopathy can effectively reduce the burden on the specialist and save cost as well as time. Noise and other disturbances that occur during image acquisition may lead to false detection, and this is overcome by various image processing techniques. Further, different features are extracted, which serve as the guideline to identify and grade the severity of the disease. Based on the extracted features, classification of the retinal image as normal or abnormal is carried out. In this paper, we present a detailed study of various screening methods for Diabetic Retinopathy. Many researchers have made a number of attempts to improve accuracy, predictivity, sensitivity and specificity. Keywords: Color Fundus Images, Diabetic Retinopathy, Exudates, Lesions.
An approach for discrimination prevention in data mining - eSAT Journals
Abstract In the age of database technologies, a large amount of data is collected and analyzed using data mining techniques. However, the main issues in data mining are potential privacy invasion and potential discrimination. One of the techniques used in data mining for making decisions is classification; if the dataset is biased, discriminatory decisions may occur. Therefore, in this paper we review recent state-of-the-art approaches to antidiscrimination techniques, focusing on discrimination discovery and prevention in data mining. We also study a theoretical proposal for enhancing data quality. Keywords- Antidiscrimination, data mining, direct and indirect discrimination prevention, rule protection, rule generalization, privacy.
Comparative study of chemically and mechanically singed knit fabric - eSAT Journals
Abstract Single jersey cotton knit fabric of 162 GSM was singed both chemically (enzyme treatment) and mechanically. The mechanically singed fabric exhibited more brightness and whiteness than the chemically singed fabric and, correspondingly, the chemically singed fabric showed more yellowness than the mechanically singed fabric. The color fastness to washing of the treated fabric gave almost the same results in both cases. On the other hand, the chemically singed fabric retained more strength than the mechanically singed one, as evaluated by the bursting strength test. Finally, the CMC value of the chemically and mechanically singed, and then dyed, fabric remained within the acceptable range. Microscopic views of the treated samples were also taken to capture the surface hairiness characteristics. Keywords: Cotton knit fabric, mechanical singeing, chemical singeing, brightness, whiteness, yellowness, strength, CMC pass fail, surface hairiness.
Study of vibration and its effects on health of a two wheeler rider - eSAT Journals
Abstract The majority of the Indian population depends on two wheelers for their transportation due to economic reasons. Because of improper vehicle design and bad road conditions, people in the age group of 30 to 45 years develop pains in their bodies. The percentage of people having musculoskeletal pain problems is found to be 13.33%. Hence an attempt has been made to analyze and obtain the idealized operating conditions for the human body. The analysis has shown that, for the given vehicle and human body, the idealized operating speed for the HERO HONDA SPLENDOR vehicle on a terrain of specified amplitude at the given input is 49.66 km/hr. Index Terms: two wheeler, HERO HONDA SPLENDOR, musculoskeletal, amplitude
Abstract Every company or organization has to use a database for storing information. But what if this database fails? It behooves the company to use another database for backup purposes; this is called failover clustering. To make this clustering manageable and lucid, corporate users spend more money on buying licensed copies of both the core database and the redundant database. This overhead can be avoided by using a database on a non-similar platform, in which one database is licensed and the other may be freeware. If the platforms are similar, the transactions between the databases are simple, but this is not the case with non-similar platforms. Hence, to overcome this problem in both cases, cost effective failover clustering is proposed. For designing such a system, a common interfacing technique between two non-similar database platforms has to be developed. This will make provision for using any two database platforms for failover clustering. Keywords: CEFC (Cost Effective Failover Clustering), DAL (Data Access Layer), DB (Database), HA (High Availability)
Abstract Recent advancements in MEMS technologies and developments in the area of low power microcontrollers have resulted in the implementation of wireless sensor networks for real life problem solving in areas like traffic monitoring, patient monitoring and battlefield surveillance. These wireless sensors are very small in size and are operated at low power for low data rate applications. WSN nodes include features like scalability, self-organization and self-healing. WSN nodes face many challenges, from deployment to the end of their lifespan, which is limited by very low battery capacity. Since these nodes operate in unattended environments, they must survive many security threats. The nodes face a variety of attacks at different layers of their architecture, ranging from physical stealing and tampering to reprogramming. Applying traditional security mechanisms to wireless sensor nodes is also not possible, as those traditional algorithms and protocols consume too much processing power and energy due to their complexity. This paper reports an initial introduction to WSNs, the WSN architecture, and the associated challenges and security threats. Keywords: Wireless, Sensor, Threat, Security, Power, Node
An investigation and rectification on failure of bearings of casting shakeout... - eSAT Journals
Abstract This project deals with the investigation and rectification of the failure of the bearings of a casting shakeout, required in foundry industries to separate the solidified casting and sand from the mould box. The failure of the bearings is mainly due to brinelling, which tends to create cavities on the bearing raceway; as a result, the rollers and the inner race surface of the spherical bearing get damaged. The rectification modifies the existing setup to use four bearings to distribute the load acting on the shakeout, aiming to reduce this frequent breakdown, increase the life of the bearings and increase the productivity of the plant. The software Pro-E Wildfire 4 is used for modeling. Keywords: Brinelling, Casting shakeout, Spherical Roller bearing, and Bearing life
A multi classifier prediction model for phishing detection - eSAT Journals
Abstract Phishing is the technique of stealing personal and sensitive information from an email client by sending emails that impersonate those from trustworthy organizations. Phishing mails are a specific type of spam mail, but their effects are much more damaging than those of other sorts. Phishing attackers mostly target the clients of financial organizations, so detection needs high priority. Many research activities have been undertaken to detect phished emails; in the proposed methodology, a multi-classifier prediction model is introduced for detecting phished emails. Our contention is that a single classifier's prediction might not be adequate to give the clearest picture of a phishing email; the multi-classifier prediction achieves an accuracy of 99.8% with an FP rate of 0.8%. Keywords: Phishing email detection, Machine learning techniques, Multi-classifier prediction model, Majority voting
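The majority-voting fusion named in the keywords can be sketched as follows; the base classifiers and their labels below are hypothetical stand-ins, not the paper's trained models:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier labels for one email by majority vote,
    the fusion rule named in the abstract. `predictions` holds one
    label per base classifier."""
    return Counter(predictions).most_common(1)[0][0]

# three hypothetical base classifiers scoring two emails
votes_per_email = [
    ["phish", "phish", "ham"],   # 2-1 -> phish
    ["ham", "phish", "ham"],     # 2-1 -> ham
]
final_labels = [majority_vote(v) for v in votes_per_email]
```

Using an odd number of base classifiers avoids ties in the two-class case.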
Water quality analysis of Bhishma lake at Gadag city - eSAT Journals
Abstract The water bodies are facing a severe threat of pollution all over the world. Eutrophication in lakes is widespread all over the world and its severity is increasing, especially in developing countries like India. The main objective is to study the effects of pollution on the water quality of Bhishma lake due to rapid urbanization and to identify the sources of pollution in the lake. Water samples were collected from January to April and bi-monthly tests were carried out. The physico-chemical characteristics of the samples, like pH, turbidity, alkalinity, total hardness, total dissolved solids, nitrate, phosphate, chloride, potassium, sodium, dissolved oxygen and BOD, were analyzed in order to determine the levels of contaminants present in the lake water and its possible mitigation measures. Water quality parameters such as pH, Total Dissolved Solids (TDS), turbidity, alkalinity, total hardness, chloride, Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD) and Chemical Oxygen Demand (COD) were determined for water samples collected from five stations in the lake. The analysis of the lake water parameters shows that the total hardness, turbidity, total dissolved solids and alkalinity values exceed the desirable limits; due to the presence of organic and inorganic pollutants, Dissolved Oxygen (DO) has reduced while Biochemical Oxygen Demand (BOD) and Chemical Oxygen Demand (COD) have increased, which shows the poor quality of the water and that it is not suitable for human consumption. These values have been graphically plotted. Keywords: Lake Water Quality analysis, BOD, COD.
Effect of water bath temperature on evaporator and condenser temperature of c... - eSAT Journals
Abstract
The objective of this paper is to study the effect of water bath temperature on the evaporator and condenser temperatures of a closed loop pulsating heat pipe with acetone as the working fluid. A heat pipe is a heat exchanger with a high heat dissipation capacity, used for cooling electronic equipment. In this paper, copper has been selected as the material for the heat pipe due to its compatibility with acetone as the working fluid. The filling ratio of the working fluid significantly influences the performance of a closed loop pulsating heat pipe. Here a 60% filling ratio has been selected, and for this filling ratio the effect of increasing water bath temperature on the acetone closed loop pulsating heat pipe is investigated.
Keywords: closed loop pulsating heat pipe, condenser, evaporator, working fluid, filling ratio.
Analysis of Green's function and surface current density for rectangular micr... - eSAT Journals
Abstract In this paper, the Green's function and surface current density for a planar structure have been calculated. The approach makes use of the popular and rigorously used spectral domain full wave analysis method in conjunction with the method of moments as the numerical analysis tool. In the present approach, boundary conditions are applied at the patch metallization, which leads to an integral equation involving the Green's function in the spectral domain; this includes the effects of dielectric and conductor loss, surface wave modes and space wave radiation. By applying Galerkin's moment method, the integral equations are transformed into a linear set of equations. Entire domain basis functions are used to improve the efficiency of the solution. Keywords: Spectral domain full wave analysis, Green function, Galerkin's moment method, Entire domain basis function, Surface current density.
FreeRTOS based environmental data acquisition using ARM Cortex M4F core - eSAT Journals
Abstract In today's decade of Big Data and complex computing, data acquisition systems hold a prime position. In this project, we present a data acquisition system that runs on an ARM Cortex M4F core microcontroller, with the whole system managed by a real time operating system named FreeRTOS. The Tiva C Series LaunchPad is used as the board. The system is currently designed to monitor only temperature; a temperature sensor is used for monitoring, and the data received is stored on an SD card. The main objective of this project is to design a data acquisition system that is portable, cheap and deterministic. Currently available systems in the market are expensive and bulky, whereas this FreeRTOS based data acquisition system is cheaper and portable as well. Keywords: FreeRTOS, ARM Cortex M4F core, Temperature sensor, Tiva C series Launchpad, SD card.
An experimental study on durability and water absorption properties of pervio... - eSAT Journals
Abstract Pervious concrete is a special high porosity concrete used for flatwork applications that allows water from precipitation and other sources to pass through, thereby reducing the runoff from a site and recharging groundwater levels. Durability and water absorption are important properties of pervious concrete. This paper presents the experimental methodology and results related to durability and water absorption. Cylinders of 100 mm diameter and 200 mm height were prepared to investigate both properties. The investigation was carried out at the end of 28 days for water absorption and 56 days for durability, for which cylinders were immersed in Sodium Chloride (NaCl) solution after 28 days of casting. Different concrete mix proportions such as 1:6, 1:8 and 1:10, with different gravel sizes of 18.75 mm and 9.375 mm, were used to check both properties of the pervious concrete. The test results indicate that pervious concrete made with the 1:6 mix proportion has more durability and less water absorption, while pervious concrete made with the 1:10 mix proportion has more water absorption and less durability; thus durability and water absorption are inversely related. Keywords: Pervious concrete, porosity, durability, water absorption, sodium chloride solution
Impressive smart card based electronic voting system - eSAT Journals
Abstract
The impressive smart card based electronic voting system is introduced to ensure proper voting procedures and vote counts. This idea prevents illegal acts against the voting system and provides voter authentication in an effective manner. The proposal plans to provide a fortification for each and every vote. It entirely changes the status of the election process and ensures the integrity of the electoral system. The primary idea of this paper is to make voters trust the election by taking fingerprints and providing a smart card to each user to guarantee their uniqueness in the voting system, while reducing the work of the election committee. At the same time, the result of the election process will be automatically declared to the public. With the help of this method, a person can vote from any election booth easily.
Keywords: Electronic voting system, authentication, biometric fingerprinting method, smart card
Flue gas low temperature heat recovery system for air conditioning (eSAT Journals)
Abstract The huge amount of energy wasted through flue gas in thermal power stations has caused great concern in recent years. Discharging hot flue gas into the environment is not only a waste of energy but also increases the rate of global warming. Efforts are being made worldwide to harness this energy for useful purposes. In this work, the waste heat of flue gas in a 350 MW thermal power plant is utilized in a vapour absorption air conditioning plant. A gas-to-liquid multi-pass cross-flow heat exchanger is placed in the existing space between the boiler and the chimney. The dimensions of the finally selected heat exchanger are 0.106 m × 2.4 m × 3.4 m. The number of pipes required for the heat exchanger is found to be 12 using an iteration method, and the temperature of water at the outlet of the last pipe is 101.1 °C. The energy extracted from the flue gas is used to heat water for the generator of a vapour absorption refrigeration system, which has produced a refrigerating capacity of approximately 70 TR. Due to the corrosive nature of flue gas, heat recovery is confined up to the acid dew point temperature of the flue gas. Suitable software is used to find the detailed design parameters of gas-to-liquid multi-pass cross-flow heat exchangers. Out of many feasible designs, the most economic design is selected as the final design. Keywords: Air Conditioning; Flue Gas; Heat Exchanger; Heat Recovery; Vapour Absorption Machine
Speech compression analysis using MATLAB (eSAT Journals)
Abstract The growth of cellular technology and wireless networks all over the world has increased the demand for digital information manifold. This massive demand poses difficulties in handling the huge amounts of data that need to be stored and transferred. To overcome this problem we can compress the information by removing the redundancies present in it. Redundancies are a major source of errors and noisy signals. Coding in MATLAB helps in analyzing the compression of speech signals with varying bit rates and in removing errors and noise from the speech signals. A speech signal's bit rate can also be reduced to remove errors and noise, which is suitable for remote broadcast lines, studio links, satellite transmission of high-quality audio, and voice over the Internet. This paper focuses on the speech compression process and its analysis through MATLAB, by which the processed speech signal can be heard with clarity and in noiseless mode at the receiver end. Keywords: Speech compression, bit rate, filter, MPEG, DCT.
Behavior of R.C.C. beam with rectangular opening strengthened by CFRP and GFR... (eSAT Journals)
Abstract
In the construction of multistorey buildings, openings in beams are provided for utility ducts and pipes. Providing an opening in a beam develops cracks around the opening due to stress concentration. This paper presents the behavior of R.C.C. beams with rectangular openings strengthened by CFRP and GFRP sheets using different techniques. In this experimental study a total of ten beams were cast: one beam without an opening (i.e. a solid beam) and one beam with a rectangular post-opening, these two being considered as control beams for comparison. The remaining eight beams were externally strengthened with carbon fiber reinforced polymer (CFRP) and glass fiber reinforced polymer (GFRP) sheets using different strengthening techniques, i.e. around the opening, inside the opening, inside and around the opening, and a double layer around the opening. These beams were tested under two-point loading in a loading frame. The effect of CFRP and GFRP sheets with different strengthening schemes on such beams was studied in terms of initial crack load, ultimate failure load, cracking pattern and deflection. From the experimental results it is concluded that the ultimate load carrying capacity of R.C.C. beams with openings strengthened with GFRP sheets of different schemes increased in the range of 3.74% to 37.41%, and of beams strengthened with CFRP sheets in the range of 9.35% to 50.50%. Among all these techniques, strengthening with CFRP around and inside the opening was found to be more effective in improving the ultimate load carrying capacity of the beam. This investigation helps practicing engineers to provide an opening in a beam without reducing its load carrying capacity.
Keywords: Reinforced concrete beams, Beams with rectangular opening, CFRP, GFRP, Strengthening schemes, Ultimate
load carrying capacity.
Trajectory Segmentation and Sampling of Moving Objects Based On Representativ... (ijsrd.com)
Moving Object Databases (MODs), although ubiquitous, still call for methods that will be able to understand, search, analyze, and browse their spatiotemporal content. In this paper, we propose a method for trajectory segmentation and sampling based on the representativeness of the (sub)trajectories in the MOD. In order to find the most representative subtrajectories, the following methodology is proposed. First, a novel global voting algorithm is performed, based on local density and trajectory similarity information. This method is applied for each segment of the trajectory, forming a local trajectory descriptor that represents line segment representativeness. The sequence of this descriptor over a trajectory gives the voting signal of the trajectory, where high values correspond to the most representative parts. Then, a novel segmentation algorithm is applied to this signal that automatically estimates the number of partitions and the partition borders, identifying partitions that are homogeneous in their representativeness. Finally, a sampling method over the resulting segments yields the most representative subtrajectories in the MOD. Our experimental results on synthetic and real MODs verify the effectiveness of the proposed scheme, also in comparison with other sampling techniques.
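A crude sketch of the voting idea (not the paper's exact algorithm): each segment of a trajectory collects one vote from every MOD segment within a distance threshold, and the resulting per-segment counts form the voting signal whose high values mark representative parts. Representing segments by midpoints, the Euclidean distance, and the `eps` threshold are all illustrative assumptions:

```python
def voting_signal(traj_segments, mod_segments, eps):
    """Per-segment vote counts; high counts mark representative parts."""
    def dist(a, b):
        # Euclidean distance between segment midpoints (an assumption).
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return [sum(1 for s in mod_segments if dist(seg, s) <= eps)
            for seg in traj_segments]

# Toy MOD: three segments clustered near (0, 0), one outlier near (10, 10).
mod = [(0, 0), (0.1, 0.1), (0.2, 0.0), (10, 10)]
signal = voting_signal([(0, 0), (10, 10)], mod, eps=1.0)
```

The first query segment sits in the dense cluster and receives many votes; the outlier receives only its own.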
DRSP dimension reduction for similarity matching and pruning of time series ... (IJDKP)
Similarity matching and joining of time series data streams has gained a lot of relevance in today's world of large streaming data. The process finds wide application in areas such as location tracking, sensor networks, and object positioning and monitoring. However, as the size of the data stream increases, the cost involved in retaining all the data in order to aid the similarity matching process also increases. We develop a novel framework to address the following objectives. First, dimension reduction is performed in the preprocessing stage, where the large stream data is segmented and reduced into a compact representation that retains all the crucial information, by a technique called Multi-level Segment Means (MSM). This reduces the space complexity associated with the storage of large time-series data streams. Second, the framework incorporates an effective similarity matching technique to analyze whether new data objects are similar to the existing data stream. Finally, a pruning technique filters out the pseudo data object pairs and joins only the relevant pairs. The computational cost for MSM is O(l*ni) and the cost for pruning is O(DRF*wsize*d), where DRF is the Dimension Reduction Factor. We have performed exhaustive experimental trials to show that the proposed framework is both efficient and competent in comparison with earlier works.
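A rough illustration of the segment-means idea (a sketch only; the actual MSM levels and segment lengths are defined in the paper, and the function names here are assumptions): each fixed-length segment of the stream is replaced by its mean, and repeating this at several segment lengths yields a multi-level compact representation.

```python
def segment_means(series, seg_len):
    # Replace each fixed-length segment by its mean value.
    return [sum(series[i:i + seg_len]) / len(series[i:i + seg_len])
            for i in range(0, len(series), seg_len)]

def multi_level_segment_means(series, seg_lens=(8, 4, 2)):
    # Coarse-to-fine summaries: one mean vector per segment length.
    return {l: segment_means(series, l) for l in seg_lens}
```

Storing the mean vectors instead of raw points is what reduces the space complexity; similarity matching and pruning then operate on these compact summaries.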
Automatic registration, integration and enhancement of India's Chandrayaan-1 ... (eSAT Journals)
Abstract Chandrayaan-1 was India's first deep-space exploration mission to the Moon. Its Terrain Mapping Camera (TMC) sent images of about 50% of the total lunar surface in its limited lifetime and covered the polar areas almost completely at high resolutions of 5 m/pixel and 10 m/pixel. This image dataset has been processed and put in the public domain as individual image strips categorized according to orbit. The authors have already developed a lunar GIS including a set of utilities such as 3-D visualization and exploration, and crater detection and search, using datasets from NASA's Lunar Reconnaissance Orbiter Wide Angle Camera (WAC), which are of lower resolution than Chandrayaan-1. The objective of this paper is to normalize and register the Chandrayaan-1 images to the existing processed data so that all these utilities can be transparently applied to the high-resolution Chandrayaan-1 datasets. The registration process consists of identifying features in source and target images and estimating appropriate corrections for offset, rotation and scaling parameters. Furthermore, due to the low-altitude orbit of the satellite, the acquired images have displacements of pixels from the actual nadir position, which need non-linear correction. This paper describes a step-by-step technique to integrate these high- and low-resolution images in a single framework. Keywords: Chandrayaan-1, Lunar mapping, Moon, Feature based image registration, Integration, ISRO, LRO, NASA, TMC, WAC.
Hyperparameters analysis of long short-term memory architecture for crop cla... (IJECEIAES)
Deep learning (DL) has seen a massive rise in popularity for remote sensing (RS) based applications over the past few years. However, the performance of DL algorithms depends on the optimization of various hyperparameters, since hyperparameters have a huge impact on the performance of deep neural networks. The impact of hyperparameters on the accuracy and reliability of DL models is a significant area for investigation. In this study, the grid search algorithm is used for hyperparameter optimization of a long short-term memory (LSTM) network for RS-based classification. The hyperparameters considered are the optimizer, activation function, batch size, and number of LSTM layers. Over 1,000 hyperparameter sets are evaluated and the results of all the sets are analyzed to see the effects of various combinations of hyperparameters as well as the effect of each individual parameter on the performance of the LSTM model. The performance of the LSTM model is evaluated using the metrics of minimum loss and average loss, and it was found that classification can be highly affected by the choice of optimizer, whereas other parameters such as the number of LSTM layers have less influence.
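The grid search itself is simple to sketch: enumerate the Cartesian product of the four hyperparameter value lists and keep the configuration with the lowest validation loss. The value lists and the stand-in `evaluate()` below are illustrative assumptions, not the paper's actual setup (real training of an LSTM would replace `evaluate`):

```python
from itertools import product

# Hypothetical search space mirroring the four hyperparameters in the study.
space = {
    "optimizer": ["adam", "sgd", "rmsprop"],
    "activation": ["tanh", "relu"],
    "batch_size": [32, 64, 128],
    "lstm_layers": [1, 2, 3],
}

def evaluate(cfg):
    # Stand-in for training an LSTM and returning its validation loss;
    # it mimics the paper's finding that the optimizer dominates.
    penalty = {"adam": 0.0, "rmsprop": 0.1, "sgd": 0.3}[cfg["optimizer"]]
    return penalty + 0.05 * cfg["lstm_layers"]

def grid_search(space, evaluate):
    # Exhaustively evaluate every combination and keep the best one.
    keys = list(space)
    best_cfg, best_loss = None, float("inf")
    for values in product(*(space[k] for k in keys)):
        cfg = dict(zip(keys, values))
        loss = evaluate(cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best_cfg, best_loss = grid_search(space, evaluate)
```

With 3 × 2 × 3 × 3 = 54 combinations here, the study's "over 1,000 sets" simply corresponds to longer value lists.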
Scalable and efficient cluster based framework for multidimensional indexing (eSAT Journals)
Abstract Indexing high-dimensional data has utility in many real-world applications; in particular, the information retrieval process is dramatically improved. Existing techniques address the "curse of dimensionality" of high-dimensional data sets by using the Vector Approximation File (VA-File), which results in sub-optimal performance. Compared with the VA-File, clustering produces a more compact representation of the data set because it exploits inter-dimensional correlations. However, pruning of unwanted clusters is important, and existing pruning techniques based on bounding rectangles and bounding hyperspheres have problems in NN search. To overcome this problem, Ramaswamy and Rose proposed an approach known as adaptive cluster distance bounding for high-dimensional indexing, which also includes efficient spatial filtering. In this paper we implement this high-dimensional indexing approach and build a prototype application as a proof of concept. Experimental results are encouraging and the prototype can be used in real-time applications. Index Terms: Clustering, high dimensional indexing, similarity measures, and multimedia databases
SPATIO-TEMPORAL QUERIES FOR MOVING OBJECTS DATA WAREHOUSING (ijdms)
In the last decade, Moving Object Databases (MODs) have attracted a lot of attention from researchers.
Several research works were conducted to extend traditional database techniques to accommodate the new
requirements imposed by the continuous change in location information of moving objects. Managing,
querying, storing, and mining moving objects were the key research directions. This extensive interest in
moving objects is a natural consequence of the recent ubiquitous location-aware devices, such as PDAs,
mobile phones, etc., as well as the variety of information that can be extracted from such new databases. In
this paper we propose a spatio-temporal data warehouse (STDW) for efficiently querying location
information of moving objects. The proposed schema introduces new measures such as direction majority and
other direction-based measures that enhance decision making based on location information.
Object Classification of Satellite Images Using Cluster Repulsion Based Kerne... (IOSR Journals)
Abstract: We investigated the classification of satellite images and multispectral remote sensing data, focusing on uncertainty analysis in the produced land-cover maps. We proposed an efficient technique for classifying multispectral satellite images using a Support Vector Machine (SVM) into road, building and green areas. Classification is carried out in three modules: (a) preprocessing using Gaussian filtering and conversion from RGB to Lab color space; (b) object segmentation using the proposed cluster repulsion based kernel Fuzzy C-Means (FCM); and (c) classification using a one-to-many SVM classifier. The goal of this research is to provide efficient classification of satellite images using object-based image analysis. The proposed work is evaluated using satellite images and its accuracy is compared to FCM-based classification. The results showed that the proposed technique achieved better results, reaching accuracies of 79%, 84%, 81% and 97.9% for road, tree, building and vehicle classification respectively.
Keywords: Satellite image, FCM clustering, Classification, SVM classifier.
USING ONTOLOGY BASED SEMANTIC ASSOCIATION RULE MINING IN LOCATION BASED SERVICES (IJDKP)
Recently, GPS and mobile devices have allowed the collection of a huge amount of mobility data. Researchers from different communities have developed models and techniques for mobility analysis, but they have mainly focused on the geometric properties of trajectories and do not consider the semantic facet of moving objects. These techniques are good at extracting patterns, but the patterns are hard to interpret in a specific application domain. This paper proposes a methodology to understand mobility data and semantically interpret trajectory patterns. The process considers four different behavior types: semantic, semantic and space, semantic and time, and semantic and space-time. Finally, a system prototype was developed to evaluate the behavior models in different aspects using one of the location based services. The results showed that applying the semantic association rules could significantly reduce the number of available services and customize the services based on the rules.
Self scale estimation of the tracking window merged with adaptive particle fi... (IJECEIAES)
Tracking a mobile object is one of the important topics in pattern recognition, but it still faces some obstacles. A reliable tracking system must adjust its tracking window in real time according to appearance changes of the tracked object. Furthermore, it has to deal with many challenges when one or multiple objects need to be tracked, for instance when the target is partially or fully occluded, in background clutter, or even when part of the target region is blurred. In this paper, we present a novel approach for single object tracking that combines the particle filter algorithm with a kernel distribution and updates its tracking window according to object scale changes, named the multi-scale adaptive particle filter tracker. We demonstrate that the use of a particle filter combined with a kernel distribution inside the resampling process provides more accurate object localization within a search area. Furthermore, its average error for target localization was significantly lower, with 21.37 pixels as the mean value. We have conducted several experiments on real video sequences and compared the acquired results to other existing state-of-the-art trackers to demonstrate the effectiveness of the multi-scale adaptive particle filter tracker.
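The resampling step is the part of a particle filter that the abstract builds on. A minimal sketch of standard systematic resampling (this is the textbook procedure, not the paper's kernel-based variant, and all names are illustrative): particles are redrawn in proportion to their weights using one random offset and evenly spaced pointers over the cumulative weight distribution.

```python
import random

def systematic_resample(particles, weights):
    # Systematic resampling: one random offset, then n evenly spaced
    # pointers swept over the cumulative (normalized) weight distribution.
    n = len(particles)
    total = sum(weights)
    cum, s = [], 0.0
    for w in weights:
        s += w / total
        cum.append(s)
    u0 = random.random() / n
    resampled, j = [], 0
    for i in range(n):
        p = u0 + i / n
        while cum[j] < p:
            j += 1
        resampled.append(particles[j])
    return resampled

random.seed(0)  # deterministic for the demo
# A particle holding all the weight survives resampling n times.
survivors = systematic_resample(["a", "b", "c"], [0.0, 1.0, 0.0])
```

Systematic resampling is popular because it needs a single random draw per step and has low variance compared with naive multinomial resampling.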
Performance analysis of change detection techniques for land use land cover (IJECEIAES)
Remotely sensed satellite images have become essential for observing the spatial and temporal changes occurring on the earth's surface due to either natural phenomena or man-induced change. Real-time monitoring of this data provides useful information on changes in the extent of urbanization, environmental change, water bodies, and forests. Through the use of remote sensing technology and geographic information system tools, it has become easier to monitor changes from past to present. In the present scenario, choosing a suitable change detection method plays a pivotal role in any remote sensing project. Previously, digital change detection was a tedious task; with the advent of machine learning techniques, it has become comparatively easier to detect changes in digital images. The study gives a brief account of the main change detection techniques related to land use/land cover information. An effort is made to compare widely used change detection methods and to discuss the need for development of enhanced change detection methods.
Abstract
The exponential growth of knowledge on the World Wide Web has underscored the need to develop economical and effective ways of organizing relevant content. In the field of web computing, document clustering plays a vital role and poses an interesting and challenging problem. Document clustering is mainly used for grouping similar documents in a search engine. The web also has a rich and dynamic collection of hyperlink information. The retrieval of relevant documents from the Internet is a complicated task: based on the user's query, documents are retrieved from various databases to give relevant and additional information for the given query. The documents are already clustered based on keyword extraction and stored in the database. The probabilistic relational approach to web document clustering finds the relation between two linked pages and defines a relational clustering algorithm based on a probabilistic graph representation. In this document clustering, both the content information and the hyperlink structure of web pages are considered, and each document is viewed as a semantic unit. It also provides additional information to the user.
Keywords: Document Clustering, Agglomerative Clustering, Entropy, F-Measure
Similar to "An innovative idea to discover the trend on multi-dimensional spatio-temporal datasets"
Mechanical properties of hybrid fiber reinforced concrete for pavements (eSAT Journals)
Abstract
The effect of adding mono fibers and hybrid fibers on the mechanical properties of a concrete mixture is studied in the present investigation. Steel fibers at 1% and polypropylene fibers at 0.036% were added individually to the concrete mixture as mono fibers, and then together to form a hybrid fiber reinforced concrete. Mechanical properties such as compressive, split tensile and flexural strength were determined. The results show that hybrid fibers improve the compressive strength only marginally compared to mono fibers, whereas hybridization improves split tensile strength and flexural strength noticeably.
Keywords: Hybridization, mono fibers, steel fiber, polypropylene fiber, improvement in mechanical properties.
Material management in construction – a case study (eSAT Journals)
Abstract
The objective of the present study is to understand the problems occurring in a company because of improper application of material management. In construction project operations there is often a project cost variance in terms of material, equipment, manpower, subcontractors, overhead cost, and general conditions. Material is the main component of construction projects; therefore, if materials are not properly managed, a project cost variance is created. Project cost can be controlled by taking corrective actions against the cost variance. A methodology to diagnose and evaluate the procurement process involved in material management and to launch continuous improvement was therefore developed and applied. A thorough study was carried out, along with case studies, surveys and interviews with professionals involved in this area. As a result, a methodology for diagnosis and improvement was proposed and tested on selected projects. The results obtained show that the main problems in procurement are related to schedule delays and a lack of the specified quality for the project. To prevent this situation it is often necessary to dedicate important resources, such as money, personnel and time, to monitoring and controlling the process. A great potential for improvement was detected if state-of-the-art technologies such as electronic mail, electronic data interchange (EDI), and analysis were applied to the procurement process. These helped to eliminate the root causes of many types of problems that were detected.
Managing drought: short-term strategies in semi-arid regions, a case study (eSAT Journals)
Abstract
Drought management needs multidisciplinary action. Interdisciplinary efforts among experts in various fields in drought-prone areas are helpful to achieve tangible and permanent solutions to this recurring problem. The Gulbarga district has a total area of around 16,240 sq. km and accounts for 8.45 per cent of the area of Karnataka state. The district is situated at latitude 17° 19' 60" North and longitude 76° 49' 60" East, entirely on the Deccan plateau at a height of 300 to 750 m above MSL. With its sub-tropical, semi-arid climate, it is one of the drought-prone districts of Karnataka state, so drought management is very important for a district like Gulbarga. In this paper various short-term strategies are discussed to mitigate the drought condition in the district.
Keywords: Drought, South-West monsoon, Semi-Arid, Rainfall, Strategies etc.
Life cycle cost analysis of overlay for an urban road in Bangalore (eSAT Journals)
Abstract
Pavements are subjected to severe stresses and weathering effects from the day they are constructed and opened to traffic, mainly due to fatigue behavior and environmental effects. Therefore, pavement rehabilitation is one of the most important components of the entire road system. This paper highlights the design of concrete pavement with added mono fibers such as polypropylene and steel, and hybrid fibres, for a widened portion of an existing concrete pavement, and various overlay alternatives for an existing bituminous pavement on an urban road in Bangalore. Along with this, life cycle cost analyses of these sections are carried out by the Net Present Value (NPV) method to identify the most feasible option. The results show that although the initial cost of constructing a concrete overlay is high, over a period of time it proves to be better than the bituminous overlay considering the whole life cycle cost. The economic analysis also indicates that, of the three fibre options, hybrid reinforced concrete would be economical without compromising the performance of the pavement.
Keywords: Fatigue, Life cycle cost analysis, Net Present Value method, Overlay, Rehabilitation
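The NPV comparison works by discounting each alternative's yearly costs back to the present and choosing the alternative with the lower total. A minimal sketch (the discount rate, costs, and 20-year period below are illustrative assumptions, not the paper's figures):

```python
def npv(rate, cashflows):
    # Discount each year's cost to present value; index 0 is year zero.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical overlays over a 20-year life at an 8% discount rate:
# concrete costs more upfront but needs less yearly maintenance.
rate = 0.08
concrete = [100.0] + [2.0] * 20
bituminous = [60.0] + [8.0] * 20
```

Comparing `npv(rate, concrete)` with `npv(rate, bituminous)` shows how a higher initial cost can still win once the whole life cycle is discounted.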
Laboratory studies of dense bituminous mixes II with reclaimed asphalt materials (eSAT Journals)
Abstract
The growing demand on our nation's roadways over the past couple of decades, decreasing budgetary funds, and the need to provide a safe, efficient, and cost-effective roadway system have led to a dramatic increase in the need to rehabilitate existing pavements and to build sustainable road infrastructure in India. These needs are today's burning issues and are the purpose of this study.
In the present study, samples of existing bituminous layer materials were collected from the NH-48 (Devahalli to Hassan) site. The mixtures were designed by the Marshall method as per the Asphalt Institute (MS-II) at 20% and 30% Reclaimed Asphalt Pavement (RAP). RAP material was blended with virgin aggregate such that all specimens were tested for the Dense Bituminous Macadam-II (DBM-II) gradation as per the Ministry of Road Transport and Highways (MoRT&H), and a cost analysis was carried out to assess the economics. Laboratory results and analysis showed that the use of recycled materials introduced significant variability in Marshall stability, and the variability increased with the increase in RAP content. Savings can be realized from the utilization of recycled materials as per the methodology: the reduction in total cost is 19% and 30% compared with the virgin mixes.
Keywords: Reclaimed Asphalt Pavement, Marshall Stability, MS-II, Dense Bituminous Macadam-II
Laboratory investigation of expansive soil stabilized with natural inorganic ... (eSAT Journals)
Abstract
Soil stabilization is one of the oldest techniques for improving soil properties. The literature review conducted revealed that natural inorganic stabilizers are among the best options for soil stabilization. In this regard an attempt has been made to evaluate the influence of the RBI-81 stabilizer on the properties of black cotton soil through laboratory investigations. Black cotton soil with varying percentages of RBI-81, viz. 0, 0.5, 1, 1.5, 2, and 2.5 per cent, was studied for moisture-density relationships and soil strength behaviour. The effect of curing period was also evaluated, as the literature clearly emphasizes the strength gain of soils stabilized with RBI-81 over a period of time. The results obtained show that the unconfined compressive strength of specimens treated with RBI-81 increased by approximately 250% for a curing period of 28 days as compared to virgin soil, and the CBR value improved by approximately 400%. The studies indicated an increasing trend in soil strength with increasing percentage of RBI-81, suggesting its potential applications in soil stabilization.
Influence of reinforcement on the behavior of hollow concrete block masonry p... (eSAT Journals)
Abstract
Reinforced masonry was developed to exploit the strength potential of masonry and to overcome its lack of tensile strength. Experimental and analytical studies have been carried out to investigate the effect of reinforcement on the behavior of hollow concrete block masonry prisms under compression and to predict the ultimate compressive failure strength. In the numerical program, a three-dimensional non-linear finite element (FE) model based on the micro-modeling approach is developed for both unreinforced and reinforced masonry prisms using ANSYS (14.5). The proposed FE model uses multi-linear stress-strain relationships to model the non-linear behavior of the hollow concrete block, mortar, and grout, and Willam-Warnke's five-parameter failure theory has been adopted to model the failure of the masonry materials. The comparison of the numerical and experimental results indicates that the FE models can successfully capture the highly nonlinear behavior of the physical specimens and accurately predict their strength and failure mechanisms.
Keywords: Structural masonry, Hollow concrete block prism, grout, Compression failure, Finite element method,
Numerical modeling.
Influence of compaction energy on soil stabilized with chemical stabilizer (eSAT Journals)
Abstract
Increases in traffic along with heavier wheel loads cause rapid deterioration of pavements, so there is a need to improve the density and strength of the soil subgrade and other pavement layers. In this study an attempt is made to improve the properties of locally available loamy soil using two approaches: i) increasing the compaction of the soil and ii) treating the soil with a chemical stabilizer. Laboratory studies are carried out on both untreated and treated soil samples compacted with different compaction efforts. The studies show that an increase in compaction effort results in an increase in soil density; however, in soil treated with the chemical stabilizer, the rate of increase in density is not significant. The soil treated with the chemical stabilizer exhibits improvement in both strength and performance properties.
Keywords: compaction, density, subgrade stabilization, resilient modulus
Geographical information system (GIS) for water resources management (eSAT Journals)
Abstract
Water resources projects inherently have overlapping and at times conflicting objectives. These projects are of varied sizes, ranging from major projects with command areas of millions of hectares to very small projects implemented at the local level, and there is seldom proper coordination among them, which is essential for ensuring collective sustainability.
Integrated watershed development and management is the accepted answer, but it in turn requires a comprehensive framework that can enable a planning process involving all the stakeholders at different levels and scales. Such a unified hydrological framework is essential to evaluate the cause and effect of all the proposed actions within the drainage basins.
The present paper describes a hydrological framework developed in the form of a Hydrologic Information System (HIS), which is intended to meet the specific information needs of the various line departments of a typical State connected with water-related aspects. The HIS consists of a hydrologic information database coupled with tools for collating primary and secondary data and tools for analyzing and visualizing the data and information. The HIS also incorporates a hydrological model base for indirect assessment of various entities of the water balance in space and time. The framework would be maintained and updated to reflect fully the most accurate ground-truth data and the infrastructure requirements for planning and management.
Keywords: Hydrological Information System (HIS); WebGIS; Data Model; Web Mapping Services
Forest type mapping of Bidar forest division, Karnataka using geoinformatics ... (eSAT Journals)
Abstract
The study demonstrates the potential of satellite remote sensing techniques for generating baseline information on forest types, including tree plantation details, in the Bidar forest division, Karnataka, covering an area of 5814.60 sq. km. Analysis of the satellite data in the study area reveals that about 84% of the total area is covered by crop land, 1.778% by dry deciduous forest, and 1.38% by mixed plantation, which is very threatening to the environmental stability of the forest; future plantation sites have been mapped. With the use of the latest geoinformatics technology, the proper and exact condition of the trees can be observed and necessary precautions can be taken for future plantation work in an appropriate manner.
Keywords: RS, GIS, GPS, Forest Type, Tree Plantation
Factors influencing compressive strength of geopolymer concreteeSAT Journals
Abstract
The aim is to study the effects of several factors on the compressive strength of fly ash based geopolymer concrete, and also to
compare its cost with that of normal concrete. The test variables were the molarity of sodium hydroxide (NaOH) at 8M, 14M and 16M;
the ratio of NaOH to sodium silicate (Na2SiO3) at 1, 1.5, 2 and 2.5; alkaline liquid to fly ash ratios of 0.35 and 0.40; and
replacement of water in the Na2SiO3 solution by 10%, 20% and 30%. The test results indicated that the highest compressive strength,
54 MPa, was observed for 16M NaOH, a NaOH to Na2SiO3 ratio of 2.5 and an alkaline liquid to fly ash ratio of 0.35. The lowest
compressive strength, 27 MPa, was observed for 8M NaOH, a NaOH to Na2SiO3 ratio of 1 and an alkaline liquid to fly ash ratio of
0.40. An alkaline liquid to fly ash ratio of 0.35 with water replacement of 10% and 30% for 8M and 16M NaOH resulted in compressive
strengths of 36 MPa and 20 MPa respectively. A superplasticiser dosage of 2% by weight of fly ash gave higher strength in all cases.
Keywords: compressive strength, alkaline liquid, fly ash
Experimental investigation on circular hollow steel columns in filled with li...eSAT Journals
Abstract
Composite circular hollow steel tubes, with and without GFRP infill, for three different grades of lightweight concrete are tested for
ultimate load capacity and axial shortening under cyclic loading. Steel tubes are compared for different lengths, cross sections and
thicknesses. Specimens were tested after adopting Taguchi's L9 (Latin squares) orthogonal array in order to save initial experimental
cost on the number of specimens and the experimental duration. Analysis was carried out using an ANN (Artificial Neural Network)
technique with the assistance of Minitab, a statistical software tool. A comparison of predicted, experimental and ANN outputs is
obtained from linear regression plots. From this research study it can be concluded that (a) the cross-sectional area of the steel tube
has the most significant effect on ultimate load carrying capacity, (b) as the length of the steel tube increases, the load carrying
capacity decreases, and (c) ANN modelling predicted acceptable results. Thus the ANN tool can be utilized for predicting the ultimate
load carrying capacity of composite columns.
Keywords: Light weight concrete, GFRP, Artificial Neural Network, Linear Regression, Back propagation, orthogonal
Array, Latin Squares
Experimental behavior of circular hsscfrc filled steel tubular columns under ...eSAT Journals
Abstract
This paper presents an outlook on the experimental behaviour of circular, concentrically loaded, self-consolidating fibre reinforced
concrete filled steel tube (HSSCFRC) columns and a comparison with predictive formulae. Forty-five specimens were tested. The main
parameters varied in the tests are: (1) percentage of fibre; (2) tube diameter (or width) to wall thickness ratio, D/t, from 15 to 25;
(3) L/d ratio from 2.97 to 7.04. The results from these predictions were compared with, and validated against, the experimental data.
Keywords: Self-compacting concrete; Concrete-filled steel tube; axial load behavior; Ultimate capacity.
Evaluation of punching shear in flat slabseSAT Journals
Abstract
Flat-slab construction is widely used today because of the many advantages it offers. The basic philosophy in the design of flat slabs
is to consider only gravity forces; this method ignores the effect of punching shear due to unbalanced moments at the slab-column
junction, which is critical. An attempt has been made to generate generalized design sheets which account for both punching shear due
to gravity loads and unbalanced moments for the cases of (a) an interior column; (b) an edge column (bending perpendicular to the
shorter edge); (c) an edge column (bending parallel to the shorter edge); and (d) a corner column. These design sheets are prepared as
per the codal provisions of IS 456-2000. They will be helpful in calculating the shear reinforcement to be provided at the critical
section, which is ignored in many design offices. Apart from their usefulness in evaluating punching shear and the necessary shear
reinforcement, the design sheets developed will enable the designer to fix the depth of the flat slab during the initial phase of the
design.
Keywords: Flat slabs, punching shear, unbalanced moment.
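The gravity-load part of the check that such design sheets automate can be sketched for the interior-column case. The values below (critical section at d/2 from the column face, τc = 0.25√fck, ks = 0.5 + βc ≤ 1.0) follow the IS 456-2000 provisions; the function name and its interior-column-only scope are illustrative assumptions, not the paper's design sheets.

```python
import math

def punching_check_interior(Vu, a, b, d, fck):
    """IS 456-2000 style punching shear check, interior column, gravity load only.

    Vu  factored shear force (N); a, b column sides (mm); d effective depth (mm);
    fck characteristic concrete strength (N/mm^2).
    """
    b0 = 2 * ((a + d) + (b + d))          # critical perimeter at d/2 from the face
    tau_v = Vu / (b0 * d)                 # nominal shear stress
    beta_c = min(a, b) / max(a, b)        # short side / long side of column
    ks = min(1.0, 0.5 + beta_c)
    tau_c = 0.25 * math.sqrt(fck)         # design shear strength of concrete
    return tau_v, ks * tau_c, tau_v <= ks * tau_c
```

For a 450 mm square column with d = 150 mm, fck = 25 and Vu = 500 kN this gives τv ≈ 1.39 N/mm² against an allowable 1.25 N/mm², i.e. shear reinforcement or a deeper slab is needed.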
Evaluation of performance of intake tower dam for recent earthquake in indiaeSAT Journals
Abstract
Intake towers are typically tall, hollow, reinforced concrete structures that form the entrance to reservoir outlet works. A parametric
study on the dynamic behaviour of circular cylindrical towers can be carried out to study the effects of depth of submergence, wall
thickness and slenderness ratio, as well as the response of the tower under dynamic time-history analysis for different soil
conditions, accounting, following Goyal and Chopra, for the interaction effects of the added hydrodynamic mass of the water
surrounding and inside the intake tower of the dam.
Keywords: Hydrodynamic mass, Depth of submergence, Reservoir, Time history analysis
Evaluation of operational efficiency of urban road network using travel time ...eSAT Journals
Abstract
The efficiency of a road network system is analyzed using travel time reliability measures. The study focuses on important measures of
travel time reliability and on prioritizing the Tiruchirappalli road network. Traffic volume and travel time were collected using the
license plate matching method. Travel time measures were estimated from the average travel time and the 95th percentile travel time.
The effect of non-motorized vehicles on the efficiency of the road system was evaluated. A relation between the buffer time index and
traffic volume was established. A travel time model was developed and the travel time measure was validated. The service quality of
the road sections in the network was then graded based on the travel time reliability measures.
Keywords: Buffer Time Index (BTI); Average Travel Time (ATT); Travel Time Reliability (TTR); Buffer Time (BT).
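The buffer time index used above is conventionally the buffer time (95th percentile travel time minus average travel time) divided by the average travel time. A minimal sketch with a nearest-rank percentile approximation (function name illustrative):

```python
import statistics

def travel_time_measures(times):
    """Travel time reliability measures from a list of observed travel times."""
    att = statistics.mean(times)                     # average travel time (ATT)
    ranked = sorted(times)
    t95 = ranked[int(0.95 * (len(ranked) - 1))]      # nearest-rank 95th percentile
    buffer_time = t95 - att                          # extra time a traveller budgets
    bti = buffer_time / att                          # buffer time index (BTI)
    return att, t95, buffer_time, bti
```

A higher BTI means a traveller must budget proportionally more extra time to arrive on time 95% of the time, so links can be ranked directly by this index.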
Estimation of surface runoff in nallur amanikere watershed using scs cn methodeSAT Journals
Abstract
The development of watershed aims at productive utilization of all the available natural resources in the entire area extending from
ridge line to stream outlet. The per capita availability of land for cultivation has been decreasing over the years. Therefore, water and
the related land resources must be developed, utilized and managed in an integrated and comprehensive manner. Remote sensing and
GIS techniques are being increasingly used for planning, management and development of natural resources. The study area, Nallur
Amanikere watershed, geographically lies between 11°38′ and 11°52′ N latitude and 76°30′ and 76°50′ E longitude, with an area of
415.68 sq. km. Thematic layers such as land use/land cover and soil maps were derived from remotely sensed data and overlaid in
ArcGIS software to assign a curve number to each polygon. Daily rainfall data from six rain gauge stations in and around the watershed
(2001-2011) was used to estimate the daily runoff from the watershed using the Soil Conservation Service - Curve Number (SCS-CN)
method. The runoff estimated from the SCS-CN model was then used to examine the variation of runoff potential with different land
use/land cover and soil conditions.
Keywords: Watershed, Nallur watershed, Surface runoff, Rainfall-Runoff, SCS-CN, Remote Sensing, GIS.
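The daily runoff computation referred to here follows the standard SCS-CN relations Q = (P − Ia)² / (P − Ia + S), with S = 25400/CN − 254 in mm and Ia = λS (λ = 0.2 conventionally). A minimal sketch (function name illustrative):

```python
def scs_cn_runoff(p, cn, lam=0.2):
    """Daily runoff depth Q (mm) from rainfall p (mm) via the SCS Curve Number method."""
    s = 25400.0 / cn - 254.0       # potential maximum retention (mm)
    ia = lam * s                   # initial abstraction
    if p <= ia:
        return 0.0                 # all rainfall abstracted, no runoff
    return (p - ia) ** 2 / (p - ia + s)
```

For CN = 80, S = 63.5 mm, so a 50 mm storm yields about 13.8 mm of runoff, while a 10 mm storm (below Ia = 12.7 mm) yields none.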
Estimation of morphometric parameters and runoff using rs & gis techniqueseSAT Journals
Abstract
Land and water are the two vital natural resources, the optimal management of these resources with minimum adverse environmental
impact are essential not only for sustainable development but also for human survival. Satellite remote sensing with geographic
information system has a pragmatic approach to map and generate spatial input layers of predicting response behavior and yield of
watershed. Hence, in the present study an attempt has been made to understand the hydrological process of the catchment at the
watershed level by drawing inferences from morphometric analysis and runoff. The study area chosen is the Yagachi catchment, situated
in Chickamagalur and Hassan districts, lying geographically at longitude 75°52′08.77″ E and latitude 13°10′50.77″ N. It covers an
area of 559.493 sq. km. Morphometric analysis is carried out to estimate morphometric parameters at the micro-watershed level in
order to understand the hydrological response of the catchment at that level. Daily runoff is estimated using the USDA SCS curve
number model for a period of 10 years, from 2001 to 2010. The rainfall-runoff relationship of the study shows a positive correlation.
Keywords: morphometric analysis, runoff, remote sensing and GIS, SCS method
Effect of variation of plastic hinge length on the results of non linear anal...eSAT Journals
Abstract
The nonlinear static procedure, also well known as pushover analysis, is a method wherein monotonically increasing loads are applied to the structure until it is unable to resist any further load. It is a popular tool for the seismic performance evaluation of existing and new structures. In the literature a lot of research has been carried out on conventional pushover analysis, and, its deficiencies being known, efforts have been made to improve it. But actual test results with which to verify analytically obtained pushover results are rarely available. It has been found that some amount of variation is always to be expected in the seismic demand prediction of pushover analysis. An initial study is carried out considering user-defined hinge properties and the default hinge length. An attempt is then made to assess the variation in pushover analysis results by considering user-defined hinge properties and the various hinge length formulations available in the literature, with results compared against experimental results from a test carried out on a G+2 storied RCC framed structure. For the present study two geometric models, viz. a bare frame model and a rigid frame model, are considered, and it is found that the results of pushover analysis are very sensitive to the geometric model and the hinge length adopted.
Keywords: Pushover analysis, Base shear, Displacement, hinge length, moment curvature analysis
Effect of use of recycled materials on indirect tensile strength of asphalt c...eSAT Journals
Abstract
The depletion of natural resources and aggregate quarries makes procuring materials for road construction a serious problem; hence
recycling or reuse of material is beneficial. With the emphasis on sustainable construction in the present era, recycling of asphalt
pavements is one of the effective and proven rehabilitation processes. For the laboratory investigations, reclaimed asphalt pavement
(RAP) from NH-4 and crumb rubber modified binder (CRMB-55) were used. Foundry waste was used as a replacement for conventional
filler. Laboratory tests were conducted on asphalt concrete mixes with 30, 40, 50 and 60 percent replacement with RAP. These test
results were compared with conventional mixes and with asphalt concrete mixes made with fully binder-extracted RAP aggregates. Mix
design was carried out by the Marshall method. The Marshall tests indicated the highest stability values for asphalt concrete (AC)
mixes with 60% RAP. The optimum binder content (OBC) decreased with increase in RAP in the AC mixes. The indirect tensile strength
(ITS) of AC mixes with RAP was also found to be higher than that of conventional AC mixes at 30°C.
Keywords: Reclaimed asphalt pavement, Foundry waste, Recycling, Marshall Stability, Indirect tensile strength.
Welcome to WIPAC Monthly the magazine brought to you by the LinkedIn Group Water Industry Process Automation & Control.
In this month's edition, along with this month's industry news, and to celebrate the 13 years since the group was created, we have articles including:
A case study of the use of Advanced Process Control at the wastewater treatment works at Lleida in Spain
A look back at an article on smart wastewater networks, to see how the industry has measured up in the interim on the adoption of Digital Transformation in the Water Industry.
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Dr.Costas Sachpazis
Terzaghi's soil bearing capacity theory, developed by Karl Terzaghi, is a fundamental principle in geotechnical engineering used to determine the bearing capacity of shallow foundations. This theory provides a method to calculate the ultimate bearing capacity of soil, which is the maximum load per unit area that the soil can support without undergoing shear failure. The calculation HTML code is included.
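For a strip footing under general shear failure, Terzaghi's ultimate bearing capacity is q_u = c·Nc + γ·Df·Nq + 0.5·γ·B·Nγ. A minimal sketch (the function name is illustrative; the bearing capacity factors Nc, Nq, Nγ are normally read from tables as functions of the soil friction angle φ):

```python
def terzaghi_qu_strip(c, gamma, Df, B, Nc, Nq, Ngamma):
    """Ultimate bearing capacity (kPa) of a strip footing, Terzaghi general shear.

    c      cohesion (kPa)
    gamma  unit weight of soil (kN/m^3)
    Df, B  footing depth and width (m)
    Nc, Nq, Ngamma: bearing capacity factors for the soil friction angle
    """
    return c * Nc + gamma * Df * Nq + 0.5 * gamma * B * Ngamma

# Example: c = 25 kPa, gamma = 18 kN/m^3, Df = 1.5 m, B = 2 m,
# phi = 0 soil (Nc = 5.7, Nq = 1.0, Ngamma = 0)
qu = terzaghi_qu_strip(25, 18, 1.5, 2.0, 5.7, 1.0, 0.0)   # 169.5 kPa
```

For square or circular footings Terzaghi modifies the shape coefficients (e.g. 1.3·c·Nc and 0.4·γ·B·Nγ for a square footing).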
CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptxR&R Consult
CFD analysis is incredibly effective at solving mysteries and improving the performance of complex systems!
Here's a great example: At a large natural gas-fired power plant, where they use waste heat to generate steam and energy, they were puzzled that their boiler wasn't producing as much steam as expected.
R&R and Tetra Engineering Group Inc. were asked to solve the issue with reduced steam production.
An inspection had shown that a significant amount of hot flue gas was bypassing the boiler tubes, where the heat was supposed to be transferred.
R&R Consult conducted a CFD analysis, which revealed that 6.3% of the flue gas was bypassing the boiler tubes without transferring heat. The analysis also showed that the flue gas was instead being directed along the sides of the boiler and between the modules that were supposed to capture the heat. This was the cause of the reduced performance.
Based on our results, Tetra Engineering installed covering plates to reduce the bypass flow. This improved the boiler's performance and increased electricity production.
It is always satisfying when we can help solve complex challenges like this. Do your systems also need a check-up or optimization? Give us a call!
Work done in cooperation with James Malloy and David Moelling from Tetra Engineering.
More examples of our work https://www.r-r-consult.dk/en/cases-en/
About
Indigenized remote control interface card suitable for MAFI system CCR equipment. Compatible for IDM8000 CCR. Backplane mounted serial and TCP/Ethernet communication module for CCR remote access. IDM 8000 CCR remote control on serial and TCP protocol.
• Remote control: Parallel or serial interface.
• Compatible with MAFI CCR system.
• Compatible with IDM8000 CCR.
• Compatible with Backplane mount serial communication.
• Compatible with commercial and Defence aviation CCR system.
• Remote control system for accessing CCR and allied system over serial or TCP.
• Indigenized local Support/presence in India.
• Easy configuration using DIP switches.
Immunizing Image Classifiers Against Localized Adversary Attacksgerogepatton
This paper addresses the vulnerability of deep learning models, particularly convolutional neural networks
(CNN)s, to adversarial attacks and presents a proactive training technique designed to counter them. We
introduce a novel volumization algorithm, which transforms 2D images into 3D volumetric representations.
When combined with 3D convolution and deep curriculum learning optimization (CLO), it significantly improves
the immunity of models against localized universal attacks by up to 40%. We evaluate our proposed approach
using contemporary CNN architectures and the modified Canadian Institute for Advanced Research (CIFAR-10
and CIFAR-100) and ImageNet Large Scale Visual Recognition Challenge (ILSVRC12) datasets, showcasing
accuracy improvements over previous techniques. The results indicate that the combination of the volumetric
input and curriculum learning holds significant promise for mitigating adversarial attacks without necessitating
adversary training.
Cosmetic shop management system project report.pdfKamal Acharya
Buying new cosmetic products is difficult. It can even be scary for those who have sensitive skin and are prone to skin trouble. The information needed to alleviate this problem is on the back of each product, but it is tough to interpret those ingredient lists unless you have a background in chemistry.
Instead of buying and hoping for the best, we can use data science to help us predict which products may be good fits for us. The system includes various function programs to carry out the tasks mentioned above.
Data file handling has been used effectively in the program.
The automated cosmetic shop management system deals with the automation of the general workflow and administration processes of the shop. The main processes of the system focus on customer requests, where the system is able to search for the most appropriate products and deliver them to the customers. It helps employees to quickly identify the cosmetic products that have reached their minimum quantity, keeps track of the expiry date of each cosmetic product, and helps employees find the rack number in which a product is placed. It is also a faster and more efficient way of working.
Overview of the fundamental roles in Hydropower generation and the components involved in wider Electrical Engineering.
This paper presents the design and construction of hydroelectric dams from the hydrologist’s survey of the valley before construction, all aspects and involved disciplines, fluid dynamics, structural engineering, generation and mains frequency regulation to the very transmission of power through the network in the United Kingdom.
Author: Robbie Edward Sayers
Collaborators and co editors: Charlie Sims and Connor Healey.
(C) 2024 Robbie E. Sayers
Saudi Arabia stands as a titan in the global energy landscape, renowned for its abundant oil and gas resources. It's the largest exporter of petroleum and holds some of the world's most significant reserves. Let's delve into the top 10 oil and gas projects shaping Saudi Arabia's energy future in 2024.
Pile Foundation by Venkatesh Taduvai (Sub Geotechnical Engineering II)-conver...
An innovative idea to discover the trend on multi dimensional spatio-temporal datasets
IJRET: International Journal of Research in Engineering and Technology eISSN: 2319-1163 | pISSN: 2321-7308
Volume: 03 Issue: 03 | Mar-2014, Available @ http://www.ijret.org
AN INNOVATIVE IDEA TO DISCOVER THE TREND ON MULTI-DIMENSIONAL SPATIO-TEMPORAL DATASETS
N. Naga Saranya¹, M. Hemalatha²
¹Assistant Professor, Department of Computer Applications, Tamil Nadu, India
²Professor, Department of Computer Science, Tamil Nadu, India
Abstract
Spatio-temporal data is any information regarding space and time. It is updated frequently, at rates on the order of 1 TB/hr, which greatly challenges our ability to digest it; as a result, exact information cannot readily be extracted from it. This research therefore offers an innovative idea to discover trends in multi-dimensional spatio-temporal datasets. It briefly describes the scope and relevance of spatio-temporal data and, from that, builds in-depth knowledge of recent spatio-temporal research processes for trend discovery.
Keywords: Spatio-temporal Data, Applications of Spatio-temporal data, Problem Definition, Contributions
1. INTRODUCTION
Data mining is the analysis of observational data sets to find unsuspected relationships and to summarize the data in novel ways that
are both understandable and useful to the data owner. Extracting interesting and useful patterns from a spatio-temporal database is
more difficult than extracting the corresponding patterns from traditional (fixed) numeric and categorical (clear) data, due to the
complexity of spatio-temporal data types, spatio-temporal relationships and spatio-temporal autocorrelation. Spatio-temporal data is
any information relating space and time.
Geospatial data is regularly updated, at rates around 1 TB/hr, which greatly challenges our ability to digest it. It is hard to obtain
exact information from such data, and data may be lost. Geospatial data takes a view of geospatial phenomena, capturing their
spatiality; if the phenomena evolve over time, it captures their temporality as well. From this we are able to learn about the
processes and events of spatio-temporal data, and knowledge extracted from spatio-temporal data enables better prediction of those
processes and events.
Finding trends is one of the most important tasks in spatio-temporal data: in particular, the analysis of trajectory movements,
animal movements, mating behavior, harvesting, soil quality changes, earthquake histories, volcanic activity and its prediction, and
so on. Spatial, temporal and/or spatio-temporal data mining looks for patterns in data using the spatio-temporal attributes. Trend
discovery cannot be performed well directly; the proposed research therefore performs trend discovery by combining the principles of
clustering, association rule mining, generalization and characterization.
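The paper does not specify at this point how these principles are combined. Purely as an illustration of how a clustering step might feed a support-counting (association-rule-like) step, one could imagine something like the following, where grid cells stand in for clusters and every name is hypothetical:

```python
from collections import Counter, defaultdict

def trend_sketch(events, cell=1.0, min_support=2):
    """Toy trend discovery: grid-cluster events by location, then keep, per
    cluster, the attribute values whose support meets min_support."""
    clusters = defaultdict(list)
    for x, y, attrs in events:                  # events: (x, y, attribute tuple)
        clusters[(int(x // cell), int(y // cell))].append(attrs)
    trends = {}
    for cell_id, members in clusters.items():
        counts = Counter(a for attrs in members for a in attrs)
        trends[cell_id] = [a for a, n in counts.items() if n >= min_support]
    return trends
```

A real pipeline would replace the grid with a proper spatio-temporal clustering step and the support filter with full association rule mining, followed by generalization and characterization of the rules.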
A significant property of spatio-temporal data is that it is continuously updated, at around 1 TB/hr, producing terribly massive,
high-dimensional geographic and spatio-temporal datasets. Typical application areas include:
Meteorology
Biology
Crop sciences
Forestry
Medicine
Geophysics
Ecology
Transportation, etc.
The proposed algorithm can be applied to any kind of data mining application; when applied to spatio-temporal databases in
particular, it gives the most significant results. When the trend discovery algorithm is used, the retrieval rate is fast and the
results obtained are accurate.
The objective of this research is to develop a novel algorithm for trend discovery on multi-dimensional spatio-temporal databases.
The objectives are to:
Predict the knowledge
Retrieve hidden information
Discover trends
Enable future usage
2. LITERATURE REVIEW
2.1 Clustering in Spatio-Temporal Data
A literature survey has been made on clustering moving data objects in trajectory datasets on the basis of trajectory representation,
moving object points, similarity measures and clustering algorithms. The data can be viewed in three forms: spatial aspects, temporal
aspects and spatio-temporal aspects. The problem of clustering moving objects in a spatial network
environment is addressed by Jidong Chen et al. (2007). Since the clustering process is difficult, the authors propose a framework
divided into two parts: various methods are performed on CBs for the continuous maintenance and periodic construction of clusters.
Results show that the proposed framework performs well in road networks. Sheng-Tun Li and Shih-Wei Chou (2000) deploy multi-scale
wavelet transforms and a self-organizing map with neural networks to mine air pollutant data. The significance of the wavelet
transform method is that it generates data and time-variant features for detecting air pollutant spatial data in a specific time
variant. The method greatly refines local small regions using small-scale input data and improves over-smoothed regions using a
single large-scale input.
Moving objects in a trajectory database can be identified with the help of free-moving trajectory context and constrained
trajectories (Ahmed Kharrat et al., 2008). Spatial shapes are identified to evaluate the OWD (One Way Distance) in both the
continuous and discrete cases, and trajectory similarity measures are then discussed (Lin et al., 2005).
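In the discrete case, the One Way Distance can be taken as the mean nearest-point distance from one trajectory to the other, symmetrised over both directions; the continuous case of Lin et al. integrates along the curve instead. A minimal sketch (names illustrative):

```python
import math

def owd(t1, t2):
    """Discrete One Way Distance: mean nearest-point distance from t1 to t2."""
    return sum(min(math.dist(p, q) for q in t2) for p in t1) / len(t1)

def owd_distance(t1, t2):
    """Symmetrised OWD between two trajectories (lists of (x, y) points)."""
    return 0.5 * (owd(t1, t2) + owd(t2, t1))
```

Note that OWD is asymmetric on its own (owd(t1, t2) != owd(t2, t1) in general), which is why the two directions are averaged.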
The problem of moving objects across two datasets is investigated with the help of a time series algorithm (Hoda M. O. Mokhtar et
al., 2011), using both simulated and real data. A new similarity measure based on high-dimensional mobile trajectories is presented
(Sigal Elnekave et al., 2007). A uniform geometric model has been proposed (O. Gervasi et al., 2005) for clustering and querying
time-dependent data, with three-dimensional visualization tools to help the user visualize the clusters and the time taken for query
clustering, which are dynamic in nature.
A study on historical trajectories of moving objects is performed by proposing an algorithm to discover moving clusters (Kalnis et
al., 2005). A sequence of spatial clusters that appear in consecutive snapshots of the object movements is termed a moving cluster
when consecutive clusters in the sequence share most of their objects; such sequences can be detected by comparing clusters at
consecutive snapshots, at high time complexity. The problem of finding clusters that are both dynamic and continuous within 2D
Euclidean space is analyzed by Christian S. Jensen et al. (2007). Using this concept the user can easily generate the necessary
cluster features within reduced time constraints, although the insertion and deletion scheme of the clustering is not done
efficiently. Experimental results show that the proposed method outperforms other existing methods. Clusters of mobile object
trajectories for finding movement patterns based on cluster centroids are evaluated by Sigal Elnekave et al. (2007).
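A moving cluster in the sense of Kalnis et al. (2005) is a sequence of snapshot clusters whose consecutive members share a sufficient fraction of objects. The sketch below chains snapshot clusters greedily using a Jaccard threshold θ; this is a simplification of the original algorithm, and all names are illustrative:

```python
def moving_clusters(snapshots, theta=0.5):
    """Chain snapshot clusters (sets of object ids) into moving clusters when
    consecutive clusters overlap enough: |last & cur| / |last | cur| >= theta."""
    chains = []
    open_chains = []                      # chains that reached the previous snapshot
    for t, clusters in enumerate(snapshots):
        next_open, used = [], set()
        for cluster in clusters:
            for idx, (chain, last) in enumerate(open_chains):
                if idx not in used and len(last & cluster) / len(last | cluster) >= theta:
                    chain.append((t, cluster))    # extend an existing moving cluster
                    used.add(idx)
                    break
            else:
                chain = [(t, cluster)]            # start a new moving cluster
                chains.append(chain)
            next_open.append((chain, cluster))
        open_chains = next_open
    return chains
```

With θ = 0.5, the snapshots [{1,2,3}], [{2,3,4}], [{8,9}] yield one moving cluster spanning the first two timestamps and a second, separate cluster at the last one.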
To predict an object's location, its movement pattern is used. Precision and recall values are used to measure the performance of the
clusters, and threshold values are set to remove exceptional data points. Several existing clustering methods compute an overall
partition of the data sets so as to provide some meaningful information about clusters, but they may not be applicable to moving
object databases with huge, high-dimensional spatio-temporal data sets. An interesting approach called model-based concept clustering
(MCC) to identify moving objects in video surveillance is provided by Jeongkyu Lee et al. (2007). It includes three steps, namely
model formation, model-based concept analysis and concept graph generation. The results show that MCC works well in terms of quality
when compared to other existing methods.
The goal of trajectory clustering is to find similar movement traces. Many clustering methods have been proposed using different
distance measures between trajectories. The discovery of similar portions of sub-trajectories is addressed by Lee J.G. et al. (2007).
Note that a trajectory may have a long and complicated path: two trajectories that are similar in some sub-trajectories may not be
similar as a whole. Similar sub-trajectories are useful when one considers regions of special interest in the analysis. This
observation led to the development of a new sub-trajectory clustering algorithm named TRACLUS (Lee J.G. et al., 2007). This method
discovers clusters by grouping sub-trajectories based on density: each trajectory is first partitioned into line segments using the
Minimum Description Length (MDL) principle, and density-based clustering is then applied to the segments, so that the
sub-trajectories in a cluster are summarized over all the line segments in it.
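TRACLUS's partitioning phase selects characteristic points by an MDL cost balancing conciseness and precision. The sketch below substitutes a simpler rule, keeping a point when the perpendicular deviation from the current chord exceeds a threshold; this illustrates the partition-then-cluster structure but is not the MDL criterion itself:

```python
import math

def partition(points, eps=0.5):
    """Simplified trajectory partitioning in the spirit of TRACLUS: keep a point
    as characteristic when the perpendicular deviation from the current chord
    exceeds eps (the original uses an MDL cost instead of a threshold)."""
    def perp(p, a, b):
        # perpendicular distance from p to the line through a and b
        vx, vy = b[0] - a[0], b[1] - a[1]
        norm = math.hypot(vx, vy)
        if norm == 0:
            return math.dist(p, a)
        return abs(vx * (a[1] - p[1]) - vy * (a[0] - p[0])) / norm

    cps = [points[0]]
    start = 0
    for i in range(2, len(points)):
        dev = max(perp(points[j], points[start], points[i]) for j in range(start + 1, i))
        if dev > eps:
            cps.append(points[i - 1])     # previous point becomes characteristic
            start = i - 1
    cps.append(points[-1])
    return cps
```

The resulting segments between characteristic points are what a density-based step (DBSCAN-like, with a segment distance) would then group into clusters.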
The Predestination method, which uses the history of a driver's destinations along with driving behaviors to predict the progress of a trip, is described in (Krumm, J. and E. Horvitz, 2006). The driving behaviors comprise the driver's starting point and destinations, together with their efficiency in estimating trip times and vehicle position. Four different probabilistic cues and mathematical principles are combined to create a probability grid of expected destinations. The authors introduce an open-world model of destinations that helps the algorithm work well with small amounts of training data at the beginning of the training period, by accounting for the behavior of users (i.e., what they visited previously) and for unobserved locations, based on trends in the data and the background properties of the concerned locations. Over 3,667 different driving trips, the method shows an error of about two kilometers at the trip's halfway point. This technique can predict the driver's destination at any given point in time; however, under linear functions the object movement may exhibit different movement patterns.
A trajectory cluster contains similar periodic trajectories: trajectories in the same cluster have minimal bounding boxes (MBBs) that are close in space and time. The centroid of a cluster in a trajectory database groups similar trajectories to represent a movement pattern of a given object, since the genetic clustering algorithm handles trajectory data as bounded intervals (trajectories) rather than as numeric vectors.

IJRET: International Journal of Research in Engineering and Technology, eISSN: 2319-1163 | pISSN: 2321-7308, Volume: 03, Issue: 03, Mar-2014, Available @ http://www.ijret.org

A K-Means algorithm for clustering trajectories, based on the amount of data and its corresponding similarity measures in a spatio-temporal version, is also used by (Elnekave, S. et al., 2007). It makes use of a new centroid structure and updating method. It adapts an incremental clustering approach: the first training data window (e.g., trajectories from the first month of data collection) is clustered when no past data are available, and each subsequent window is clustered by reusing the previous clustering centroids, thereby improving the performance of successive incremental clustering processes (Elnekave, S. et al., 2007). Only limited updates are needed to track the movement behavior of the same object while it stays relatively stable.
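The incremental idea of seeding each window's clustering with the previous window's centroids might be sketched as follows; this is a generic 2-D K-Means, not the authors' centroid structure.

```python
def kmeans(points, centroids, iters=10):
    # Plain 2-D K-Means; passing the centroids obtained on the previous
    # data window as the initial `centroids` realises the incremental
    # seeding: the new window's clustering starts from last window's result.
    for _ in range(iters):
        groups = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2
                                + (p[1] - centroids[i][1]) ** 2)
            groups[i].append(p)
        centroids = [(sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
                     if g else c for g, c in zip(groups, centroids)]
    return centroids
```

With centroids carried over from a previous window, the new window's points snap to the refreshed cluster means in a few iterations.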
More recently (Spiliopoulou et al., 2006), a framework called MONIC has been proposed to model the traces of cluster transitions. Specifically, the data are clustered at varying timestamps using the bisecting K-means algorithm to detect the changes of clusters across timestamps. Unlike the above two works, which analyze the relations between clusters after the clusters are obtained, the aim here is to predict the possible cluster evolution in order to guide the clustering. Finally, we note that clustering of moving objects involves future-position modeling. In addition to the linear function model used in most work, a recent proposal considers nonlinear object movement (Tao, Y. et al., 2004).
The key aspect is to obtain a recursive motion function that predicts the future positions of a moving object from its positions in the recent past. However, this approach is more complex than the widely adopted linear model and complicates the analysis of several interesting spatio-temporal problems; thus, the linear model is used. Notably, no work on clustering appears in the literature devoted to kinetic data structures (Basch, J. et al., 1999).
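The linear motion model adopted here reduces to a one-line extrapolation; a minimal sketch, with hypothetical argument names:

```python
def predict_linear(pos, vel, t0, t):
    # Linear motion model: extrapolate the position at time t from the
    # last reported position `pos` (taken at time t0) and velocity `vel`.
    dt = t - t0
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
```

An object at (1, 2) moving at (0.5, −1) per time unit is predicted at (3, −2) four units later.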
Recently, a new framework has been proposed for segmenting trajectories into line segments (Lee, J.G. et al., 2008, and Lee, J.G. et al., 2007). These line segments are grouped together to build the clusters. However, time is not considered in this framework, so some line segments may be clustered together even though they are not "close" once time is considered. Moreover, such approaches for clustering moving objects cannot solve the flock pattern query since: (1) they use different criteria when joining the moving-object clusters of two consecutive time instances; (2) they employ clustering algorithms, and therefore no strong relationship between all elements is enforced; and (3) moving clustering does not require the same set of moving objects to stay in a cluster for the entire specified minimum duration.
(Anagnostopoulos et al., 2006) define the distance between two trajectory segmentations at time t as the distance between their rectangles at time t, and the distance between two segmentations as the sum of these distances over every time instant. The distance between the trajectory MBRs is a lower bound of the original distance between the raw data, which is an essential property for guaranteeing the correctness of results for most mining tasks. The similarity of trajectories along time is computed by analyzing the way the distance between trajectories varies (D'Auria et al., 2005). More precisely, for each time instant the positions of the moving objects at that moment are compared, and the resulting set of distance values is then aggregated.
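The per-instant rectangle distance and its summation over time can be sketched as follows; the (xmin, ymin, xmax, ymax) tuple layout is a hypothetical convention, not the authors' data structure.

```python
def rect_dist(r1, r2):
    # Minimum distance between two axis-aligned rectangles, each given
    # as (xmin, ymin, xmax, ymax); zero when the rectangles overlap.
    dx = max(r1[0] - r2[2], r2[0] - r1[2], 0.0)
    dy = max(r1[1] - r2[3], r2[1] - r1[3], 0.0)
    return (dx * dx + dy * dy) ** 0.5

def traj_dist(rects_a, rects_b):
    # Distance between two segmented trajectories: the sum, over time
    # instants, of the rectangle distances at each instant.  Because
    # rect_dist never exceeds the distance between any points inside
    # the rectangles, this lower-bounds the raw trajectory distance.
    return sum(rect_dist(a, b) for a, b in zip(rects_a, rects_b))
```

Overlapping rectangles contribute zero, which is exactly why the aggregate can only underestimate the raw distance, never overestimate it.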
The generic definition of clustering is usually redefined depending on the type of data to be clustered and the clustering objective. Different and scalable clustering algorithms have been proposed (Iwerks, Glenn S. et al., 2003, and Liu, Wenting et al., 2008). (Zhang, Qing and Lin, Xuemin, 2004) propose a clustering technique that uses a distance function combining both position and velocity differences; they employ the k-center clustering algorithm (Gonzalez, 1985) for histogram construction. (Sigal Elnekave et al., 2008) represent a trajectory as a list of minimal bounding boxes (MBBs).
The trajectory clustering problem is further investigated by (Dino Pedreschi, Margherita D'Auria and Mirco Nanni, 2004). In this work, a clustering technique called temporal focusing was introduced to exploit the intrinsic semantics of the temporal dimension and thereby improve the quality of trajectory clusters. Building on this, a density-based clustering algorithm was applied to trajectory data using a notion of distance between trajectories. Clustering of moving objects to optimize continuous spatio-temporal query execution is used by (Rimma V. Nehme and Elke A. Rundensteiner, 2006); here a moving cluster can be represented by a circle. A unified framework for Clustering Moving Objects in spatial Networks (CMON) is proposed by (Chen, Jidong et al., 2007). In this architecture the clustering process is divided into the continuous maintenance of cluster blocks (CBs) and the periodic construction of clusters with different criteria based on CBs. A CB groups a set of objects on a road segment that are in close proximity to each other both at present and in the near future.
Moving clusters are discovered from the historical trajectories of objects by (Panos Kalnis et al., 2005); important events such as collisions among micro-clusters are also identified. Based on the historical trajectories of moving objects, a new algorithm has been proposed in which a moving cluster is defined as a sequence of spatial clusters that appear in consecutive snapshots of the object movements, such that consecutive spatial clusters share a large number of common objects. A clustering technique (Hoda M. O. Mokhtar et al., 2011) is close to the partition-and-group framework proposed by (Lee, Jae-Gil et al., 2007), which partitions a trajectory into a set of line segments and then groups similar line segments together into a cluster.
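The snapshot-overlap definition of a moving cluster can be sketched directly; this is a naive chaining of per-snapshot clusters under a shared-object threshold, a simplification of the algorithm of Kalnis et al.

```python
def moving_clusters(snapshots, theta):
    # Chain spatial clusters (sets of object ids) across consecutive
    # snapshots: cluster c at time t+1 extends a moving cluster ending
    # in `last` when the shared-object ratio |last ∩ c| / |last ∪ c|
    # is at least theta.
    chains = [[c] for c in snapshots[0]]
    for snap in snapshots[1:]:
        extended = []
        for chain in chains:
            last = chain[-1]
            for c in snap:
                if len(last & c) / len(last | c) >= theta:
                    extended.append(chain + [c])
        chains = extended
    return chains
```

Three snapshots sharing most of their objects form one moving cluster spanning all three, while disjoint clusters chain into nothing.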
Clustering approaches that capture the common aspects of sub-trajectories from a trajectory database have been proposed by (Hoda M. O. Mokhtar et al., 2011). This makes the approaches more accurate than techniques that cluster based on whole-trajectory behavior. The proposed algorithms use the idea of recapping to create a list of clusters visited by different trajectory segments, built by an approximate clustering algorithm; the visited-cluster list is then used to predict future motion patterns in the trajectory database.
2.2 Outlier Detection in Spatio-Temporal Data
Data objects that deviate markedly from the rest of the database are called outliers (Barnett and Lewis, 1994). Such uncommon and remarkable data behavior is usually treated as anomalies or noise. Outlier detection methods are classified as distribution-based, depth-based, and distance-based. First, the distribution-based approach uses standard statistical distributions; secondly, the depth-based technique organizes the data objects in an n-dimensional space (where n is the total number of attributes in that space); and finally, the distance-based approach estimates the number of database objects within a specified distance of a target object (Ng, R.T., 2001). Spatial outliers are spatially referenced objects whose non-spatial attribute values differ from those of their spatial neighborhood. A spatial outlier need not differ from the entire set of objects; it differs only from the objects around its specified location (Shekhar et al., 2003). Distribution-based approaches have been used extensively for analyzing spatial outliers (Shekhar et al., 2003), including two types of detection methods, namely single-dimensional and multi-dimensional outlier detection.
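The distance-based approach above can be sketched as follows; this uses one common parameterization (a point is an outlier when too few others fall within a radius), which is an assumption rather than the exact definition of the cited work.

```python
def distance_based_outliers(points, d, frac):
    # Distance-based detection: a point is an outlier when fewer than
    # `frac` of the OTHER points lie within distance d of it.
    n = len(points)
    outliers = []
    for p in points:
        within = sum(1 for q in points if q is not p
                     and (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= d * d)
        if within / (n - 1) < frac:
            outliers.append(p)
    return outliers
```

A tight group of four points plus one far-away point flags only the distant point.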
The problem of the spatio-temporal outlier (STO), in which a spatio-temporal object differs greatly from its spatial and temporal neighbors in terms of their relationships, is addressed by (Tao Cheng et al., 2004). The STO is proposed to improve the handling of both the semantic and the dynamic aspects of spatio-temporal data in multi-scale environments.

Trajectory outliers are of two kinds. First, a whole moving object can be an outlier when it deviates from its neighborhood or does not follow similar paths; this can be detected by calculating the distance between pairs of moving objects. Secondly, an outlying sub-trajectory may not follow the common behavior of the other sub-trajectories. The latter raises many challenging issues and can be solved by the partition-and-detect framework (Lee, J.G. et al., 2008).
2.3 Features, Locations, and Patterns from Spatio-temporal Data
The main motivation of association rules is to find regularities between items, together with their support and confidence values, in huge volumes of data (Diansheng Guo and Jeremy Mennis, 2009). When generating rules using association rule mining, the choice of thresholds, i.e., the support and confidence values, can seriously affect the outcome (R.J. Kuo et al., 2011). To overcome this drawback, a new framework has been proposed in which the system assigns the thresholds automatically within the association rule mining technique, with higher working efficiency.

The normal rule generation steps for ordinary data sets also apply to spatio-temporal databases, with the addition of spatio-temporal predicates and properties. A number of authors have discussed rule mining in spatio-temporal data (Appice et al., 2003, and Han et al., 2001, and Koperski et al., 1995, and Mennis et al., 2005).
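Support and confidence, the two rule-quality measures discussed above, have standard definitions that can be computed directly; a minimal sketch over transactions represented as sets:

```python
def support(itemset, transactions):
    # Fraction of transactions that contain every item of the itemset.
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def confidence(lhs, rhs, transactions):
    # conf(lhs => rhs) = support(lhs ∪ rhs) / support(lhs).
    return support(lhs | rhs, transactions) / support(lhs, transactions)
```

For the rule {a} ⇒ {b} over four transactions where "a" appears three times and "a, b" together twice, the confidence is 2/3.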
However, finding features, locations, and patterns such as co-locations in spatio-temporal data sets is very difficult when generating rules with the standard association rule mining method (Shekhar and Huang, 2001). Co-located features are those that are frequently found together; for example, a certain species of bird tends to inhabit a certain type of tree. For finding spatio-temporal patterns and their co-locations, and for measuring them, new algorithms have been proposed (Huang et al., 2006, and Lu et al., 2008, and Shekhar et al., 2008). Finally, features are extracted from the spatio-temporal data sets and the trend is discovered using association rule mining (Zhang Xuewu, 2008).
Particle swarm optimization (PSO) first searches for the best fitness of each particle, using the minimum support and confidence values as thresholds (Manisha Gupta, 2011). Based on the literature, the author concluded that PSO can suggest suitable thresholds for the particle search. Spatio-temporal autocorrelation and regression using algebraic formulas have also been combined with spatio-temporal association rule mining (Corbett, J., 1985), where the spatio-temporal data structures and their autocorrelations are calculated by algebraic formulas.
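How PSO searches a threshold space can be illustrated with a minimal, generic swarm over [0, 1]² (e.g., candidate minimum-support and minimum-confidence values); the inertia and attraction coefficients below are conventional choices, not those of the cited work.

```python
import random

def pso(fitness, dim=2, n=10, iters=50, seed=1):
    # Minimal particle swarm: each particle keeps a personal best and is
    # pulled toward it and toward the swarm's global best each iteration.
    rng = random.Random(seed)
    xs = [[rng.random() for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    gbest = max(pbest, key=fitness)
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.5 * rng.random() * (pbest[i][d] - xs[i][d])
                            + 1.5 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(1.0, max(0.0, xs[i][d] + vs[i][d]))
            if fitness(xs[i]) > fitness(pbest[i]):
                pbest[i] = xs[i][:]
        gbest = max(pbest, key=fitness)
    return gbest
```

With a fitness that peaks at a hypothetical "good" threshold pair (0.3, 0.7), the swarm converges toward that region.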
The measurement of autocorrelation, and of frequent itemsets under autocorrelation, has been carried out with algebraic formulas over spatio-temporal data structures such as points, lines, curves, and regions (Jiangping Chen, 2008). Several authors have discussed the role of association rule mining in spatial mining, temporal mining, privacy-preserving mining, data stream mining, utility mining, etc., using optimization techniques (Maragatham, G., 2012). This survey guides upcoming researchers in understanding the growing role of association rule mining in different types of mining.
By applying association rules to spatio-temporal or non-spatio-temporal data, data in a specific format can be delivered (Imam Mukhlash and Benhard Sitohang, 2007). A chaotic particle swarm optimization (CPSO) method with an acceleration strategy has been proposed for optimization over spatio-temporal data (Cheng-Hong Yang, 2009). The acceleration strategy combined with chaotic particle swarm optimization (CPSO) yields accelerated chaotic particle swarm optimization (ACPSO). ACPSO effectively finds the cluster centers and the global fitness of a given particle, but the system is unable to find the best particle reliably in huge volumes of data.
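The "chaotic" ingredient in CPSO-style methods is typically a chaotic map used in place of the uniform random draws; a minimal sketch with the logistic map (the specific map used by the cited work is not stated here, so this is an illustrative assumption):

```python
def logistic_map(x0, n, r=4.0):
    # Logistic map x_{k+1} = r * x_k * (1 - x_k); with r = 4 it behaves
    # chaotically on (0, 1), and CPSO-style methods substitute such a
    # deterministic sequence for the uniform random numbers in PSO.
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs
```

Starting from x₀ = 0.3, the first iterate is 4 · 0.3 · 0.7 = 0.84, and every iterate stays inside (0, 1).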
A combined PSO and BP (back-propagation) neural network system has been developed to train the network; it achieves short training time, high accuracy in terms of precision and recall, and fast convergence (ZhongLiang Fu and Bin Wan, 2009). Rule generation and cluster-center finding using optimization techniques have also been discussed by a number of authors (Samadzadegan, F. and S. Saeedi, 2009). Here the optimization method (PSO) reduces time consumption and yields better performance while finding cluster centers and searching for the best fitness of each particle; however, when the local and global best particles become overly concentrated, it may fail to work properly.
A new framework has been proposed for finding the best fitness in terms of comprehensibility and accuracy using association rule mining (Veenu Mangat, 2010). This system reduces time and space complexity and finds optimal values for the variables in the optimization algorithm. Frequently co-located patterns are discovered for spatio-temporal Boolean features. The spatio-temporal co-location pattern problem is difficult for spatial association rules because there is no natural notion of transactions in continuous geographic space.
3. DRAWBACKS OF EXISTING SYSTEM
Clustering of spatio-temporal data is a difficult problem that spans various fields and applications.
o The major and common drawbacks are the high dimensionality of the data (2D or 3D), initial error propagation, and high-dimensional data complexity.
o For clustering space- and time-related data, spatio-temporal clustering methods focus on the specific characteristics of distributions in 2-D or 3-D space, while general-purpose high-dimensional clustering methods have limited power in recognizing spatio-temporal patterns that involve neighbors.
o In human-computer interaction, clustering techniques may not work properly for finding complex patterns in huge volumes of spatio-temporal data.
Outlier Detection - The spatio-temporal neighbors involved in spatio-temporal outliers are highly conflicting, even when the non-spatio-temporal values are normal for the rest of the objects of the same class.
o Spatio-temporal outlier detection may not work properly on non-spatial or non-temporal datasets. Even when the non-spatial attributes are separated out, it cannot predict the outliers from spatio-temporal datasets.
Rule mining in spatio-temporal data - The drawbacks of existing dependency analysis algorithms are that, when incorporating local search in high-dimensional spatio-temporal data, performance is very low and they face the diversity problem. In the optimization process, the existing algorithms are not able to cope with partial optima.
o In that stage, the speed and direction of the particle are easily disturbed, and the algorithm may not work properly in the remaining iterations.
o The existing algorithms also cannot solve the issues of high-dimensional, unsystematic datasets in the particle search and moving-object search stages.
Trend discovery in spatio-temporal data - The drawbacks of existing algorithms when discovering trends in spatio-temporal data are as follows:
o Anomalies occur in the existing system when it concentrates on graph alteration; information may be badly lost, so many more table scans may be needed.
o The generalization process has problems identifying the description of the data when handling path relations and their hierarchy.
o The spatio-temporal dependency process faces several problems when finding dependencies in large spatio-temporal data.
To overcome these drawbacks, we propose new algorithms for discovering the best trend in spatio-temporal data sets.
4. PROBLEM DEFINITION
The performance of trend discovery analysis on spatio-temporal data is hindered for the following reasons:
o Clustering of massive spatio-temporal data is cumbersome. Features of the data are often preselected, yet the properties of different features and feature combinations are not well investigated in huge spatio-temporal data sets. Finding appropriate features to form a cluster group is essential for better search.
o Cluster accuracy is low, so deviation and outlier detection analysis is required for high-dimensional spatio-temporal clustered data.
o Rule generation accuracy for spatio-temporal data is low when predicting the value of one attribute from the value of another over a period of time, so an optimization process is required for rule generation.
o A better generalization and characterization method is needed to obtain a compact description of the data.
o Trend discovery accuracy for spatio-temporal data is very low on high-dimensional data sets.
5. CONTRIBUTIONS OF THIS RESEARCH
WORK
Phase 1 addresses the challenges in segmenting the spatio-temporal data with the proposed PANN algorithm.
Phase 2 removes the deviations and outliers of spatio-temporal data accurately with the improved STOD technique.
Phase 3 generates the rules for dependency analysis of spatio-temporal data with the proposed IESO technique.
Phase 4 generalizes and characterizes the spatio-temporal data with the improved GAOI technique.
Phase 5: finally, the proposed GPLDE algorithm is developed for the further trend discovery of spatio-temporal data.
Fig 1 shows the entire framework of the proposed research.
Fig 1 Entire Framework of Proposed Research
6. CONCLUSIONS
This research offers an innovative approach to discovering trends in multi-dimensional spatio-temporal datasets. It briefly describes the scope and relevance of spatio-temporal data and, from the literature survey, lists a number of open issues. It also contributes several phases, where the output of each phase feeds the input of the next, providing in-depth knowledge of recent spatio-temporal research on trend discovery. The evaluation of the proposed algorithms will be presented in our next research paper.
REFERENCES
[1] Adam, N.R., V.P. Janeja and V. Atluri, 2004. Neighbourhood based detection of anomalies in high dimensional spatio-temporal sensor datasets. ACM Symposium on Applied Computing, Nicosia, Cyprus., 1: 576-583.
[2] Ai, C. and X. Chen, 2003. Efficient estimation of models with conditional moment restrictions containing unknown functions. Econometrica., 7: 1795-1843.
[3] Anagnostopoulos, A., M. Vlachos, M. Hadjieleftheriou, E. Keogh and P.S. Yu, 2006. Global Distance-Based Segmentation of Trajectories. KDD'06, Philadelphia, Pennsylvania, USA.
[4] Anselin, L. and A.K. Bera, 2002. Spatial dependence in linear regression models with an introduction to spatial econometrics. In: Ullah, A., Giles, D.E.A. (Eds.), Handbook of Applied Economics Statistics. Marcel Dekker, New York.
[5] Auroop. RR., Ganguly and Karsten Steinhaeuser, 2008.
Data Mining for Climate Change and Impacts. IEEE
International conference on Data mining workshop,
ICDMW., 385-394.
[6] Basile, R. and B. Gress, 2004. Semi-parametric spatial auto-covariance models of regional growth behavior in Europe. Mimeo. Dept. of Economics, UC, Riverside.
[7] Benkert, M., J. Gudmundsson, F. Hübner and T. Wolle, 2008. Reporting flock patterns. Comput. Geom. Theory Appl., 41(3): 111-125.
[8] Cheng-Hong Yang et al., 2009. Accelerated Chaotic Particle Swarm Optimization for Data Clustering. International Conference on Machine Learning and Computing, IPCSIT vol. 3, IACSIT Press, Singapore., 1: 249-253.
[9] Diansheng Guo and Jeremy Mennis, 2009. Spatial data
mining and geographic knowledge discovery—An
introduction. Computers, Environment and Urban
Systems., 33(6): 403-408.
[10] Dien J., K.M. Spencer and E. Donchin, 2005. Parsing
the "Late Positive Complex": Mental chronometry and
the ERP components that inhabit the neighborhood of
the P300. Psychophysiology.
[11] Elnekave, S., M. Last and O. Maimon, 2007. Incremental Clustering of Mobile Objects. STDM07, IEEE, Washington, DC, USA., 1: 422-432.
[12] Gudmundsson, J. and M. Van Kreveld, 2006.
Computing longest duration flocks in trajectory data. In
ACM GIS., 35–42.
[13] Hoda M. O. Mokhtar et al., 2011. A Time
Parameterized Technique for Clustering Moving Object
Trajectories, International Journal of Data Mining &
Knowledge Management Process (IJDKP)., 1(1): 14-
30.
[14] Hoda Mokhtar and Jianwen Su, 2005. A query language for moving object trajectories. In SSDBM: International Conference on Scientific and Statistical Database Management., 1: 173-182.
[15] Hoda, M. O. and Mokhtar, 2011. A Time
Parameterized Technique for Clustering Moving Object
Trajectories. International Journal of Data Mining &
Knowledge Management Process., 1(1): 1-17.
[16] Huang, Y., J. Pei and H. Xiong, 2006. Mining co-
location patterns with rare events from spatial data sets.
Geoinformatica., 10(3): 239–260.
[17] Imam Mukhlash and Benhard Sitohang, 2007. Spatial Data Preprocessing for Mining Spatial Association Rule with Conventional Association Mining Algorithms. International Conference on Electrical Engineering and Informatics, Institute of Technology Bandung, Indonesia., 1: 531-534.
[18] Jensen, C., D. Lin and B. Ooi, 2007. Continuous clustering of moving objects. IEEE Trans. Knowl. Data Eng., 19(9): 1161-1174.
[19] Jiangping Chen, 2008. An Algorithm About Association Rule Mining Based On Spatial Autocorrelation. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. XXXVII(B6b), Beijing.
[20] Jiawei Han and Micheline Kamber, 2009. Data mining
concepts and techniques. Morgan Kaufmann
Publishers., 354-359.
[21] Jidong Chen et al., 2007. Clustering Moving Objects in Spatial Networks. Proceedings of the 12th International Conference on Database Systems for Advanced Applications., 1: 611-623.
[22] Kang, Hye-Young, Joon-Seok and Ki-Joune, 2009. Similarity measures for trajectory of moving objects in cellular space. In SAC '09: Proceedings of the 2009 ACM Symposium on Applied Computing., 1: 1325-1330.
[23] Kelejian, H.H. and I.R. Prucha, 2010. Specification and estimation of spatial autoregressive models with autoregressive and heteroskedastic disturbances. Journal of Econometrics., 157(1): 53-67.
[24] Kuo. R.J., C.M. Chao and Y.T. Chiu, 2011.
Application of particle swarm optimization to
association rule mining. Applied Soft Computing., 11:
326–336.
[25] Lee, Jae-Gil, Han Jiawei and Whang Kyu-Young, 2007. Trajectory clustering: a partition-and-group framework. In SIGMOD '07: Proceedings of the 2007 ACM SIGMOD International Conference on Management of Data., 1: 593-604.
[26] Lin, X. and L.-F. Lee, 2010. GMM estimation of spatial autoregressive models with unknown heteroskedasticity. Journal of Econometrics., 157(1): 34-52.
[27] Lu, Y. and J-C. Thill, 2008. Cross-scale analysis of cluster correspondence using different operational neighborhoods. Journal of Geographical Systems., 10(3): 241-261.
[28] Manisha Gupta, 2011. Application of Weighted Particle Swarm Optimization in Association Rule Mining. International Journal of Computer Science and Informatics (IJCSI), ISSN (PRINT): 2231-5292., 1(3).
[29] Maragatham, G. et al., 2012. A Recent Review On Association Rule Mining. Indian Journal of Computer Science and Engineering (IJCSE)., 2(6): 831-836.
[30] Margaret H. Dunham., 2009. Data Mining Introductory
and Advanced Topics. Dorling Kindersley (India) Pvt.
Ltd., 1(1): 1-311.
[31] Richard Frank, Wen Jin and Martin Ester, 2009. Efficiently Mining Regional Outliers in Spatial Data., 1-18.
[32] Sigal Elnekave and Mark Last, 2007. A Compact Representation of Spatio-Temporal Data. Seventh IEEE International Conference on Data Mining - Workshops.
[33] Sigal Elnekave et al., 2007. Predicting Future
Locations Using Clusters' Centroids. Proceedings of
the 15th annual ACM international symposium on
Advances in geographic information systems., 55(1).
[34] Urška Demšar., Paul Harris., Chris Brunsdon., A.
Stewart Fotheringham and Sean McLoone, 2012.
Principal Component Analysis on Spatial Data: An
Overview, Annals of the Association of American
Geographers., 1:1-24.
[35] Veenu Mangat, 2010. Swarm Intelligence Based Technique for Rule Mining in the Medical Domain. International Journal of Computer Applications (0975-8887)., 4(1): 19-24.
[36] Yang, Z., C. Li and Y.K. Tse, 2006. Functional form
and spatial dependence in dynamic panels. Economics
Letters., 91: 138-145.
[37] Yiu, M.L. and N. Mamoulis, 2004. Clustering objects
on a spatial network. In Proc. ACM SIGMOD.,1: 443–
454.
[38] Zhang Xuewu, 2008. Association Rule Mining on Spatio-temporal Processes. Wireless Communications, Networking and Mobile Computing, WiCOM '08., 1: 1-4.