IJRET : International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academicians, Field Engineers, Scholars and Students of related fields of Engineering and Technology.
Clustering of MEDLINE documents using semi-supervised spectral clustering (eSAT Publishing House)
Outlier Detection using Reverse Nearest Neighbor for Unsupervised Data (ijtsrd)
Data mining has become one of the most popular technologies and has gained a lot of attention in recent times; with its growing popularity and usage come many problems, one of which is outlier detection: finding items in a dataset that do not conform to expected patterns. Key assumptions are used to distinguish outlier behavior from normal behavior. This paper applies the reverse nearest neighbor technique and examines the connection between hubs, antihubs, outliers and present unsupervised detection methods. With the kNN method it is possible to evaluate the influence of the outlier and antihub methods on real-life and synthetic datasets, providing insight into the use of the reverse neighbor count in unsupervised outlier detection. V. V. R. Manoj | V. Aditya Rama Narayana | A. Bhargavi | A. Lakshmi Prasanna | Md. Aakhila Bhanu, "Outlier Detection using Reverse Nearest Neighbor for Unsupervised Data", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-2, Issue-3, April 2018, URL: http://www.ijtsrd.com/papers/ijtsrd11406.pdf http://www.ijtsrd.com/computer-science/data-miining/11406/outlier-detection-using-reverse-neares-neighbor-for-unsupervised-data/v-v-r-manoj
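The antihub idea behind the paper can be sketched with a brute-force reverse k-NN count on made-up 2-D points (a toy illustration of the general technique, not the authors' implementation): points that appear in few other points' k-nearest-neighbor lists are flagged as outliers.

```python
import math

def knn_indices(points, i, k):
    """Indices of the k nearest neighbors of points[i] (excluding itself)."""
    dists = sorted(
        (math.dist(points[i], points[j]), j)
        for j in range(len(points)) if j != i
    )
    return [j for _, j in dists[:k]]

def reverse_nn_counts(points, k):
    """Reverse k-NN count: how many other points list each point as a
    neighbor. Low counts ("antihubs") mark likely outliers."""
    counts = [0] * len(points)
    for i in range(len(points)):
        for j in knn_indices(points, i, k):
            counts[j] += 1
    return counts

# A tight cluster plus one far-away point: the isolated point should
# receive the lowest reverse-neighbor count.
data = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
counts = reverse_nn_counts(data, k=2)
outlier = min(range(len(data)), key=lambda i: counts[i])
```

Real implementations replace the quadratic scan with an index structure, but the scoring rule is the same.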
CHARACTER AND IMAGE RECOGNITION FOR DATA CATALOGING IN ECOLOGICAL RESEARCH (cscpconf)
Data collection is an essential but manpower-intensive procedure in ecological research. The author developed an algorithm that incorporates two important computer vision techniques to automate data cataloging for butterfly measurements: Optical Character Recognition is used for character recognition and contour detection is used for image processing. Proper pre-processing is first done on the images to improve accuracy. Although there are limitations to Tesseract's detection of certain fonts, overall it can successfully identify words in basic fonts. Contour detection is an advanced technique that can be utilized to measure an image; shapes and mathematical calculations are crucial in determining the precise location of the points on which to draw the body and forewing lines of the butterfly. Overall, 92% accuracy was achieved by the program for the set of butterflies measured.
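In practice Tesseract (OCR) and a contour detector such as OpenCV's would supply the text and contour points; the geometric measurement step that follows can be sketched on a contour assumed to be already extracted (toy rectangle standing in for a detected butterfly region):

```python
import math

def contour_area(contour):
    """Shoelace formula: area of a polygon given as an ordered (x, y) list."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(contour, contour[1:] + contour[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def measurement_line(p, q):
    """Length of a line drawn between two landmark points
    (e.g. the endpoints of the forewing line)."""
    return math.dist(p, q)

# A 10x4 rectangular contour stands in for a detected region.
contour = [(0, 0), (10, 0), (10, 4), (0, 4)]
area = contour_area(contour)
wing = measurement_line((0, 0), (10, 4))
```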
Information Upload and Retrieval using SP Theory of Intelligence (INFOGAIN PUBLICATION)
Cloud computing has become an important aspect of today's technology, and storing data in the cloud is of high importance as the need for virtual space to hold massive amounts of data has grown over the years. However, upload and download times are limited by processing time, so this issue must be addressed to handle large data volumes. Another common problem is deduplication: as cloud services grow rapidly, increasingly large volumes of data are stored on remote cloud servers, and many of the stored files are duplicates because different users at different locations upload the same file. A recent survey by EMC estimates that about 75% of digital data in the cloud consists of duplicate copies. To overcome these two problems, this paper applies the SP theory of intelligence, using lossless compression of information, which makes big data smaller and thus reduces the problems of storing and managing large amounts of data.
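SP theory itself is beyond a short snippet, but the two storage problems the paper targets, duplicate uploads and uncompressed bulk, can be illustrated with a toy content-addressed store (stdlib only; `DedupStore` is a hypothetical name, not the paper's system):

```python
import hashlib
import zlib

class DedupStore:
    """Toy content-addressed store: identical uploads are kept once,
    and each stored blob is losslessly compressed."""
    def __init__(self):
        self.blobs = {}  # sha256 digest -> compressed bytes

    def upload(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        if key not in self.blobs:  # duplicate uploads cost no extra space
            self.blobs[key] = zlib.compress(data)
        return key

    def download(self, key: str) -> bytes:
        return zlib.decompress(self.blobs[key])

store = DedupStore()
k1 = store.upload(b"report.pdf contents" * 100)
k2 = store.upload(b"report.pdf contents" * 100)  # same file, another user
```

The second upload returns the same key and stores nothing new, which is the deduplication effect the abstract describes.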
Additive gaussian noise based data perturbation in multi level trust privacy ... (IJDKP)
This document discusses a technique called additive Gaussian noise based data perturbation for privacy preserving data mining. The technique introduces multiple perturbed copies of data for different trust levels of data miners to prevent diversity attacks. Gaussian noise is added to the original data and correlated between copies so that combining copies does not provide additional information about the original data. The goal is to limit what information adversaries can learn from individual or combined copies to within what the data owner intends to share, while still allowing accurate data mining. Experiments on banking customer data show the approach controls the normalized estimation error from individual and combined copies.
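One simple way to realize the "combining copies adds nothing" property is to nest the noise of less-trusted copies inside that of more-trusted ones; the sketch below illustrates this nesting idea with made-up values, though the paper's exact covariance construction may differ:

```python
import random

def perturbed_copies(original, sigmas, seed=0):
    """One perturbed copy per trust level (sigmas ascending; smallest
    noise = most trusted). Each copy's noise nests inside the next:
    noise_{i+1} = noise_i + fresh increment, so pooling several copies
    reveals no more than the most trusted copy alone."""
    rng = random.Random(seed)
    copies = []
    noise = [0.0] * len(original)
    prev = 0.0
    for sigma in sigmas:
        inc = (sigma ** 2 - prev ** 2) ** 0.5  # std-dev of the increment
        noise = [n + rng.gauss(0.0, inc) for n in noise]
        copies.append([x + n for x, n in zip(original, noise)])
        prev = sigma
    return copies

data = [42.0, 17.5, 63.2]
high_trust, low_trust = perturbed_copies(data, sigmas=[1.0, 5.0])
```

Because the low-trust copy equals the high-trust copy plus independent noise, an adversary holding both learns nothing beyond the high-trust copy.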
A genetic algorithm approach for predicting ribonucleic acid sequencing data ... (TELKOMNIKA JOURNAL)
Malaria parasites have a complex, variable lifecycle as they spread across numerous mosquito vector stages, and transcriptomes arise in thousands of diverse parasites. Ribonucleic acid sequencing (RNA-seq) is a prevalent gene expression technique that has led to enhanced understanding of genetic questions. RNA-seq measures transcript-level gene expression and provides methodological enhancements to machine learning procedures. Researchers have proposed several methods for evaluating and learning from biological data. This study uses a genetic algorithm (GA) as a feature selection process to fetch relevant information from the RNA-Seq Anopheles gambiae malaria vector dataset, and evaluates the results using k-nearest neighbor (KNN) and decision tree classification algorithms. The experiments obtained classification accuracies of 88.3% and 98.3%, respectively.
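A minimal GA-plus-KNN pipeline of the kind described can be sketched on synthetic data (this illustrates the general technique only; it is not the study's code or dataset):

```python
import random

def knn_accuracy(X, y, mask):
    """Leave-one-out 1-NN accuracy using only features where mask is 1."""
    feats = [i for i, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best = min((j for j in range(len(X)) if j != i),
                   key=lambda j: sum((X[i][f] - X[j][f]) ** 2 for f in feats))
        correct += y[best] == y[i]
    return correct / len(X)

def ga_select(X, y, n_feats, pop=12, gens=15, seed=1):
    """Toy GA: bitmask individuals, truncation selection,
    one-point crossover, occasional bit-flip mutation."""
    rng = random.Random(seed)
    popn = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(popn, key=lambda m: knn_accuracy(X, y, m), reverse=True)
        parents = scored[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_feats)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:  # mutation
                k = rng.randrange(n_feats)
                child[k] ^= 1
            children.append(child)
        popn = parents + children
    return max(popn, key=lambda m: knn_accuracy(X, y, m))

# Synthetic data: feature 0 carries the class signal, features 1-3 are noise.
rng = random.Random(0)
X = [[cls * 5 + rng.gauss(0, 1)] + [rng.gauss(0, 1) for _ in range(3)]
     for cls in (0, 1) for _ in range(15)]
y = [cls for cls in (0, 1) for _ in range(15)]
best_mask = ga_select(X, y, n_feats=4)
```

The GA should converge on masks that keep the informative feature and drop the noise, mirroring the feature selection role it plays in the study.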
This document summarizes a research paper that proposes using a genetic algorithm to efficiently cluster wireless sensor nodes. The genetic algorithm aims to minimize the total communication distance between sensors and the base station in order to prolong the network lifetime. Simulation results showed that the genetic algorithm can quickly find good clustering solutions that reduce energy consumption compared to previous clustering methods. The full paper provides details on wireless sensor networks, related clustering algorithms, genetic algorithms, and the proposed genetic algorithm-based clustering method.
Data collection in multi application sharing wireless sensor networks (Pvrtechnologies Nellore)
- This document discusses algorithms for minimizing data collection in wireless sensor networks that are shared by multiple applications. It introduces the interval data sharing problem, where each application requires continuous interval data sampling rather than single data points.
- The problem is formulated as a non-linear, non-convex optimization problem. A 2-factor approximation algorithm is proposed with time complexity O(n^2) and memory complexity O(n) to address the high complexity of solving the optimization problem on resource-constrained sensor nodes.
- A special case where sampling intervals are the same length is analyzed, and a dynamic programming algorithm is provided that runs in optimal O(n^2) time and O(n) memory. Three online algorithms
Protected Data Collection In WSN by Filtering Attackers Influence (Published ...) (sangasandeep)
This document discusses secure data aggregation in wireless sensor networks. It presents three approaches to secure data aggregation: hop-by-hop encryption, end-to-end encryption, and privacy homomorphism. A general framework is also proposed that uses clustering to perform secure and energy-efficient data aggregation across sensor nodes. The framework applies end-to-end symmetric cryptography using privacy homomorphism to encrypt data before sending it to cluster heads. This helps prevent attackers from accessing plaintext sensor data during transmission and aggregation.
The anonymization techniques called generalization and bucketization have been designed to provide data privacy and to preserve microdata publishing. Recent work shows that generalization loses a considerable amount of information on high-dimensional data, and that bucketization does not prevent membership disclosure because of the clear separation between quasi-identifying attributes and sensitive attributes. A slicing technique, which partitions the data both horizontally and vertically, is proposed together with entity resolution; it preserves data utility and provides membership disclosure protection. Slicing admits an efficient algorithm satisfying the l-diversity requirement. Workload experiments with a sensitive attribute confirm that slicing provides better utility than generalization and is more effective than bucketization. As an extension, a technique called overlapped slicing is proposed, where attributes may be placed in more than one column, so that the release in each column captures more attribute correlations.
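The core slicing step, grouping attributes into columns, bucketing tuples, and permuting column values within each bucket, can be sketched as follows (toy table and column grouping invented for illustration):

```python
import random

def slice_table(rows, columns, bucket_size, seed=0):
    """Slicing: attributes are grouped into columns, tuples into buckets,
    and each column's values are permuted independently inside every
    bucket, breaking the linkage between quasi-identifiers and the
    sensitive value while keeping within-column correlations."""
    rng = random.Random(seed)
    sliced = []
    for start in range(0, len(rows), bucket_size):
        bucket = rows[start:start + bucket_size]
        shuffled = []
        for col in columns:  # e.g. [("age", "zip"), ("disease",)]
            vals = [tuple(r[a] for a in col) for r in bucket]
            rng.shuffle(vals)
            shuffled.append(vals)
        for parts in zip(*shuffled):
            sliced.append([v for part in parts for v in part])
    return sliced

rows = [{"age": 25, "zip": 47906, "disease": "flu"},
        {"age": 29, "zip": 47906, "disease": "cold"},
        {"age": 41, "zip": 47301, "disease": "flu"},
        {"age": 47, "zip": 47301, "disease": "ulcer"}]
published = slice_table(rows, columns=[("age", "zip"), ("disease",)],
                        bucket_size=2)
```

Overlapped slicing would additionally let an attribute appear in more than one column; that extension is not shown here.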
TUPLE VALUE BASED MULTIPLICATIVE DATA PERTURBATION APPROACH TO PRESERVE PRIVA... (IJDKP)
Huge volumes of data are regularly generated by domain-specific applications such as medical, financial, library, telephone and shopping records, and by individuals. Sharing this data has proved beneficial for data mining applications: on the one hand such data is an important asset for business decision making through analysis; on the other hand privacy concerns may prevent data owners from sharing information for analysis. In order to share data while preserving privacy, the data owner must come up with a solution that achieves the dual goals of privacy preservation and accuracy for the data mining tasks of clustering and classification. An efficient and effective approach is proposed that aims to protect the privacy of sensitive information while obtaining data clustering with minimum information loss.
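A standard example of multiplicative perturbation is multiplication by an orthogonal (rotation) matrix, which hides raw values while preserving the pairwise distances that clustering and classification rely on; the paper's tuple-value-based variant differs in detail, so this is only the core property:

```python
import math

def rotate(points, theta):
    """Multiplicative perturbation: multiply each record by an orthogonal
    (rotation) matrix. Pairwise distances are preserved, so distance-based
    clustering gives the same result on perturbed data."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

original = [(1.0, 2.0), (4.0, 6.0), (-3.0, 0.5)]
perturbed = rotate(original, theta=1.234)

# Distances survive the perturbation even though raw values are hidden.
d0 = math.dist(original[0], original[1])
d1 = math.dist(perturbed[0], perturbed[1])
```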
Data Leakage Detection and Security Using Cloud Computing (IJERA Editor)
The data owner stores data in the cloud, every user must register with the cloud, and the cloud provider must verify that a user is authorized. If someone else gains access to an account, the data may be leaked and later found in an unauthorized place (e.g., on the internet or on someone's laptop). This paper proposes Division and Replication of Data in the Cloud for Optimal Performance and Security (DROPS), which addresses security and performance issues together. In the DROPS methodology, a selected file is stored in the cloud account and, to provide security, divided into fragments based on a threshold value. Each fragment is stored on a node chosen using T-coloring, and after placement every fragment is replicated once in the cloud.
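The fragmentation-and-placement flow can be sketched as below; the `gap` spacing is a simplified stand-in for the T-coloring constraint (keeping successive fragments off neighboring nodes), not the DROPS algorithm itself:

```python
def fragment(data: bytes, threshold: int):
    """Split a file into fragments no larger than `threshold` bytes."""
    return [data[i:i + threshold] for i in range(0, len(data), threshold)]

def place(fragments, n_nodes, gap=2):
    """Hypothetical placement step: successive fragments are separated by
    `gap` nodes, so no single compromised node (or pair of neighbors)
    holds adjacent pieces of the file."""
    placement = {}
    node = 0
    for idx, _ in enumerate(fragments):
        placement[idx] = node
        node = (node + gap + 1) % n_nodes
    return placement

frags = fragment(b"confidential report" * 10, threshold=64)
where = place(frags, n_nodes=7)
```

A replication pass would then copy each fragment once more onto another node chosen under the same spacing constraint.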
This document summarizes a research paper on developing an improved LEACH (Low-Energy Adaptive Clustering Hierarchy) communication protocol for energy efficient data mining in multi-feature sensor networks. It begins with background on wireless sensor networks and issues like energy efficiency. It then discusses the existing LEACH protocol and its drawbacks. The proposed improved LEACH protocol includes cluster heads, sub-cluster heads, and cluster nodes to address LEACH's limitations. This new version aims to minimize energy consumption during cluster formation and data aggregation in multi-feature sensor networks.
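For reference, the election step that LEACH and its variants build on uses the threshold T(n) = p / (1 - p (r mod 1/p)) for nodes that have not yet served as cluster head in the current epoch; a minimal sketch with hypothetical parameter values (the paper's sub-cluster-head extension is not shown):

```python
import random

def leach_threshold(p, r):
    """LEACH cluster-head threshold T(n) = p / (1 - p * (r mod 1/p)),
    where p is the desired cluster-head fraction and r the round number."""
    return p / (1 - p * (r % (1 / p)))

def elect_heads(node_ids, p, r, seed=0):
    """Each eligible node becomes cluster head for round r when its
    random draw falls below the threshold."""
    rng = random.Random(seed)
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]

heads = elect_heads(range(100), p=0.05, r=0)
```

As r approaches the end of an epoch the threshold rises toward 1, guaranteeing every node eventually serves as head once per epoch.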
A survey on location based search using spatial inverted index method (eSAT Journals)
Abstract: Conventional spatial queries, such as nearest neighbor retrieval and range search, involve conditions only on objects' geometric properties. Today, however, many modern applications support new kinds of queries that aim to find objects satisfying both a spatial condition and a condition on their associated text. For example, instead of considering all hotels, a nearest neighbor query might ask for the hotel that is nearest among those offering services such as a pool and internet at the same time. For this kind of query, a variant of the inverted index that is effective for multidimensional points is employed: an R-tree is built on each inverted list, and a minimum bounding method is used so that nearest neighbor queries with keywords can be answered in real time. Keywords: spatial database, nearest neighbor search, spatial index, keyword search.
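Such a query can also be answered, inefficiently, by a full scan; the sketch below shows the query semantics that the spatial inverted index accelerates (hotel data invented for illustration):

```python
import math

def nearest_with_keywords(query_pt, required, objects):
    """Brute-force spatial-keyword nearest neighbor: among objects whose
    text contains every required keyword, return the one closest to the
    query point. A spatial inverted index (R-tree per inverted list)
    answers the same query without scanning every object."""
    required = set(required)
    candidates = [o for o in objects if required <= set(o["tags"])]
    return min(candidates, key=lambda o: math.dist(query_pt, o["pos"]),
               default=None)

hotels = [{"name": "A", "pos": (1.0, 1.0), "tags": {"pool"}},
          {"name": "B", "pos": (2.0, 2.0), "tags": {"pool", "internet"}},
          {"name": "C", "pos": (9.0, 9.0), "tags": {"pool", "internet"}}]
best = nearest_with_keywords((0.0, 0.0), {"pool", "internet"}, hotels)
```

Hotel A is closer to the query point but lacks a required keyword, so the qualifying hotel B is returned.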
This document summarizes a research paper that proposes a new density-based clustering technique called Triangle-Density Based Clustering Technique (TDCT) to efficiently cluster large spatial datasets. TDCT uses a polygon approach where the number of data points inside each triangle of a polygon is calculated to determine triangle densities. Triangle densities are used to identify clusters based on a density confidence threshold. The technique aims to identify clusters of arbitrary shapes and densities while minimizing computational costs. Experimental results demonstrate the technique's superiority in terms of cluster quality and complexity compared to other density-based clustering algorithms.
This document summarizes a research paper that proposes the design and implementation of an intelligent laser warning system using fuzzy logic. The system uses four laser sensors to detect the incident laser angle from 0 to 360 degrees and an additional sensor to distinguish the laser from background sunlight. A fuzzy logic algorithm is used to fuse the sensor data and estimate the angle of incidence. The system is first simulated in MATLAB and then implemented using a TI-430 microcontroller. The goal is to develop a low-cost laser detection system that can accurately detect laser threats and distinguish them from other light sources like the sun.
The document proposes a Modified Pure Radix Sort algorithm for large heterogeneous datasets. The algorithm divides the data into numeric and string processes that work simultaneously. The numeric process further divides data into sublists by element length and sorts them simultaneously using an even/odd logic across digits. The string process identifies common patterns to convert strings to numbers that are then sorted. This optimizes problems with traditional radix sort through a distributed computing approach.
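The plain LSD radix sort that the modified algorithm builds on looks like this (the paper's parallel numeric/string split and pattern-based string conversion are not shown):

```python
def radix_sort(nums, base=10):
    """LSD radix sort for non-negative integers: a stable counting pass
    on each digit, least significant first."""
    if not nums:
        return []
    digits = len(str(max(nums)))
    for d in range(digits):
        buckets = [[] for _ in range(base)]
        for n in nums:
            buckets[(n // base ** d) % base].append(n)
        nums = [n for b in buckets for n in b]
    return nums

mixed = [170, 45, 75, 90, 802, 24, 2, 66]
sorted_nums = radix_sort(mixed)
```

Stability of each digit pass is what makes the least-significant-first ordering correct, and it is also what the modified algorithm's per-sublist parallelism must preserve.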
Current Issue - January 2022, Volume 14, Number 1 - International Journal of ... (IJCNCJournal)
The International Journal of Computer Networks & Communications (IJCNC) is a bimonthly open-access peer-reviewed journal that publishes articles contributing new results in all areas of Computer Networks & Communications. The journal focuses on all technical and practical aspects of computer networks and data communications. Its goal is to bring together researchers and practitioners from academia and industry to focus on advanced networking concepts and to establish new collaborations in these areas.
A Reliable Routing Technique for Wireless Sensor Networks (Editor IJCATR)
A Wireless Sensor Network (WSN) consists of a very large number of sensor nodes deployed close to the area to be monitored in order to sense various environmental conditions. A WSN is a data-driven network that produces large amounts of data, and sensor nodes are energy-limited devices whose energy consumption is mainly associated with data routing; it is therefore necessary to aggregate redundant data to save energy. In this work, data aggregation is achieved with the help of two key approaches, a clustering approach and in-network data aggregation, which together save energy and thereby increase the lifetime of the network. The proposed work has key features that enhance network lifetime: reliable cluster formation, a high data aggregation rate, packet prioritization, minimized overhead, multiple routes and reduced energy consumption. The performance of the proposed approach is evaluated using Network Simulator version 2 (NS-2).
IRJET- Swift Retrieval of DNA Databases by Aggregating Queries (IRJET Journal)
This document summarizes a research paper that proposes a new method for securely sharing and querying genomic DNA sequences stored in the cloud without violating privacy. The method builds on existing frameworks by offering deterministic results with zero error probability, and a scheme that is twice as fast but uses twice the storage space, which is preferable given cloud storage pricing. The encoding of the data supports a richer set of query types beyond exact matching, including counting matches, logical OR matches, handling ambiguities, threshold queries, and concealing results from the decrypting server. Linear and logistic regression algorithms are used to analyze the data. The literature review discusses previous work on securely sharing genomic data and transforming protocols to ensure accountability without compromising privacy.
A Secure and Dynamic Multi-keyword Ranked Search Scheme over Encrypted Cloud ... (1crore projects)
IEEE PROJECTS 2015: 1 Crore Projects is a guide and provider for IEEE and real-time projects, offering project guidance and technology training to final-year engineering students (contact: 1croreprojects@gmail.com, 1croreprojects.com).
Privacy Preserving Reputation Calculation in P2P Systems with Homomorphic Enc... (IJCNCJournal)
This document discusses a method for privacy-preserving reputation calculation in peer-to-peer systems using homomorphic encryption. Specifically, it proposes:
1) Extending the EigenTrust reputation system to calculate node reputations in a distributed manner while preserving evaluator privacy, by successively updating encrypted reputation values so that they reflect trust values without disclosing the originals.
2) Improving calculation efficiency by offloading parts of the task to participating nodes, and using different public keys during calculation to improve robustness against node churn.
3) Evaluating the performance of the proposed method, finding that it halves the maximum circulation time for aggregating multiplication results, reducing computation time per round. The privacy preservation cost scales
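The additive homomorphism that such protocols rely on can be shown with a toy Paillier instance (tiny, insecure parameters chosen purely for illustration; the paper's distributed protocol layers key handling and circulation on top):

```python
from math import gcd

# Toy Paillier keypair (tiny primes, illustration only).
p, q = 17, 19
n = p * q               # 323
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)    # with g = n + 1, mu = lam^-1 mod n

def encrypt(m, r):
    """Enc(m) = (n+1)^m * r^n mod n^2; r must be coprime to n."""
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    """Dec(c) = L(c^lam mod n^2) * mu mod n, where L(x) = (x-1)/n."""
    return (pow(c, lam, n2) - 1) // n * mu % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts,
# so an aggregator can sum trust values it cannot read.
c = encrypt(5, 7) * encrypt(12, 11) % n2
total = decrypt(c)
```

Real deployments use primes of hundreds of digits and fresh random r per encryption; the homomorphic property used for reputation aggregation is exactly the ciphertext multiplication shown.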
Book of abstracts, Volume 8, No. 9, IJCSIS, December 2010 (Oladokun Sulaiman)
The International Journal of Computer Science and Information Security (IJCSIS) is a publication venue for novel research in computer science and information security. This issue from December 2010 contains 5 research papers. The first paper proposes a 128-bit chaotic hash function that uses the logistic map and MD5/SHA-1 hashes. The second paper discusses constructing an ontology for representing human emotions in videos to improve video retrieval. The third paper proposes an intelligent memory controller for H.264 encoders to reduce external memory access. The fourth paper investigates the impact of fragmentation on query performance in distributed databases. The fifth paper examines the effect of guard intervals in a proposed MIMO-OFDM system for wireless communication.
Building Programming Abstractions for Wireless Sensor Networks Using Watershe... (M H)
The availability and quality of information extracted from Wireless Sensor Networks (WSNs) revolutionised a wide range of application areas. The success of any WSN application is, nonetheless, determined by the ability to retrieve information with the required level of accuracy, within specified time constraints, and with minimum resource utilisation. This paper presents a new approach to localised information extraction that utilises the Watershed segmentation algorithm to dynamically group nodes into segments, which can be used as programming abstractions upon which different query operations can be performed. Watershed results in a set of well delimited areas, such that the number of necessary operations (communication and computation) to answer a query are minimised. This paper presents a fully asynchronous Watershed implementation, where nodes can compute their local data in parallel and independently from one another. The preliminary experimental results demonstrate that the proposed approach is able to significantly reduce the query processing cost and time without involving any loss of efficiency.
Smart Spaces and Next Generation Wired/Wireless Networking
The document describes the synthesis and morphology of silicon nanoparticles deposited on a silicon dioxide substrate using low-pressure chemical vapor deposition with varying deposition times. Atomic force microscopy and image analysis software were used to characterize the nanoparticles, finding that their height, density, and size varied with deposition time: heights between 1-3 nm, densities from 2×10^11 to 3.5×10^11 particles/cm^2, and sizes of 2-10 nm. The goal was to study how the morphological and electrical characteristics of the nanoparticles changed with different deposition parameters.
Comparative studies on flotation of kasolite using cationic and anionic surfa...eSAT Publishing House
Data collection in multi application sharing wireless sensor networksPvrtechnologies Nellore
- This document discusses algorithms for minimizing data collection in wireless sensor networks that are shared by multiple applications. It introduces the interval data sharing problem, where each application requires continuous interval data sampling rather than single data points.
- The problem is formulated as a non-linear, non-convex optimization problem. A 2-factor approximation algorithm is proposed with time complexity O(n^2) and memory complexity O(n) to address the high complexity of solving the optimization problem on resource-constrained sensor nodes.
- A special case where sampling intervals are the same length is analyzed, and a dynamic programming algorithm is provided that runs in optimal O(n^2) time and O(n) memory. Three online algorithms
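As a toy illustration of why sharing interval requests helps, overlapping sampling intervals from different applications can be merged so each time span is sampled only once. This is a much simpler greedy step than the paper's approximation and dynamic-programming algorithms; all names and data here are illustrative:

```python
def merge_intervals(intervals):
    """Merge overlapping sampling intervals so shared spans are sampled once."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            # Overlaps the previous interval: extend it instead of re-sampling
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

requests = [(0, 5), (3, 9), (12, 15), (14, 20)]
print(merge_intervals(requests))  # [(0, 9), (12, 20)]
```

Four requests collapse to two sampling spans; the paper's algorithms solve the harder problem of choosing sampling points and rates under energy constraints.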
Protected Data Collection In WSN by Filtering Attackers Influence (Published ...sangasandeep
This document discusses secure data aggregation in wireless sensor networks. It presents three approaches to secure data aggregation: hop-by-hop encryption, end-to-end encryption, and privacy homomorphism. A general framework is also proposed that uses clustering to perform secure and energy-efficient data aggregation across sensor nodes. The framework applies end-to-end symmetric cryptography using privacy homomorphism to encrypt data before sending it to cluster heads. This helps prevent attackers from accessing plaintext sensor data during transmission and aggregation.
The various anonymization techniques, called generalization and bucketization, have been designed
for providing data privacy and preserving microdata publishing. Recent work shows that generalization
loses a considerable amount of information on high-dimensional data, and that bucketization does not prevent
membership disclosure or provide a clear separation between quasi-identifying attributes and sensitive attributes.
A slicing technique, which partitions the data both horizontally and vertically, is proposed with entity resolution;
it preserves data utility and provides membership disclosure protection. Slicing admits an efficient algorithm
that satisfies the ℓ-diversity requirement. Workload experiments with sensitive attributes show that slicing
provides better utility than generalization and is more effective than bucketization. As an extension, we
propose a technique called overlapped slicing, where an attribute may be assigned to more than one column,
so that the release in each column preserves more attribute correlations.
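A minimal sketch of the slicing idea described above (attribute names, bucket size, and the column grouping are all illustrative, and real slicing additionally enforces ℓ-diversity within each bucket):

```python
import random

def slice_table(rows, column_groups, bucket_size, seed=0):
    """Toy slicing: vertically partition attributes into columns, horizontally
    partition tuples into buckets, then independently permute each column
    within a bucket to break exact cross-column linkage."""
    rng = random.Random(seed)
    sliced = []
    for i in range(0, len(rows), bucket_size):
        bucket = rows[i:i + bucket_size]
        out = {}
        for name, attrs in column_groups.items():
            col = [tuple(r[a] for a in attrs) for r in bucket]
            rng.shuffle(col)  # breaks the QI-to-sensitive linkage
            out[name] = col
        sliced.append(out)
    return sliced

rows = [
    {"age": 22, "zip": "47906", "disease": "flu"},
    {"age": 22, "zip": "47906", "disease": "dyspepsia"},
    {"age": 33, "zip": "47905", "disease": "flu"},
    {"age": 52, "zip": "47905", "disease": "bronchitis"},
]
groups = {"QI": ["age", "zip"], "S": ["disease"]}
for bucket in slice_table(rows, groups, bucket_size=2):
    print(bucket)
```

Within a bucket, the independent shuffle of each column hides which quasi-identifier row goes with which sensitive value while keeping their co-occurrence statistics for analysis.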
TUPLE VALUE BASED MULTIPLICATIVE DATA PERTURBATION APPROACH TO PRESERVE PRIVA...IJDKP
Huge volumes of data from domain-specific applications such as medical, financial, library, telephone,
shopping and individual records are regularly generated. Sharing these data has proved to be beneficial
for data mining applications. On one hand, such data is an important asset for business decision making when
analyzed. On the other hand, data privacy concerns may prevent data owners from sharing information
for data analysis. In order to share data while preserving privacy, the data owner must come up with a solution
which achieves the dual goal of privacy preservation as well as accuracy of the data mining tasks of
clustering and classification. An efficient and effective approach is proposed that aims to protect the
privacy of sensitive information while obtaining data clusters with minimum information loss.
Data Leakage Detection and Security Using Cloud ComputingIJERA Editor
The data owner stores the data in the cloud. Every user must register in the cloud, and the cloud provider
must verify that a user is authorized. If someone else tries to access the account, data may get leaked; such
leaked data may then appear in an unauthorized place (e.g., on the internet or on someone's laptop). In this
paper, we propose Division and Replication of Data in the Cloud for Optimal Performance and Security (DROPS),
which collectively addresses the security and performance issues. In the DROPS methodology, we select a file
and store it in the cloud account. To provide security, the file is divided into fragments based on a
threshold value, and each fragment is stored on a node chosen using T-coloring. After the placement of
fragments on nodes, each fragment is replicated once in the cloud.
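A minimal sketch of the fragmentation and placement steps above (the threshold, node names, and the fixed-spacing rule standing in for T-coloring are all illustrative):

```python
def fragment(data: bytes, threshold: int):
    """Split a file into fragments no larger than `threshold` bytes, so no
    single node ever holds a meaningful portion of the file."""
    return [data[i:i + threshold] for i in range(0, len(data), threshold)]

def place(fragments, nodes, gap=2):
    """Toy T-coloring-style placement: successive fragments go to nodes
    `gap` positions apart, so adjacent fragments land on different nodes."""
    return {k: nodes[(k * gap) % len(nodes)] for k in range(len(fragments))}

data = b"confidential report contents"
frags = fragment(data, 8)
print(len(frags), place(frags, ["n1", "n2", "n3", "n4", "n5"]))
```

Real T-coloring assigns fragments so that the graph distance between any two fragment-holding nodes avoids a forbidden set; the modular skip here only mimics that separation.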
This document summarizes a research paper on developing an improved LEACH (Low-Energy Adaptive Clustering Hierarchy) communication protocol for energy efficient data mining in multi-feature sensor networks. It begins with background on wireless sensor networks and issues like energy efficiency. It then discusses the existing LEACH protocol and its drawbacks. The proposed improved LEACH protocol includes cluster heads, sub-cluster heads, and cluster nodes to address LEACH's limitations. This new version aims to minimize energy consumption during cluster formation and data aggregation in multi-feature sensor networks.
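The stochastic cluster-head election at the core of LEACH can be sketched as follows (the improved protocol's sub-cluster heads are not shown, and the parameter values are illustrative):

```python
import random

def leach_threshold(p, r):
    """Standard LEACH cluster-head election threshold for round r:
    T(n) = P / (1 - P * (r mod (1/P))), applied to nodes that have not
    been heads in the current epoch."""
    return p / (1 - p * (r % round(1 / p)))

def elect_heads(nodes, p, r, seed=0):
    """Each eligible node independently becomes a cluster head with
    probability T(n)."""
    rng = random.Random(seed)
    t = leach_threshold(p, r)
    return [n for n in nodes if rng.random() < t]

print(round(leach_threshold(0.05, 0), 3), round(leach_threshold(0.05, 19), 3))
# 0.05 1.0 -- the threshold rises to 1 by the last round of the epoch
```

The rising threshold is what rotates the energy-hungry head role evenly across nodes; the improved protocol layers sub-cluster heads on top of this election to further spread the aggregation load.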
A survey on location based serach using spatial inverted index methodeSAT Journals
Abstract: Conventional spatial queries, such as nearest neighbor retrieval and range search, involve conditions only on objects' geometric properties. Many modern applications, however, call for queries that aim to find objects satisfying both spatial constraints and conditions on their associated text. For example, instead of considering all hotels, a nearest neighbor query might ask for the hotel closest to the user among those offering services such as a pool and internet access. For such queries, a variant of the inverted index is employed that is effective for multidimensional points: an R-tree is built on each inverted list, and a minimum-bounding method is used to answer nearest neighbor queries with keywords in real time. Keywords: spatial database, nearest neighbor search, spatial index, keyword search.
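A linear-scan toy of such a spatial-keyword nearest neighbor query (the index structures in the paper exist precisely to avoid this full scan; the hotel data is made up):

```python
from math import dist

def keyword_nn(query_pt, keywords, objects):
    """Toy spatial-keyword nearest neighbor: keep only objects whose text
    contains all query keywords, then return the spatially closest one."""
    required = set(keywords)
    candidates = [o for o in objects if required <= set(o["tags"])]
    return min(candidates, key=lambda o: dist(query_pt, o["loc"]), default=None)

hotels = [
    {"name": "A", "loc": (1, 1), "tags": {"pool"}},
    {"name": "B", "loc": (2, 2), "tags": {"pool", "internet"}},
    {"name": "C", "loc": (0.5, 0.5), "tags": {"internet"}},
]
print(keyword_nn((0, 0), ["pool", "internet"], hotels)["name"])  # B
```

Hotel C is spatially closest but fails the keyword filter, which is exactly why a combined spatial-plus-text index outperforms running the two filters separately.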
This document summarizes a research paper that proposes a new density-based clustering technique called Triangle-Density Based Clustering Technique (TDCT) to efficiently cluster large spatial datasets. TDCT uses a polygon approach where the number of data points inside each triangle of a polygon is calculated to determine triangle densities. Triangle densities are used to identify clusters based on a density confidence threshold. The technique aims to identify clusters of arbitrary shapes and densities while minimizing computational costs. Experimental results demonstrate the technique's superiority in terms of cluster quality and complexity compared to other density-based clustering algorithms.
This document summarizes a research paper that proposes the design and implementation of an intelligent laser warning system using fuzzy logic. The system uses four laser sensors to detect the incident laser angle from 0 to 360 degrees and an additional sensor to distinguish the laser from background sunlight. A fuzzy logic algorithm is used to fuse the sensor data and estimate the angle of incidence. The system is first simulated in MATLAB and then implemented using a TI-430 microcontroller. The goal is to develop a low-cost laser detection system that can accurately detect laser threats and distinguish them from other light sources like the sun.
The document proposes a Modified Pure Radix Sort algorithm for large heterogeneous datasets. The algorithm divides the data into numeric and string processes that work simultaneously. The numeric process further divides data into sublists by element length and sorts them simultaneously using an even/odd logic across digits. The string process identifies common patterns to convert strings to numbers that are then sorted. This optimizes problems with traditional radix sort through a distributed computing approach.
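A plain LSD radix sort plus the numeric/string split can be sketched as follows (the paper's simultaneous sublist processing and its string-to-number conversion are not reproduced here):

```python
def radix_sort(nums):
    """Plain LSD radix sort for non-negative integers (base 10)."""
    exp = 1
    while any(n // exp for n in nums):
        buckets = [[] for _ in range(10)]
        for n in nums:
            buckets[n // exp % 10].append(n)  # stable partition by digit
        nums = [n for b in buckets for n in b]
        exp *= 10
    return nums

def sort_heterogeneous(items):
    """Toy version of the paper's split: route numbers and strings to
    separate passes (the paper runs these concurrently) and sort each."""
    nums = [x for x in items if isinstance(x, int)]
    strs = [x for x in items if isinstance(x, str)]
    return radix_sort(nums), sorted(strs)

print(sort_heterogeneous([170, "pear", 45, 802, "apple", 2]))
```

The split matters because radix sort's digit buckets only make sense for fixed-radix keys; strings need either a separate pass, as here, or the paper's pattern-based conversion to numbers.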
Current Issue - January 2022, Volume 14, Number 1 - International Journal of ...IJCNCJournal
The International Journal of Computer Networks & Communications (IJCNC) is a bimonthly open access peer-reviewed journal that publishes articles which contribute new results in all areas of Computer Networks & Communications. The journal focuses on all technical and practical aspects of Computer Networks & data Communications. The goal of this journal is to bring together researchers and practitioners from academia and industry to focus on advanced networking concepts and establishing new collaborations in these areas.
A Reliable Routing Technique for Wireless Sensor NetworksEditor IJCATR
A Wireless Sensor Network (WSN) consists of a very large number of sensor nodes which are deployed close to the area
to be monitored so as to sense various environmental conditions. A WSN is a data-driven network which produces a large
amount of data, and sensor nodes are energy-limited devices whose energy consumption is mainly associated with data
routing. It is therefore necessary to aggregate redundant data so as to save energy. In this work, data aggregation is
achieved with the help of two key approaches, namely clustering and in-network data aggregation. These two approaches
help to save energy and thereby increase the lifetime of the network. The proposed work has key features such as
reliable cluster formation, a high data aggregation rate, packet prioritization, minimized overhead, multiple routes,
and reduced energy consumption, which enhance the network lifetime. The performance evaluation of the proposed
approach is carried out using Network Simulator version 2 (NS-2).
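The in-network aggregation step can be illustrated by the single summary packet a cluster head would forward in place of raw member readings (field names and values are illustrative):

```python
def aggregate_at_head(readings):
    """Toy in-network aggregation: a cluster head forwards one summary
    packet (count, min, max, mean) instead of every member reading."""
    n = len(readings)
    return {"count": n, "min": min(readings), "max": max(readings),
            "mean": sum(readings) / n}

# Four member temperature readings collapse into one packet toward the sink
print(aggregate_at_head([21.5, 22.0, 21.8, 35.0]))
```

One transmission instead of four is where the energy saving comes from, since radio transmission dominates a sensor node's energy budget.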
IRJET- Swift Retrieval of DNA Databases by Aggregating QueriesIRJET Journal
This document summarizes a research paper that proposes a new method for securely sharing and querying genomic DNA sequences stored in the cloud without violating privacy. The method builds on existing frameworks by offering deterministic results with zero error probability, and a scheme that is twice as fast but uses twice the storage space, which is preferable given cloud storage pricing. The encoding of the data supports a richer set of query types beyond exact matching, including counting matches, logical OR matches, handling ambiguities, threshold queries, and concealing results from the decrypting server. Linear and logistic regression algorithms are used to analyze the data. The literature review discusses previous work on securely sharing genomic data and transforming protocols to ensure accountability without compromising privacy.
A Secure and Dynamic Multi-keyword Ranked Search Scheme over Encrypted Cloud ...1crore projects
IEEE PROJECTS 2015
1 Crore Projects is a leading provider of guidance for IEEE and real-time projects.
It has provided guidance to thousands of students and helped them benefit across all of its technology training.
Dot Net
DOTNET Project Domain list 2015
1. IEEE based on datamining and knowledge engineering
2. IEEE based on mobile computing
3. IEEE based on networking
4. IEEE based on Image processing
5. IEEE based on Multimedia
6. IEEE based on Network security
7. IEEE based on parallel and distributed systems
Java Project Domain list 2015
1. IEEE based on datamining and knowledge engineering
2. IEEE based on mobile computing
3. IEEE based on networking
4. IEEE based on Image processing
5. IEEE based on Multimedia
6. IEEE based on Network security
7. IEEE based on parallel and distributed systems
ECE IEEE Projects 2015
1. Matlab project
2. Ns2 project
3. Embedded project
4. Robotics project
Eligibility
Final Year students of
1. BSc (C.S)
2. BCA/B.E(C.S)
3. B.Tech IT
4. BE (C.S)
5. MSc (C.S)
6. MSc (IT)
7. MCA
8. MS (IT)
9. ME(ALL)
10. BE(ECE)(EEE)(E&I)
TECHNOLOGY USED AND FOR TRAINING IN
1. DOT NET
2. C sharp
3. ASP
4. VB
5. SQL SERVER
6. JAVA
7. J2EE
8. STRUTS
9. ORACLE
10. VB dotNET
11. EMBEDDED
12. MAT LAB
13. LAB VIEW
14. Multi Sim
CONTACT US
1 CRORE PROJECTS
Door No: 214/215,2nd Floor,
No. 172, Raahat Plaza, (Shopping Mall) ,Arcot Road, Vadapalani, Chennai,
Tamil Nadu, INDIA - 600 026
Email id: 1croreprojects@gmail.com
Website: 1croreprojects.com
Phone : +91 97518 00789 / +91 72999 51536
Privacy Preserving Reputation Calculation in P2P Systems with Homomorphic Enc...IJCNCJournal
This document discusses a method for privacy-preserving reputation calculation in peer-to-peer systems using homomorphic encryption. Specifically, it proposes:
1) Extending the EigenTrust reputation system to calculate node reputations in a distributed manner while preserving evaluator privacy. It does this by successively updating encrypted reputation values so that they reflect trust values without disclosing the original values.
2) Improving calculation efficiency by offloading parts of the task to participating nodes and using different public keys during calculation to improve robustness against node churn.
3) Evaluating the performance of the proposed method, finding it reduces maximum circulation time for aggregating multiplication results by half, reducing computation time per round. The privacy preservation cost scales
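The additive-homomorphic primitive underlying such schemes can be illustrated with textbook Paillier encryption (toy primes, NOT secure, and only the primitive itself, not the paper's EigenTrust extension):

```python
import math
import random

def paillier_keygen(p=293, q=433):
    """Textbook Paillier key generation with toy primes (illustration only)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1  # standard simple choice of generator
    # mu = L(g^lam mod n^2)^-1 mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def paillier_encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def paillier_decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return (pow(c, lam, n * n) - 1) // n * mu % n

pub, priv = paillier_keygen()
c = paillier_encrypt(pub, 40) * paillier_encrypt(pub, 2) % (pub[0] ** 2)
print(paillier_decrypt(pub, priv, c))  # 42: E(40)*E(2) decrypts to 40+2
```

Multiplying two ciphertexts modulo n² yields an encryption of the sum of the plaintexts, which is what lets encrypted reputation values be aggregated by untrusted nodes without ever being decrypted.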
Book of abstract volume 8 no 9 ijcsis december 2010Oladokun Sulaiman
The International Journal of Computer Science and Information Security (IJCSIS) is a publication venue for novel research in computer science and information security. This issue from December 2010 contains 5 research papers. The first paper proposes a 128-bit chaotic hash function that uses the logistic map and MD5/SHA-1 hashes. The second paper discusses constructing an ontology for representing human emotions in videos to improve video retrieval. The third paper proposes an intelligent memory controller for H.264 encoders to reduce external memory access. The fourth paper investigates the impact of fragmentation on query performance in distributed databases. The fifth paper examines the effect of guard intervals in a proposed MIMO-OFDM system for wireless communication.
Building Programming Abstractions for Wireless Sensor Networks Using Watershe...M H
The availability and quality of information extracted from Wireless Sensor Networks (WSNs) revolutionised a wide range of application areas. The success of any WSN application is, nonetheless, determined by the ability to retrieve information with the required level of accuracy, within specified time constraints, and with minimum resource utilisation. This paper presents a new approach to localised information extraction that utilises the Watershed segmentation algorithm to dynamically group nodes into segments, which can be used as programming abstractions upon which different query operations can be performed. Watershed results in a set of well delimited areas, such that the number of necessary operations (communication and computation) to answer a query are minimised. This paper presents a fully asynchronous Watershed implementation, where nodes can compute their local data in parallel and independently from one another. The preliminary experimental results demonstrate that the proposed approach is able to significantly reduce the query processing cost and time without involving any loss of efficiency.
Smart Spaces and Next Generation Wired/Wireless Networking
Comparative studies on flotation of kasolite using cationic and anionic surfa...eSAT Publishing House
A quantitative risk assessment approach in an integrated cold chain system en...eSAT Publishing House
Comparative study of one and two diode model of solar photovoltaic celleSAT Publishing House
1. The document compares one diode and two diode models for modeling solar photovoltaic cells. The one diode model assumes two parameters (solar radiation and temperature) but neglects recombination losses, limiting its accuracy under low irradiance conditions.
2. The two diode model accounts for recombination losses using two saturation currents but requires solving more equations, increasing computation time. Different modeling methods like particle swarm optimization require many input parameters and data, making them lengthy.
3. The paper will describe the mathematical equations for one diode and two diode models and use an iterative technique to determine the initial parameter values and compare the performance of the two models.
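The one-diode model's implicit I-V equation can be sketched with a damped fixed-point iteration (all parameter values here are hypothetical, roughly typical of a silicon cell at 25 °C):

```python
from math import exp

def single_diode_current(v, iph, i0, rs, rsh, a_vt, iters=100):
    """Solve the implicit one-diode equation
        I = Iph - I0*(exp((V + I*Rs)/(a*Vt)) - 1) - (V + I*Rs)/Rsh
    by damped fixed-point iteration."""
    i = iph  # start from the photocurrent
    for _ in range(iters):
        i_new = iph - i0 * (exp((v + i * rs) / a_vt) - 1) - (v + i * rs) / rsh
        i = 0.5 * i + 0.5 * i_new  # damping for stability
    return i

# Hypothetical cell: Iph=8 A, I0=1 nA, Rs=5 mOhm, Rsh=100 Ohm, a*Vt=34.7 mV
i = single_diode_current(v=0.5, iph=8.0, i0=1e-9, rs=0.005, rsh=100.0,
                         a_vt=0.0347)
print(round(i, 3))
```

The two-diode model would add a second `i0_2 * (exp(... / (2 * vt)) - 1)` term for recombination losses, which is what improves its low-irradiance accuracy at the cost of more parameters to iterate over.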
This document describes the design and analysis of a planar positioning stage based on a redundantly actuated parallel linkage with six degrees of freedom (three translations and three rotations). The kinematics and workspace analysis of the linkage are presented. A static analysis method to calculate the actuator torques required for a given end-effector force and trajectory is also described. MATLAB programs were developed to analyze the workspace and perform the static analysis. The results show that the redundant actuation can help improve the workspace characteristics and prevent singular configurations compared to non-redundant parallel manipulators. The stage design has potential applications in micro-positioning.
This document discusses various privacy preservation techniques in data mining. It summarizes classification, clustering, and association rule learning as common privacy preservation approaches. For classification, it describes decision trees, k-nearest neighbors, artificial neural networks, support vector machines, and naive Bayes models. It provides advantages and disadvantages of these techniques. The document concludes that privacy preservation techniques have emerged to allow for efficient and effective data mining while protecting sensitive data.
The document proposes a method to secure data from theft in cloud computing using fog computing. It involves combining user behavior profiling to detect abnormal access and deploying decoy documents. Decoy documents that look like real data are used to confuse attackers if abnormal access is detected. By making it hard to distinguish real data from fakes, this integrated approach aims to increase security of personal and business data stored in the cloud.
Anomaly detection in the services provided by multi cloud architectures a surveyeSAT Publishing House
This document summarizes various anomaly detection techniques that can be used in multi-cloud architectures. It discusses statistical, data mining, and machine learning based techniques. A table compares 11 different anomaly detection models or frameworks, outlining their advantages and disadvantages. The document concludes that combining multiple techniques may generate better results for anomaly detection in clouds. Future work could optimize existing techniques or use unsupervised "black box" approaches without human intervention.
Ambiences on the-fly usage of available resources through personal devicesijasuc
In smart spaces such as smart homes, computation is embedded everywhere: in toys, appliances, or the home's infrastructure. Most of these devices provide a pool of available resources which the user can take advantage of, interacting with them and creating a friendly environment. The inherent composability of these systems and other unique characteristics such as low-cost energy, simplicity in module programming, and even their small size make them suitable candidates for dynamic and adaptive ambient systems. This research work focuses on what is defined as an "ambience", a space with a user-defined set of computational devices. A smart home is modeled as a collection of ambiences, where every ambience is capable of providing a pool of available resources to the user. In turn, the user is supposed to carry one or several personal devices able to interact with the ambiences, taking advantage of his inherent mobility. In this way, the whole system can benefit from resources discovered in the spatial proximity. A software architecture is designed, based on the implementation of low-cost algorithms able to detect and update the system when changes in an ambience occur. The ambience middleware implementation works on a wide range of architectures and OSs, while showing negligible overhead in the time to perform the basic output operations.
SECURE CLOUD COMPUTING MECHANISM FOR ENHANCING: MTBAC (ijistjournal)
With the development of the cloud system, a large number of vendors can serve their users on the same platform, directing their focus to the software rather than the underlying framework. This necessarily requires the distribution, storage, and analysis of data on the cloud. With users accessing virtualized and scalable web services and the broad application of the cloud, data security and access control become major concerns. Access to the cloud requires authorization as well as data-accessibility permission. The verification and updating of data and of data-accessibility permissions must be done with proper knowledge, which requires identifying correct updates and blacklisting users who intrude on the cloud by introducing false data into the system. In this paper we propose an approach that builds a mutual trust relationship between users and the cloud as an access-control method in the cloud computing environment, focusing on system integrity and security. The proposed approach is executed as a procedure and includes several steps to establish the user’s credibility in the cloud network.
A Survey of File Replication Techniques in Grid Systems (Editor IJCATR)
A grid is a type of parallel and distributed system designed to provide reliable access to data and computational resources in wide area networks. These resources are distributed across different geographical locations. Efficient data sharing in global networks is complicated by erratic node failure, unreliable network connectivity and limited bandwidth. Replication is a technique used in grid systems to improve applications’ response time and to reduce bandwidth consumption. In this paper, we present a survey of basic and new replication techniques that have been proposed by other researchers, followed by a full comparative study of these replication strategies.
This document provides a survey of file replication techniques used in grid systems. It begins with an introduction to grid systems and discusses their use of replication to improve response times and reduce bandwidth consumption. It then categorizes replication techniques as static or dynamic and describes challenges of replication including maintaining consistency and overhead. The document surveys various replication strategies for different grid topologies like peer-to-peer, tree and hybrid. It evaluates strategies based on factors like access latency, bandwidth consumption and fault tolerance. Specific replication techniques are discussed for peer-to-peer architectures aimed at availability, placement strategies and balancing workloads.
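As a toy illustration of the dynamic strategies surveyed here, the sketch below (not drawn from any of the surveyed papers; the function name and capacity parameter are assumptions) greedily replicates the most frequently accessed files first:

```python
def place_replicas(access_counts, capacity):
    """Greedy dynamic-replication sketch: choose the most frequently
    accessed files for replication, up to the site's replica capacity."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    return ranked[:capacity]

# A site with room for two replicas picks the two hottest files.
hot = place_replicas({"a.dat": 120, "b.dat": 45, "c.dat": 300}, capacity=2)
# hot == ["c.dat", "a.dat"]
```

Real strategies also weigh access latency, bandwidth consumption and fault tolerance, as the survey notes; this sketch captures only the access-frequency heuristic.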
Study on security and quality of service implementations in p2p overlay netw... (eSAT Publishing House)
This document describes a proposed multi-agent system for searching distributed data. The system uses three types of agents - coordinator agents, search agents, and local agents. Coordinator agents coordinate the retrieval process by creating search agents and collecting results. Search agents carry queries to nodes containing relevant databases. Local agents reside at nodes with databases, accept queries from search agents, search the databases for answers, and return results to the search agents. The system aims to retrieve data from distributed databases with minimum network bandwidth consumption using this multi-agent approach.
Potato Leaf Disease Detection Using Machine Learning (IRJET Journal)
This document discusses a study on detecting potato leaf diseases using machine learning techniques. The researchers collected a dataset of potato leaf images from Kaggle containing healthy leaves and leaves affected by early and late blight diseases. They performed preprocessing including data augmentation to increase the dataset size. A convolutional neural network model was trained on the images to extract features and classify leaves as healthy or diseased, achieving an accuracy of 97.71%. The CNN model outperformed traditional machine learning classifiers. The researchers concluded machine learning is an effective approach for automated disease detection to improve agricultural production through early identification.
Activity Context Modeling in Context-Aware (Editor IJCATR)
The explosion of mobile devices has fuelled the advancement of pervasive computing to provide personal assistance in this information-driven world. Pervasive computing takes advantage of context-aware computing to track, use and adapt to contextual information. The context that has attracted the attention of many researchers is the activity context. There are six major techniques used to model activity context: key-value, logic-based, ontology-based, object-oriented, mark-up schemes and graphical. This paper analyses these techniques in detail by describing how each technique is implemented while reviewing their pros and cons. The paper ends with a hybrid modeling method that fits heterogeneous environments while considering the entire modeling process through the data acquisition and utilization stages. The modeling stages of activity context are data sensation, data abstraction, and reasoning and planning. The work revealed that mark-up schemes and object-oriented techniques are best applicable at the data sensation stage. Key-value and object-oriented techniques fairly support the data abstraction stage, whereas the logic-based and ontology-based techniques are the ideal techniques for the reasoning and planning stage. In a distributed system, mark-up schemes are very useful in data communication over a network, and the graphical technique should be used when saving context data into a database.
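As a minimal illustration of the key-value technique described above (the class and method names are invented for this sketch, not taken from the paper):

```python
class ContextStore:
    """Key-value activity-context store: a minimal sketch of the
    key-value modeling technique (names are illustrative only)."""

    def __init__(self):
        self._ctx = {}

    def sense(self, key, value):
        # Data sensation stage: record a contextual reading under a key.
        self._ctx[key] = value

    def get(self, key, default=None):
        # Utilization stage: look a contextual value back up.
        return self._ctx.get(key, default)

store = ContextStore()
store.sense("location", "kitchen")
store.sense("activity", "cooking")
# store.get("activity") == "cooking"
```

The simplicity is also the weakness: flat key-value pairs carry no relationships between contexts, which is why the paper rates the technique only fair for the data abstraction stage.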
MAP/REDUCE DESIGN AND IMPLEMENTATION OF APRIORI ALGORITHM FOR HANDLING VOLUMIN... (acijjournal)
Apriori is one of the key algorithms to generate frequent itemsets. Analysing frequent itemsets is a crucial step in analysing structured data and in finding association relationships between items. This stands as an elementary foundation for supervised learning, which encompasses classifier and feature extraction methods. Applying this algorithm is crucial to understanding the behaviour of structured data. Most of the structured data in the scientific domain are voluminous, and processing such data requires state-of-the-art computing machines. Setting up such an infrastructure is expensive, hence a distributed environment such as a clustered setup is employed for tackling such scenarios. Apache Hadoop is one of the cluster frameworks for distributed environments that helps by distributing voluminous data across a number of nodes in the framework. This paper focuses on the map/reduce design and implementation of the Apriori algorithm for structured data analysis.
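The map/reduce split described above can be sketched in plain Python. This illustrates one Apriori counting pass, not the paper's Hadoop implementation; the function names and sample transactions are assumptions:

```python
from collections import Counter
from itertools import combinations

def map_phase(transactions, k):
    """Mapper sketch: emit an (itemset, 1) pair for every
    k-itemset found in each transaction."""
    for t in transactions:
        for itemset in combinations(sorted(t), k):
            yield itemset, 1

def reduce_phase(pairs, min_support):
    """Reducer sketch: sum the counts per itemset and keep
    only itemsets meeting the minimum support threshold."""
    counts = Counter()
    for itemset, n in pairs:
        counts[itemset] += n
    return {s: c for s, c in counts.items() if c >= min_support}

transactions = [{"bread", "milk"}, {"bread", "butter"}, {"bread", "milk", "butter"}]
frequent = reduce_phase(map_phase(transactions, k=2), min_support=2)
# frequent == {("bread", "milk"): 2, ("bread", "butter"): 2}
```

In Hadoop, the shuffle between the two phases groups pairs by itemset across nodes, which is what lets the counting scale to voluminous data.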
Grid resource discovery: a survey and comparative analysis 2 (IAEME Publication)
This document provides a survey and comparative analysis of different approaches to grid resource discovery. It begins with introducing computational grids and the need for effective resource discovery given factors like heterogeneous and dynamic resources. It then outlines common components of resource discovery systems and environmental factors that impact design. The document surveys several prominent resource discovery approaches, including decentralized, agent-based, routing/transferring model-based, and ontology description-based. It also discusses specific algorithms and models for resource discovery, covering aspects like message complexity. In concluding, the document performs a comparative analysis of the various resource discovery approaches.
IRJET- An Efficient Energy Consumption Minimizing Based on Genetic and Power ... (IRJET Journal)
This document discusses techniques for minimizing energy consumption in cloud computing. It proposes using a genetic algorithm-based power aware scheduling (G-PARS) method along with a Dynamic Single Threshold (DST) virtual machine consolidation approach to dynamically reallocate VMs and reduce the number of active physical nodes. The paper also reviews related work on resource optimization algorithms like genetic algorithms, ant colony optimization, and particle swarm optimization. It finds that using DST and G-PARS can minimize power consumption compared to other existing algorithms under different workload conditions.
This document presents a framework for reusing existing software agents through ontological engineering. The framework includes components like a user interface agent, query processor, mapping agent, transfer agent, wrapper agent, and remote agents containing ontologies. The query processor reformulates the user's query, the mapping agent identifies relevant ontologies, and the transfer agent sends the query to remote agents. The remote agents provide ontologies as output, which are then integrated/merged and presented back to the user interface agent. The goal is to enable reuse of heterogeneous agents across different development environments through a standardized ontology representation.
Information extraction from sensor networks using the Watershed transform alg... (M H)
Wireless sensor networks are an effective tool to provide fine resolution monitoring of the physical environment. Sensors generate continuous streams of data, which leads to several computational challenges. As sensor nodes become increasingly active devices, with more processing and communication resources, various methods of distributed data processing and sharing become feasible. The challenge is to extract information from the gathered sensory data with a specified level of accuracy in a timely and power-efficient approach. This paper presents a new solution to distributed information extraction that makes use of the morphological Watershed algorithm. The Watershed algorithm dynamically groups sensor nodes into homogeneous network segments with respect to their topological relationships and their sensing-states. This setting allows network programmers to manipulate groups of spatially distributed data streams instead of individual nodes. This is achieved by using network segments as programming abstractions on which various query processes can be executed. Aiming at this purpose, we present a reformulation of the global Watershed algorithm. The modified Watershed algorithm is fully asynchronous, where sensor nodes can autonomously process their local data in parallel and in collaboration with neighbouring nodes. Experimental evaluation shows that the presented solution is able to considerably reduce query resolution cost without sacrificing the quality of the returned results. When compared to similar purpose schemes, such as “Logical Neighborhood”, the proposed approach reduces the total query resolution overhead by up to 57.5%, reduces the number of nodes involved in query resolution by up to 59%, and reduces the setup convergence time by up to 65.1%.
This document summarizes a paper that presents a novel method for passive resource discovery in cluster grid environments. The method monitors network packet frequency from nodes' network interface cards to identify nodes with available CPU cycles (<70% utilization) by detecting latency signatures from frequent context switching. Experiments on a 50-node testbed showed the method can consistently and accurately discover available resources by analyzing existing network traffic, including traffic passed through a switch. The paper also proposes algorithms for distributed two-level resource discovery, replication and utilization to optimize resource allocation and access costs in distributed computing environments.
An efficient approach on spatial big data related to wireless networks and it... (eSAT Journals)
Abstract
Spatial big data plays an important role in wireless network applications. Spatial and spatio-temporal problems have a distinct role in big data compared to common relational problems. To address them, we describe three applications of spatial big data. Each application imposes specific design requirements, and we develop our work on highly scalable parallel processing of spatial big data in the Hadoop framework using the MapReduce computational model. Our results show that Hadoop enables highly scalable implementations of algorithms for spatial data processing problems, although developing these implementations requires specialized knowledge and is not yet user-friendly.
Keywords: Spatial Big Data, Hadoop, Wireless Networks, MapReduce
Semantic Web concepts used in Web 3.0 applications (IRJET Journal)
This document discusses how semantic web concepts can be used in applications for Web 3.0. It provides examples of how semantic web could be applied in areas like medical sciences, search engines, and e-learning. Specifically, it describes how semantic web could help integrate medical data from different sources, enable more accurate medical diagnosis and treatment recommendations based on patient history. It also discusses how a semantic search engine like Swoogle works by tagging web pages with metadata to better understand context and return more relevant search results. Finally, it touches on how a semantic web architecture could enable more sophisticated e-learning systems by linking educational resources.
Resource Consideration in Internet of Things: A Perspective View (ijtsrd)
The ubiquitous computing and its applications at different levels of abstraction are possible mainly by virtualization. Most of its applications are becoming pervasive with each passing day and with the growing trend of embedding computational and networking capabilities in everyday objects of use by a common man. Virtualization provides many opportunities for research in IoT since most of the IoT applications are resource constrained. Therefore, there is a need for an approach that shall manage the resources of the IoT ecosystem. Virtualization is one such approach that can play an important role in maximizing resource utilization and managing the resources of IoT applications. This paper presents a survey of Virtualization and the Internet of Things. The paper also discusses the role of virtualization in IoT resource management. Rishikesh Sahani | Prof. Avinash Sharma, "Resource Consideration in Internet of Things: A Perspective View", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3 | Issue-4, June 2019, URL: https://www.ijtsrd.com/papers/ijtsrd23694.pdf
Paper URL: https://www.ijtsrd.com/computer-science/world-wide-web/23694/resource-consideration-in-internet-of-things-a-perspective-view/rishikesh-sahani
Hudhud cyclone caused extensive damage in Visakhapatnam, India in October 2014, especially to tree cover. This will likely impact the local environment in several ways: increased air pollution as trees absorb less; higher temperatures without tree canopy; increased erosion and landslides. It also created large amounts of waste from destroyed trees. Proper management of solid waste is needed to prevent disease spread. Suggested measures include restoring damaged plants, building fountains to reduce heat, mandating light-colored buildings, improving waste management, and educating public on health risks. Overall, changes are needed to water, land, and waste practices to rebuild the environment after the cyclone removed green cover.
Impact of flood disaster in a drought prone area – case study of alampur vill... (eSAT Publishing House)
1) In September-October 2009, unprecedented heavy rainfall and dam releases caused widespread flooding in Alampur village in Mahabub Nagar district, a historically drought-prone area.
2) The flood damaged or destroyed homes, buildings, infrastructure, crops, and documents. It displaced many residents and cut off the village.
3) The socioeconomic conditions and mud-based construction of homes in the village exacerbated the flood's impacts, making damage more severe and recovery more difficult.
The document summarizes the Hudhud cyclone that struck Visakhapatnam, India in October 2014. It describes the cyclone's formation, rapid intensification to winds of 175 km/h, and landfall near Visakhapatnam. The cyclone caused extensive damage estimated at over $1 billion and at least 109 deaths in India and Nepal. Infrastructure like buildings, bridges, and power lines were destroyed. Crops and fishing boats were also damaged. The document then discusses coping strategies and improvements needed to disaster management plans to better prepare for future cyclones.
Groundwater investigation using geophysical methods: a case study of pydibhim... (eSAT Publishing House)
This document summarizes the results of a geophysical investigation using vertical electrical sounding (VES) methods at 13 locations around an industrial area in India. The VES data was interpreted to generate geo-electric sections and pseudo-sections showing subsurface resistivity variations. Three main layers were typically identified - a high resistivity topsoil, a weathered middle layer, and a basement rock. Pseudo-sections revealed relatively more weathered areas in the northwest and southwest. Resistivity sections helped identify zones of possible high groundwater potential based on low resistivity anomalies sandwiched between more resistive layers. The study concluded the electrical resistivity method was useful for understanding subsurface geology and identifying areas prospective for groundwater exploration.
Flood related disasters concerned to urban flooding in bangalore, india (eSAT Publishing House)
1. The document discusses urban flooding in Bangalore, India. It describes how factors like heavy rainfall, population growth, and improper land use have contributed to increased flooding in the city.
2. Flooding events in 2013 are analyzed in detail. A November rainfall caused runoff six times higher than the drainage capacity, inundating low-lying residential areas.
3. Impacts of urban flooding include disrupted daily life, damaged infrastructure, and decreased economic activity in affected areas. The document calls for improved flood management strategies to better mitigate urban flooding risks in Bangalore.
Enhancing post disaster recovery by optimal infrastructure capacity building (eSAT Publishing House)
This document discusses enhancing post-disaster recovery through optimal infrastructure capacity building. It presents a model to minimize the cost of meeting demand using auxiliary capacities when disaster damages infrastructure. The model uses genetic algorithms to select optimal capacity combinations. The document reviews how infrastructure provides vital services supporting recovery activities and discusses classifying infrastructure into six types. When disaster reduces infrastructure services, a gap forms between community demands and available support, hindering recovery. The proposed research aims to identify this gap and optimize capacity selection to fill it cost-effectively.
Effect of lintel and lintel band on the global performance of reinforced conc... (eSAT Publishing House)
This document analyzes the effect of lintels and lintel bands on the seismic performance of reinforced concrete masonry infilled frames through non-linear static pushover analysis. Four frame models are considered: a frame with a full masonry infill wall; a frame with a central opening but no lintel/band; a frame with a lintel above the opening; and a frame with a lintel band above the opening. The results show that the full infill wall model has 27% higher stiffness and 32% higher strength than the model with just an opening. Models with lintels or lintel bands have slightly higher strength and stiffness than the model with just an opening. The document concludes lintels and lintel
Wind damage to trees in the gitam university campus at visakhapatnam by cyclo... (eSAT Publishing House)
1) A cyclone with wind speeds of 175-200 kph caused massive damage to the green cover of Gitam University campus in Visakhapatnam, India. Thousands of trees were uprooted or damaged.
2) A study assessed different types of damage to trees from the cyclone, including defoliation, salt spray damage, damage to stems/branches, and uprooting. Certain tree species were more vulnerable than others.
3) The results of the study can help in selecting more wind-resistant tree species for future planting and reducing damage from future storms.
Wind damage to buildings, infrastructure and landscape elements along the be... (eSAT Publishing House)
1) A visual study was conducted to assess wind damage from Cyclone Hudhud along the 27km Visakha-Bheemli Beach road in Visakhapatnam, India.
2) Residential and commercial buildings suffered extensive roof damage, while glass facades on hotels and restaurants were shattered. Infrastructure like electricity poles and bus shelters were destroyed.
3) Landscape elements faced damage, including collapsed trees that damaged pavements, and debris in parks. The cyclone wiped out over half the city's green cover and caused beach erosion around protected areas.
1) The document reviews factors that influence the shear strength of reinforced concrete deep beams, including compressive strength of concrete, percentage of tension reinforcement, vertical and horizontal web reinforcement, aggregate interlock, shear span-to-depth ratio, loading distribution, side cover, and beam depth.
2) It finds that compressive strength of concrete, tension reinforcement percentage, and web reinforcement all increase shear strength, while shear strength decreases as shear span-to-depth ratio increases.
3) The distribution and amount of vertical and horizontal web reinforcement also affects shear strength, but closely spaced stirrups do not necessarily enhance capacity or performance.
Role of voluntary teams of professional engineers in disaster management – ex... (eSAT Publishing House)
1) A team of 17 professional engineers from various disciplines called the "Griha Seva" team volunteered after the 2001 Gujarat earthquake to provide technical assistance.
2) The team conducted site visits, assessments, testing and recommended retrofitting strategies for damaged structures in Bhuj and Ahmedabad. They were able to fully assess and retrofit 20 buildings in Ahmedabad.
3) Factors observed that exacerbated the earthquake's impacts included unplanned construction, non-engineered buildings, improper prior retrofitting, and defective materials and workmanship. The professional engineers' technical expertise was crucial for effective post-disaster management.
This document discusses risk analysis and environmental hazard management. It begins by defining risk, hazard, and toxicity. It then outlines the steps involved in hazard identification, including HAZID, HAZOP, and HAZAN. The document presents a case study of a hypothetical gas collecting station, identifying potential accidents and hazards. It discusses quantitative and qualitative approaches to risk analysis, including calculating a fire and explosion index. The document concludes by discussing hazard management strategies like preventative measures, control measures, fire protection, relief operations, and the importance of training personnel on safety.
Review study on performance of seismically tested repaired shear walls (eSAT Publishing House)
This document summarizes research on the performance of reinforced concrete shear walls that have been repaired after damage. It begins with an introduction to shear walls and their failure modes. The literature review then discusses the behavior of original shear walls as well as different repair techniques tested by other researchers, including conventional repair with new concrete, jacketing with steel plates or concrete, and use of fiber reinforced polymers. The document focuses on evaluating the strength retention of shear walls after being repaired with various methods.
Monitoring and assessment of air quality with reference to dust particles (pm... (eSAT Publishing House)
This document summarizes a study on monitoring and assessing air quality with respect to dust particles (PM10 and PM2.5) in the urban environment of Visakhapatnam, India. Sampling was conducted in residential, commercial, and industrial areas from October 2013 to August 2014. The average PM2.5 and PM10 concentrations were within limits in residential areas but moderate to high in commercial and industrial areas. Exceedance factor levels indicated moderate pollution for residential areas and moderate to high pollution for commercial and industrial areas. There is a need for management measures like improved public transport and green spaces to combat particulate air pollution in the study areas.
Low cost wireless sensor networks and smartphone applications for disaster ma... (eSAT Publishing House)
This document describes a low-cost wireless sensor network and smartphone application system for disaster management. The system uses an Arduino-based wireless sensor network comprising nodes with various sensors to monitor the environment. The sensor data is transmitted to a central gateway and then to the cloud for analysis. A smartphone app connected to the cloud can detect disasters from the sensor data and send real-time alerts to users to help with early evacuation. The system aims to provide low-cost localized disaster detection and warnings to improve safety.
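The detection step in such a system can be as simple as threshold checks at the gateway; the sketch below is illustrative only (sensor names and threshold values are assumptions, not from the paper):

```python
# Hypothetical alert thresholds per sensor (not from the paper).
THRESHOLDS = {"temperature_c": 55.0, "water_level_cm": 120.0}

def check_readings(readings):
    """Return the sensors whose latest readings exceed their
    alert threshold, in the order the readings were given."""
    return [name for name, value in readings.items()
            if name in THRESHOLDS and value > THRESHOLDS[name]]

alerts = check_readings({"temperature_c": 62.0, "water_level_cm": 40.0})
# alerts == ["temperature_c"]
```

A real deployment would smooth readings over time before alerting, to avoid false alarms from sensor noise.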
Coastal zones – seismic vulnerability: an analysis from east coast of india (eSAT Publishing House)
This document summarizes an analysis of seismic vulnerability along the east coast of India. It discusses the geotectonic setting of the region as a passive continental margin and reports some moderate seismic activity from offshore in recent decades. While seismic stability cannot be assumed given events like the 2004 tsunami, no major earthquakes have been recorded along this coast historically. The document calls for further study of active faults, neotectonics, and implementation of improved seismic building codes to mitigate vulnerability.
Can fracture mechanics predict damage due to disaster of structures (eSAT Publishing House)
This document discusses how fracture mechanics can be used to better predict damage and failure of structures. It notes that current design codes are based on small-scale laboratory tests and do not account for size effects, which can lead to more brittle failures in larger structures. The document outlines how fracture mechanics considers factors like size effect, ductility, and minimum reinforcement that influence the strength and failure behavior of structures. It provides examples of how fracture mechanics has been applied to problems like evaluating shear strength in deep beams and investigating a failure of an oil platform structure. The document argues that fracture mechanics provides a more scientific basis for structural design compared to existing empirical code provisions.
This document discusses the assessment of seismic susceptibility of reinforced concrete (RC) buildings. It begins with an introduction to earthquakes and the importance of vulnerability assessment in mitigating earthquake risks and losses. It then describes modeling the nonlinear behavior of RC building elements and performing pushover analysis to evaluate building performance. The document outlines modeling RC frames and developing moment-curvature relationships. It also summarizes the results of pushover analyses on sample 2D and 3D RC frames with and without shear walls. The conclusions emphasize that pushover analysis effectively assesses building properties but has limitations, and that capacity spectrum method provides appropriate results for evaluating building response and retrofitting impact.
A geophysical insight of earthquake occurred on 21st may 2014 off paradip, b... (eSAT Publishing House)
1) A 6.0 magnitude earthquake occurred off the coast of Paradip, Odisha in the Bay of Bengal on May 21, 2014 at a depth of around 40 km.
2) Analysis of magnetic and bathymetric data from the area revealed the presence of major lineaments in NW-SE and NE-SW directions that may be responsible for seismic activity through stress release.
3) Movements along growth faults at the margins of large Bengal channels, due to large sediment loads, could also contribute to seismic events by triggering movements along the faults.
Effect of hudhud cyclone on the development of visakhapatnam as smart and gre... (eSAT Publishing House)
This document discusses the effects of Cyclone Hudhud on the development of Visakhapatnam as a smart and green city through a case study and preliminary surveys. The surveys found that 31% of participants had experienced cyclones, 9% floods, and 59% landslides previously in Visakhapatnam. Awareness of disaster alarming systems increased from 14% before the 2004 tsunami to 85% during Cyclone Hudhud, while awareness of disaster management systems increased from 50% before the tsunami to 94% during Hudhud. The surveys indicate that initiatives after the tsunami improved awareness and preparedness. Developing Visakhapatnam as a smart, green city should consider governance
Embedded machine learning-based road conditions and driving behavior monitoring (IJECEIAES)
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
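For reference, the reported precision and recall follow the standard confusion-matrix definitions; the sketch below uses hypothetical counts, since the paper's raw confusion-matrix counts are not given here:

```python
def precision_recall(tp, fp, fn):
    """Standard metrics: precision = TP/(TP+FP), recall = TP/(TP+FN)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical counts (not from the paper): 92 true positives,
# 8 false positives and 8 false negatives give 92% for both metrics.
p, r = precision_recall(tp=92, fp=8, fn=8)
# p == 0.92 and r == 0.92
```

High precision matters here because every positive triggers an aggressive-driving alert, so false positives directly erode driver trust in the system.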
Literature Review Basics and Understanding Reference Management.pptxDr Ramhari Poudyal
Three-day training on academic research focuses on analytical tools at United Technical College, supported by the University Grant Commission, Nepal. 24-26 May 2024
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...IJECEIAES
Climate change's impact on the planet forced the United Nations and governments to promote green energies and electric transportation. The deployments of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces the hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system including the required equation to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram which sets the priorities and requirements of the system is presented. The proposed approach allows setup to advance their power stability, especially during power outages. The presented information supports researchers and plant owners to complete the necessary analysis while promoting the deployment of clean energy. The result of a case study that represents a dairy milk farmer supports the theoretical works and highlights its advanced benefits to existing plants. The short return on investment of the proposed approach supports the paper's novelty approach for the sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line which enhances the safety of the electrical network
6th International Conference on Machine Learning & Applications (CMLA 2024)ClaraZara1
6th International Conference on Machine Learning & Applications (CMLA 2024) will provide an excellent international forum for sharing knowledge and results in theory, methodology and applications of on Machine Learning & Applications.
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressionsVictor Morales
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
International Conference on NLP, Artificial Intelligence, Machine Learning an...gerogepatton
International Conference on NLP, Artificial Intelligence, Machine Learning and Applications (NLAIM 2024) offers a premier global platform for exchanging insights and findings in the theory, methodology, and applications of NLP, Artificial Intelligence, Machine Learning, and their applications. The conference seeks substantial contributions across all key domains of NLP, Artificial Intelligence, Machine Learning, and their practical applications, aiming to foster both theoretical advancements and real-world implementations. With a focus on facilitating collaboration between researchers and practitioners from academia and industry, the conference serves as a nexus for sharing the latest developments in the field.
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODELgerogepatton
As digital technology becomes more deeply embedded in power systems, protecting the communication
networks of Smart Grids (SG) has emerged as a critical concern. Distributed Network Protocol 3 (DNP3)
represents a multi-tiered application layer protocol extensively utilized in Supervisory Control and Data
Acquisition (SCADA)-based smart grids to facilitate real-time data gathering and control functionalities.
Robust Intrusion Detection Systems (IDS) are necessary for early threat detection and mitigation because
of the interconnection of these networks, which makes them vulnerable to a variety of cyberattacks. To
solve this issue, this paper develops a hybrid Deep Learning (DL) model specifically designed for intrusion
detection in smart grids. The proposed approach is a combination of the Convolutional Neural Network
(CNN) and the Long-Short-Term Memory algorithms (LSTM). We employed a recent intrusion detection
dataset (DNP3), which focuses on unauthorized commands and Denial of Service (DoS) cyberattacks, to
train and test our model. The results of our experiments show that our CNN-LSTM method is much better
at finding smart grid intrusions than other deep learning algorithms used for classification. In addition,
our proposed approach improves accuracy, precision, recall, and F1 score, achieving a high detection
accuracy rate of 99.50%.
Understanding Inductive Bias in Machine LearningSUTEJAS
This presentation explores the concept of inductive bias in machine learning. It explains how algorithms come with built-in assumptions and preferences that guide the learning process. You'll learn about the different types of inductive bias and how they can impact the performance and generalizability of machine learning models.
The presentation also covers the positive and negative aspects of inductive bias, along with strategies for mitigating potential drawbacks. We'll explore examples of how bias manifests in algorithms like neural networks and decision trees.
By understanding inductive bias, you can gain valuable insights into how machine learning models work and make informed decisions when building and deploying them.
1. IJRET: International Journal of Research in Engineering and Technology ISSN: 2319-1163
__________________________________________________________________________________________
Volume: 02 Issue: 02 | Feb-2013, Available @ http://www.ijret.org 196
MAPPING OF GENES USING CLOUD TECHNOLOGIES
Subhendu Bhusan Rou1, Sarojananda Mishra2, Bhabani Sankar Prasad Mishra3
1 Dept. of Computer Science Engineering & Application, IGIT Sarang, Odisha, India, subhendu.as@gmail.com
2 Dept. of Computer Science Engineering & Application, IGIT Sarang, Odisha, India, sarose.mishra@gmail.com
3 School of Computer Engineering, KIIT University, Bhubaneswar, Odisha, India, bspmishrafcs@kiit.ac.in
Abstract
Bioinformatics is a very active topic among recent researchers. It encompasses various tasks, such as alignment and comparison of DNA and RNA, gene mapping on chromosomes, protein structure prediction, and gene finding from DNA sequences. Gene mapping is the procedure of calculating the distance between genes on chromosomes. In real-world applications, medical researchers process huge amounts of data that may come from different clusters or different locations, and the mapping results may differ widely from one another. Processing such huge amounts of data requires a platform that serves without fail. Cloud computing is a technique well suited to comparing, distinguishing, or simply mapping genes in chromosomes across both varied and very large data sets. Within cloud computing, Apache Hadoop provides a good platform for processing huge amounts of data; it already has similar applications at Facebook and Yahoo, and it carries properties like fault tolerance that can be very useful in securing the data. In this paper we discuss the application of cloud technologies to gene mapping in chromosomes and, as a real-world application, how Apache Hadoop can be applied for this purpose.
Index Terms: Bioinformatics, Cloud computing, Gene mapping, Protein structure prediction, Apache Hadoop, Chromosome.
-----------------------------------------------------------------------***----------------------------------------------------------------------
1. INTRODUCTION
Cloud computing is internet-based computing of shared resources, databases, software, etc. These services are generally provided over the Internet on an on-demand basis, like an electricity grid. One can access any of the resources that live in the cloud across the Internet without worrying about computing capacity, bandwidth, storage, security, or reliability. The advantages of cloud computing over traditional computing include agility, lower entry cost, device independence, location independence, and scalability. As the popularity of the Internet grows day by day, many applications from different parts of the world can be combined through the Internet to process, gather, or integrate data for a particular piece of research. Developing a drug needs large-scale research on different chromosomes, DNA, RNA, etc. Processing that huge amount of data requires a good platform that will serve dedicatedly for the simulation of data or simply for research purposes. The platform should be unbiased and error-free; cloud computing is a good technology for this purpose.
Research on genes is a major project all over the world for scientists as well as researchers. Gene mapping is the process of identifying the location of genes on chromosomes, so there has to be a way to find the specific location of genes on each individual chromosome. There are three ways in which chromosomes are mapped. One way is to produce a cytogenetic map, in which chromosome bands, each representing 1 million to 5 million bases, are stained and the investigator finds a correlation between people who show a particular trait and exhibit a similar staining pattern. Another way is to produce a physical map, using enzymes to cut DNA into fragments containing markers along with the genes whose location is to be determined. By using computers to "walk" or overlay these fragments into their proper sequence, we can produce a map of a long strand of DNA. The third technique, the method that has been used the longest, is mapping by crossover frequency.
Genes travel as packaged trains on chromosomes. During meiosis, chromosomes can do some fairly interesting things, such as losing pieces (deletion), flipping sections upside down (inversion), and not separating from their homologous partner when they are supposed to (non-disjunction). Crossover occurs when homologous chromosomes separate towards the end of prophase but are still attached at a few points along their lengths. It is during this attachment that the chromosomes can exchange pieces of their genetic instructions. The frequency of this crossover is directly related to the physical distance separating genes on the same chromosome: genes close to one another have a lower frequency of crossover than genes farther apart. By keeping records of genetic experiments, we can calculate the crossover frequency, a measure of how often the genes recombine on the chromosomes [1]. In this genetic mapping work, we have proposed a
technique of applying cloud technology for research purposes, through which a number of drug researchers can share their knowledge on a common platform. It will give more efficiency and better results in less time.
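The crossover-frequency method described above can be turned into a simple calculation: the recombination frequency between two genes is the fraction of recombinant offspring, and one percent recombination conventionally corresponds to one map unit (centimorgan). A minimal sketch in Python (the offspring counts are made-up illustrative values, not data from any study):

```python
def map_distance_cm(recombinant: int, total: int) -> float:
    """Map distance in centimorgans: 1 cM = 1% recombinant offspring.

    The farther apart two genes lie on a chromosome, the higher
    their crossover (recombination) frequency.
    """
    if total <= 0:
        raise ValueError("total offspring must be positive")
    return 100.0 * recombinant / total

# Hypothetical cross: 1000 offspring, of which 180 are recombinant
distance = map_distance_cm(recombinant=180, total=1000)
print(distance)  # 18.0, i.e. the two genes are ~18 map units apart
```

Keeping such records across many experiments, as the text suggests, is exactly the kind of accumulation of small results that benefits from a shared platform.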
The Apache Hadoop project develops open-source software
for reliable, scalable, distributed computing. The Apache
Hadoop software library is a framework that allows for the
distributed processing of large data sets across clusters of
computers using a simple programming model. It is designed
to scale up from single servers to thousands of machines, each
offering local computation and storage. Rather than rely on
hardware to deliver high-availability, the library itself is
designed to detect and handle failures at the application layer,
so delivering a highly-available service on top of a cluster of
computers, each of which may be prone to failures. Hadoop can process data not only within a single cluster but also from multiple clusters situated in several areas of the globe, connected through the Internet. Researchers from different clusters may share, distribute, or communicate their research work and knowledge over the network. This is possible only because of Hadoop's capability of handling large amounts of data.
In Section 2 of this paper we briefly discuss cloud computing, covering the multi-tenant architecture and Apache Hadoop in Subsections 2.1 and 2.2. In Section 3 we provide a brief idea of genes, gene mapping, and some gene mapping technologies. In Section 4 we propose the application of cloud computing to gene mapping, along with a proposed example of a real-world application of Apache Hadoop for processing huge amounts of data. Finally, the paper concludes in Section 5.
2. CLOUD COMPUTING: A BRIEF
INTRODUCTION
Clouds have emerged as a computing infrastructure that
enables rapid delivery of computing resources as a utility in a
dynamically scalable, virtualized manner. The advantages of
cloud computing over traditional computing include: agility,
lower entry cost, device independency, location independency,
and scalability. There are many cloud computing initiatives
from IT giants such as Microsoft, IBM, Google, Amazon as
well as start-ups such as Parascale, Elastra and Appirio.
Basically, there are three types of resources that can be shared and consumed over the Internet; they can be shared among users by leveraging economy of scale. One of the major objectives of cloud computing is to leverage the Internet for various applications and to provision resources to users [2]. The three types of resources that can be consumed by means of cloud computing are:
Infrastructure as a Service
Platform as a Service
Software as a Service
2.1 Multi Tenant Architecture
Multi-tenancy architecture has the capability to handle a single application instance with multiple service instances; according to SOCCA [2], it supports this Single Application Instance and Multiple Service Instances pattern. The motivation behind the pattern is that workloads are often not distributed evenly among application components, and the performance of the single application instance is limited by the application components having lower throughput. Moreover, to enhance scalability, we want to reduce unnecessary duplication as much as possible, as opposed to the Multiple Application Instances pattern.
By using a multi-tenant architecture, a user can get multiple services in a single application; those services can come from a single cloud or from multiple clouds. If a particular service is not available from one source at a given time, the service can be provided to the customer from another source. Better scalability is not the only benefit of this Single Application Instance and Multiple Service Instances pattern; easy customizability is another gain [2]. If, at a given time, a demanded service is not in the service instance group, it can easily be plugged in from another source. This can serve as an application of a fault-tolerance system.
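The fallback behaviour described above can be sketched as a tiny service registry. This is only an illustration of the Single Application Instance / Multiple Service Instances idea, not code from SOCCA [2]; the cloud provider functions are hypothetical:

```python
class ServiceRegistry:
    """Toy sketch of the single-application-instance /
    multiple-service-instances pattern: one application routes each
    request to whichever service instance is currently available."""

    def __init__(self):
        self._instances = {}  # service name -> list of provider callables

    def register(self, name, provider):
        self._instances.setdefault(name, []).append(provider)

    def call(self, name, *args):
        # Try each registered instance in turn; if one source is
        # unavailable, the service is provided from another source,
        # which is the fault-tolerance behaviour described above.
        for provider in self._instances.get(name, []):
            try:
                return provider(*args)
            except RuntimeError:
                continue  # this instance is down; plug in the next one
        raise LookupError(f"no available instance for service {name!r}")

def cloud_a_storage(data):
    raise RuntimeError("cloud A is unreachable")  # simulated outage

def cloud_b_storage(data):
    return f"stored {data} on cloud B"

registry = ServiceRegistry()
registry.register("storage", cloud_a_storage)
registry.register("storage", cloud_b_storage)
print(registry.call("storage", "genome.dat"))  # falls back to cloud B
```

The single application instance (the registry) stays fixed while service instances are added or swapped behind it, which is the source of the pattern's easy customizability.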
2.2 Apache Hadoop: A Cloud Computing Approach
Apache Hadoop is a real-world application of cloud computing. Nowadays it has applications at Facebook, Google, Yahoo, etc. The Hadoop Distributed File System (HDFS) is a distributed file system that provides high-throughput access to application data, and Hadoop has fault-tolerance capacity. In August 2010 Hadoop was deployed at Facebook on the world's largest Hadoop data cluster [3]. Hadoop is now used in search applications at Yahoo, Amazon, Zvents, etc.; for log processing at Facebook, Yahoo, ContextWeb, Joost, Last.fm, etc.; and for data warehousing and processing as well as video and image analysis at Facebook, AOL, The New York Times, and Eyealike, respectively. In these fields it is able to process a huge amount of data that comes from many users, or simply from many parts of the globe; it has the capability to handle, retrieve, or manipulate huge amounts of data.
In the case of the distributed file system, there is a single namespace for the entire cluster. Data coherency is maintained so that once a file is written, many nodes can read it; clients can only append to existing files. Files are broken into blocks of typically 128-256 MB, and each block is replicated on multiple data nodes. Clients can find the locations of the various blocks that are available and can access data directly from the data nodes. A large HDFS (Hadoop Distributed File System) deployment can have 10K nodes, 1 billion files, and 100 PB of data. Files are replicated to handle hardware failure, and the system recovers from failures by detecting them. It is optimized for batch processing, i.e., data locations are exposed so that computations can move to where the data resides, which also works with high bandwidth. It is
very user-friendly and runs on heterogeneous operating systems.
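The block-and-replica scheme described above can be sketched as follows. This is a simplified illustration, not HDFS's actual placement policy (which is rack-aware); the 128 MB block size and replication factor of 3 are common HDFS defaults:

```python
BLOCK_SIZE = 128 * 1024 * 1024  # 128 MB, a typical HDFS block size
REPLICATION = 3                 # HDFS's default replication factor

def place_blocks(file_size: int, datanodes: list) -> dict:
    """Sketch of HDFS-style placement: break a file into fixed-size
    blocks and replicate each block on several data nodes, so the
    cluster survives the failure of any single node."""
    n_blocks = -(-file_size // BLOCK_SIZE)  # ceiling division
    placement = {}
    for b in range(n_blocks):
        # Round-robin choice of REPLICATION distinct nodes per block
        placement[b] = [datanodes[(b + r) % len(datanodes)]
                        for r in range(REPLICATION)]
    return placement

nodes = ["node1", "node2", "node3", "node4"]
layout = place_blocks(file_size=300 * 1024 * 1024, datanodes=nodes)
print(layout)  # 3 blocks, each replicated on 3 of the 4 nodes
```

Because every block lives on several nodes, a client can read from whichever replica is closest, which is how computation moves to where the data resides.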
Nowadays more than 500 million active Facebook users generate and share 30 billion pieces of content every month. As a statistic, 20 TB of compressed new data is added per day, with 3 PB of compressed data scanned per day; in all, 480K compute hours are spent per day. In some cases HDFS is used for the storage of online application forms. In this way Hadoop has large-scale applications in the field of distributed operation, as well as a capability of processing huge amounts of data [3].
3. GENOME THEORY & GENE MAPPING
Biology is the science of living things. Living organisms are characterized by both diversity and unity. The evolutionary theory, originally developed in the nineteenth century and currently undergoing a renaissance of deeper understanding, helps us to pinpoint the mechanisms that lead to the amazing diversity we find among living beings. We use biological science in various ways and for multitudes of causes, concerning various problems regarding living bodies or living cells. The Gene Theory is one of the basic principles of biology. The main concept of this theory is that traits are passed from parents to offspring through gene transmission. Genes are located on chromosomes and consist of DNA; they are passed from parent to offspring through reproduction. The principles that govern heredity were introduced by Gregor Mendel in the 1860s; these principles are now called Mendel's law of segregation and law of independent assortment [4]. This new scientific topic of genome theory is a revolution in the way life is understood and in the way scientific information is available online. Information technology keeps us in touch with this emerging scientific theory, the researchers involved, and related news stories. The research efforts established in various clusters are connected by means of network connectivity, which is associated with information technology.
Fig 1 Gene structure in a chromosome
Gene mapping, also called genome mapping, is the creation of a genetic map assigning DNA fragments to chromosomes. When a genome is first investigated, this map is nonexistent; the map improves as scientific applications progress and becomes complete when the genomic DNA sequencing of the species has been finished. Fig. 1 shows the gene structure in a chromosome. If we want to map between two genes, we have to distinguish between the genes on the chromosomes. During this process, and for the investigation of differences in strain, the fragments are identified by small tags. These may be genetic markers or the unique sequence-dependent pattern of DNA-cutting enzymes. The ordering is taken from genetic observations of these markers. Mapping is used in two different but related contexts, and two different ways of mapping are distinguished.
Genetic mapping uses classical genetic techniques to determine sequence features within a genome. Using modern molecular biology techniques for the same purpose is usually referred to as physical mapping. In physical mapping, the DNA is cut by a restriction enzyme; once cut, the DNA fragments are separated by electrophoresis. The resulting pattern of DNA migration is used to identify what stretch of DNA is in the clone. By analysing the fingerprints, contigs are assembled by automated or manual means into overlapping DNA stretches. Macro-restriction is a type of physical mapping wherein the high-molecular-weight DNA is digested with a restriction enzyme having a low number of restriction sites. Once the map is determined, the clones can be used as a resource to efficiently cover large stretches of the genome. This type of mapping is more accurate than genetic maps. Genes can be mapped prior to complete sequencing by independent approaches such as in situ hybridization [5].
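The restriction-digest fingerprinting step described above can be sketched in a few lines. This is a simplified in-silico illustration (a real enzyme such as EcoRI cuts within its GAATTC recognition site, while here we simply cut at the start of each site; the DNA sequence is made up):

```python
def digest(sequence: str, site: str) -> list:
    """Sketch of an in-silico restriction digest: cut the DNA at every
    occurrence of the enzyme's recognition site and return the
    fragment lengths, the 'fingerprint' used in physical mapping to
    order overlapping clones."""
    cuts, start = [], 0
    while True:
        pos = sequence.find(site, start)
        if pos == -1:
            break
        cuts.append(pos)
        start = pos + 1
    # Fragment lengths between consecutive cut positions
    boundaries = [0] + cuts + [len(sequence)]
    return [boundaries[i + 1] - boundaries[i]
            for i in range(len(boundaries) - 1)]

# Hypothetical 24-base sequence with two EcoRI (GAATTC) sites
dna = "AAGAATTCGGTTGAATTCCCAAAA"
print(digest(dna, "GAATTC"))  # [2, 10, 12]
```

Two clones whose digests share fragment lengths are candidates for overlap, which is how contigs are assembled from fingerprints.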
The process of identifying a genetic element that is partially or fully responsible for a disease is also referred to as mapping. If the locus in which the search is performed is already considerably constrained, the search is called "fine-mapping" of a gene. This information is derived from the investigation of disease manifestations in large families (genetic linkage) or from population-based genetic association studies. As these types of research involve huge amounts of data spread across a large number of clusters, data and information must be shared in order to develop a final result.
4. APPLICATION OF CLOUD COMPUTING IN GENE
MAPPING
Cloud computing is internet-based computing of shared resources, databases, software, etc., in which one can access any of the resources that live in the cloud across the Internet without worrying about the computing capacity, bandwidth, storage, security, and reliability of the system. By using cloud technologies, many users can share data on a single platform so that a common result can be concluded. Using cloud
technologies, a huge amount of data can be taken up for research. Medical research needs to process data that comes from different clusters in huge amounts; in this case, a huge amount of data, a large number of chromosomes, or a huge amount of genomic data can be considered for the research.
In order to develop a new drug, a lot of research and experimentation must be done on a chromosome or gene. Many times the research takes place in many places by different researchers, but the goal may be the same. In this case the sharing of data, information, or research results plays a major role in reaching a conclusion, or simply in the development of the drug. After applying a particular drug to a chromosome, the researcher should study the behavior, the reactions, and the changes that happened to that particular chromosome. For the development of the drug, it should be applied to various living cells, various chromosomes, or simply various genes; after a high level of research on the various particles, and after studying the reactions or changes, the drug should be designed accordingly.
Apache Hadoop is a cloud technology that provides a platform for processing a huge amount of data. It already has several applications in the fields of search, log processing, data warehousing, video and image analysis, etc., and it has a real-world application at Facebook, where nowadays more than 500 million active users share, upload, and communicate with each other. The Hadoop file system maintains data coherency; that is, once data is written it can be read by several users. Fig. 2 summarizes some features of HDFS. The Hadoop Distributed File System is a highly fault-tolerant system, although in many places it may have a single point of failure. It offers a queryable database that can be shared by many users or researchers, with an open data format that is common to all users, and any user can share or update data according to their own activity. It provides high-quality databases that never delete any data, so data can be used frequently and accessed at any time for further use.
Fig. 2 Hadoop Distributed File System features: fault tolerance, open data format, never delete data, queryable database
Cloud technologies can thus be used for various medical research projects that involve huge amounts of data. Gene mapping computes the distance between two genes on a chromosome, which may change from time to time. Before the development of a drug, it is applied to various chromosomes to study the behaviour and the changes that happen in response to that particular drug. It is generally applied not within a single cluster but to various genes or chromosomes that come from various areas, and it is not possible to keep all the data in a single place for the research. As many scientists and researchers work on a single project, there is a need to compare, distinguish, and interpret data for the result; after heavy testing and experimentation a drug may be developed. So, in order to connect all the researchers on a single platform, a good and error-free platform is needed so that a good result can be established. Cloud technologies already have similar applications in other fields, and Apache Hadoop in particular is a good technology in this field. By using cloud technology, many researchers can communicate and share their opinions and research results on a single platform, so that a common conclusion can be extracted at last for the development of any medicines or drugs.
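The sharing-and-combining workflow described in this section can be sketched in MapReduce style, the programming model Hadoop provides. This is an illustration only, not the authors' implementation; the gene names and distances are made-up values:

```python
from collections import defaultdict
from statistics import mean

def map_phase(records):
    """Map step: each research cluster emits (gene-pair, distance)
    key/value pairs from its own measurements."""
    for gene_a, gene_b, distance in records:
        yield (tuple(sorted((gene_a, gene_b))), distance)

def reduce_phase(pairs):
    """Reduce step: group by gene pair and combine the distances
    reported by every cluster into one consensus value."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: mean(values) for key, values in groups.items()}

# Hypothetical measurements contributed by two research clusters
cluster_1 = [("geneA", "geneB", 18.0), ("geneB", "geneC", 7.0)]
cluster_2 = [("geneB", "geneA", 20.0)]

consensus = reduce_phase(map_phase(cluster_1 + cluster_2))
print(consensus)  # one averaged distance per gene pair
```

In a real deployment the map tasks would run on the data nodes holding each cluster's records, so the computation moves to the data rather than the data to a central site.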
CONCLUSIONS
Cloud computing is always a good technology for developing a good networking platform. It provides a platform on which dedicated communication can take place without thinking about the computing capacity, bandwidth, storage, security, and reliability of the system. Gene mapping calculates the distance between genes on a chromosome, using data that generally comes in huge volumes from different clusters. In order to develop a result, there should be communication and sharing of data between these researchers. Cloud technologies like Apache Hadoop will be helpful in this regard, providing a dedicated, error-free, queryable database for the research. Our future work will focus on other real-world applications of cloud technology for this task.
ACKNOWLEDGEMENTS
We are very thankful to the faculty and staff of IGIT Sarang for their cooperation and for providing various facilities for this research work. We also thank the anonymous reviewers for their valuable suggestions, which improved the quality of the paper.
REFERENCES
[1] http://www.woodrow.org/teachers/bi/1994/chromosomes.html [Last accessed January 2013]
[2] Wei-Tek Tsai, Xin Sun, Janaka Balasooriya, "Service-Oriented Cloud Computing Architecture", Seventh International Conference on Information Technology, IEEE, pp. 684-689, 2010.
[3] Dhruba Borthakur, "Apache Hadoop File System and its Usage in Facebook", presented at UC Berkeley, 2011. http://cloud.berkeley.edu/data/hdfs.pdf
[4] http://www.biology.about.com/od/geneticsglossary/g/genetheory.htm [Last accessed January 2013]
[5] http://www.en.wikipedia.org/wiki/Gene_mapping [Last accessed January 2013]
[6] Sarkis M., Goebel B., Dawy Z., Hagenauer J., Hanus P., Mueller J.C., "Gene mapping of complex diseases - A comparison of methods from statistics, information theory, and signal processing", IEEE Signal Processing Magazine, vol. 24(1), pp. 83-90, 2007.
[7] Bo Gao, Changjie Guo, Zhihu Wang, Wenhao An, Wei Sun, "Develop and Deploy Multi-Tenant Web-delivered Solutions using IBM middleware: Part 3: Resource sharing, isolation and customization in the single instance multi-tenant application", http://www.ibm.com/developerworks/webservices/library/wsmultitenant/index.html, 2009.
[8] Rajkumar Buyya, Chee Shin Yeo, "Cloud Computing and Emerging IT Platforms: Vision, Hype and Reality for Delivering Computing as the 5th Utility", Future Generation Computer Systems, pp. 599-616, 2009.
[9] Tinos R., Yang S., "A self-organizing random immigrants genetic algorithm for dynamic optimization problems", Genetic Programming and Evolvable Machines, 8(3), pp. 255-286, 2007.
[10] Sarkis M., Diepold K., Westad F., "A new algorithm for gene mapping: Application of partial least squares regression with cross model validation", IEEE International Workshop on Genomic Signal Processing and Statistics, 2006.
[11] Dawy Z., Goebel B., Hagenauer J., Andreoli C., Meitinger T., Mueller J.C., "Gene mapping and marker clustering using Shannon's mutual information", IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 3(1), pp. 47-56, 2006.
[12] S. Mitra, R. Das, Y. Hayashi, "Genetic Networks and Soft Computing", IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 8(1), pp. 616-635, 2011.