This document discusses outlier detection in network traffic data collected from a Secure Shell (SSH) honeypot. It implements the SSH honeypot to collect inbound and outbound network traffic data. A subset of this data is clustered using Particle Swarm Optimization (PSO), Genetic Algorithm (GA), and Differential Evolution (DE) algorithms to detect outliers. The performance of these three clustering-based outlier detection methods are then compared based on metrics like cost function and number of iterations/clusters. PSO showed the best results in detecting outliers with the lowest cost function.
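The PSO-based clustering step can be sketched in a few lines: each particle encodes a candidate set of centroids, the cost function is the summed distance of every point to its nearest centroid, and points far from all final centroids are reported as outliers. This is an illustrative sketch with made-up 2-D data and standard PSO parameters, not the paper's honeypot dataset or exact algorithm.

```python
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def cost(centroids, points):
    # Clustering cost: sum of each point's distance to its nearest centroid.
    return sum(min(dist(p, c) for c in centroids) for p in points)

def pso_cluster(points, k=2, n_particles=20, iters=60, seed=0):
    rng = random.Random(seed)
    # Each particle encodes k candidate centroids plus a velocity per coordinate.
    parts = [[list(rng.choice(points)) for _ in range(k)] for _ in range(n_particles)]
    vels = [[[0.0, 0.0] for _ in range(k)] for _ in range(n_particles)]
    pbest = [[c[:] for c in p] for p in parts]
    pbest_cost = [cost(p, points) for p in parts]
    best = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = [c[:] for c in pbest[best]], pbest_cost[best]
    for _ in range(iters):
        for i, p in enumerate(parts):
            for j in range(k):
                for d in range(2):
                    r1, r2 = rng.random(), rng.random()
                    vels[i][j][d] = (0.7 * vels[i][j][d]
                                     + 1.5 * r1 * (pbest[i][j][d] - p[j][d])
                                     + 1.5 * r2 * (gbest[j][d] - p[j][d]))
                    p[j][d] += vels[i][j][d]
            c = cost(p, points)
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = [cc[:] for cc in p], c
                if c < gbest_cost:
                    gbest, gbest_cost = [cc[:] for cc in p], c
    return gbest, gbest_cost

points = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3),
          (5.0, 5.0), (5.1, 4.9), (4.8, 5.2), (10.0, 10.0)]
centroids, best_cost = pso_cluster(points)
# Points lying far from every centroid (relative to the mean) are outliers.
d = [min(dist(p, c) for c in centroids) for p in points]
outliers = [p for p, dd in zip(points, d) if dd > 3 * sum(d) / len(d)]
```

GA and DE variants differ only in how candidate centroid sets are evolved; the cost function and the distance-based outlier criterion stay the same, which is why the papers compare them on cost and iteration count.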
IRJET - Crime Analysis and Prediction by using DBSCAN Algorithm (IRJET Journal)
This document summarizes a research paper that proposes using the DBSCAN clustering algorithm and data mining techniques to analyze crime data and predict crime prone areas.
The paper aims to develop a system that can analyze crime data, classify crimes by type, predict high probability areas for future crimes, and visualize crime clusters and predicted regions on a map. It uses a dummy crime dataset and applies the DBSCAN clustering algorithm, which forms effective clusters and is more accurate than K-means clustering. The results include crime cluster maps and maps of predicted crime regions that could help law enforcement agencies take preventative measures.
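The density-based clustering DBSCAN performs can be illustrated with a minimal pure-Python implementation: core points have at least `min_pts` neighbours within radius `eps`, clusters grow outward from core points, and anything unreachable is labelled noise. The crime coordinates below are invented for illustration; they are not the paper's dataset.

```python
import math
from collections import deque

def dbscan(points, eps=1.0, min_pts=3):
    """Minimal DBSCAN: returns one label per point (-1 = noise/outlier)."""
    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:           # not a core point: mark as noise for now
            labels[i] = -1
            continue
        cluster += 1
        labels[i] = cluster
        queue = deque(nbrs)
        while queue:                      # expand the cluster from core points
            j = queue.popleft()
            if labels[j] == -1:           # border point previously marked noise
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:    # j is also a core point: keep expanding
                queue.extend(j_nbrs)
    return labels

# Two dense "crime hot spots" plus one isolated incident
incidents = [(0, 0), (0, 1), (1, 0), (1, 1),
             (10, 10), (10, 11), (11, 10), (11, 11), (50, 50)]
labels = dbscan(incidents, eps=1.5, min_pts=3)
```

Unlike K-means, no cluster count is specified up front, and the isolated incident is labelled `-1` rather than being forced into the nearest cluster, which is the property the paper relies on for mapping crime-prone regions.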
This document outlines a methodology for detecting terrorist activity on the internet. It discusses learning typical terrorist behaviors by analyzing the content of known terrorist websites. It then describes monitoring user internet traffic to detect similarities to the learned typical terrorist behavior profiles. If a user's activities are sufficiently similar to the terrorist profiles, an alarm is triggered. The methodology is evaluated in a case study using traffic data from students and individuals accessing terrorist websites. Different detection sensitivities are tested by varying the similarity threshold for triggering alarms. Deployment options within internet service providers or as a network system are also discussed.
Phishing Websites Detection Using Back Propagation Algorithm: A Review (theijes)
Phishing is an illicit practice that employs both social engineering and technical subterfuge to steal clients' private identity data and financial account credentials. Its impact is severe, as it carries the threat of identity theft and financial loss. This paper explains the back-propagation paradigm for training a neural network for phishing prediction. We perform a root-cause analysis of phishing and of the incentives behind it. The analysis is intended to show developers the effectiveness of neural networks in data mining and to provide grounds for applying neural networks to phishing detection.
This document compares the k-means data mining and outlier detection approaches for network-based intrusion detection. It analyzes four datasets capturing network traffic using both approaches. The k-means approach clusters traffic into normal and abnormal flows, while outlier detection calculates an outlier score for each flow. The document finds that k-means was more accurate and precise, with a better classification rate than outlier detection. It requires less computer resources than outlier detection. This comparison of the approaches can help network administrators choose the best intrusion detection method.
Verifying the Correctness of Frequent Itemset Mining (IRJET Journal)
This document discusses efficient verification approaches for outsourced frequent itemset mining. It proposes two integrity verification methods: 1) A probabilistic approach that constructs evidence for frequent/infrequent itemsets by removing some items and inserting artificial transactions to check correctness and completeness. 2) A deterministic approach requiring the server to construct cryptographic proofs of mining results to verify correctness and completeness with 100% certainty. Experimental results show the efficiency and effectiveness of the approaches. The goal is to allow clients to outsource computationally intensive frequent itemset mining to untrusted servers while verifying the integrity of the returned results.
PUBLIC INTEGRITY AUDITING FOR SHARED DYNAMIC DATA STORAGE UNDER ONE-TIME GENERATED MULTIPLE KEYS (paperpublications3)
Abstract: Nowadays, verifying the result of a remote computation plays a crucial role in addressing the issue of trust. The outsourced data collection comes from multiple data sources; to diagnose the originator of errors, each data source is allotted a unique secret key, which requires the inner-product verification to be performed under any two parties' different keys. The proposed methods outperform the AISM technique in minimizing running time. In the multi-key setting, given different secret keys, multiple data sources can upload their data streams along with their respective verifiable homomorphic tags. AISM consists of three novel join techniques depending on ADS availability: (i) Authenticated Indexed Sort Merge Join (AISM), which utilizes a single ADS on the join attribute; (ii) Authenticated Index Merge Join (AIM), which requires an ADS (on the join attribute) for both relations; and (iii) Authenticated Sort Merge Join (ASM), which does not rely on any ADS. The client is allowed to choose any portion of the data streams for queries. The communication between the client and server is independent of input size. The inner-product evaluation can be performed by any two sources, and the result can be verified using the corresponding tag.
Keywords: Computation outsourcing, Data Stream, Multiple Key, Homomorphic encryption.
Title: PUBLIC INTEGRITY AUDITING FOR SHARED DYNAMIC DATA STORAGE UNDER ONE-TIME GENERATED MULTIPLE KEYS
Author: C. NISHA MALAR, M. S. BONSHIA BINU
ISSN 2350-1049
International Journal of Recent Research in Interdisciplinary Sciences (IJRRIS)
Paper Publications
Privacy Preserving Data Leak Detection for Sensitive Data (paperpublications3)
Abstract: The number of data leaks at organizations, research institutions, and security firms has grown rapidly in recent years. Data leakage occurs when there is no proper protection. The common approach is to monitor the data stored in the organization's local network. Existing methods require the sensitive data in plaintext; however, this requirement is undesirable, as it may threaten the confidentiality of the sensitive information. A privacy-preserving data-leak detection (DLD) solution is proposed which can be outsourced and deployed in a semi-honest detection environment. A fuzzy fingerprint technique is designed and implemented that enhances data privacy during data-leak detection operations. The DLD provider computes fingerprints from network traffic and identifies potential leaks in them. To prevent the DLD provider from gathering exact knowledge about the sensitive data, the collection of potential leaks is composed of real leaks and noise. The evaluation results show that this method can provide accurate detection.
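The fingerprint-matching idea can be sketched as follows: hash short n-gram shingles of the sensitive data, and compare them against shingles seen in traffic with the low bits of each fingerprint masked off, so exact matches and coincidental collisions become indistinguishable to the provider. The strings, shingle length, and masking width below are illustrative assumptions, not the paper's parameters.

```python
import hashlib

def shingles(text, n=8):
    # Overlapping n-character substrings of the data.
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def fingerprint(s):
    # 32-bit fingerprint derived from a cryptographic hash (illustrative).
    return int.from_bytes(hashlib.sha256(s.encode()).digest()[:4], "big")

def fuzzy_set(fps, fuzz_bits=8):
    # Drop the low bits: all fingerprints sharing the high bits collide, so the
    # DLD provider sees real leaks mixed with coincidental noise.
    return {fp >> fuzz_bits for fp in fps}

sensitive = "account=4421 pin=9930 owner=alice"
traffic = "GET /x?leak=account=4421 pin=9930 owner=alice HTTP/1.1"

sens = fuzzy_set({fingerprint(s) for s in shingles(sensitive)})
seen = fuzzy_set({fingerprint(s) for s in shingles(traffic)})
candidates = sens & seen
leak_score = len(candidates) / len(sens)   # fraction of sensitive shingles observed
```

The data owner, who knows the exact fingerprints, can then post-filter the candidate set to separate true leaks from the deliberate noise.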
Application of Data Mining Technique in Invasion Recognition (IOSR Journals)
This document discusses applying data mining techniques to network intrusion detection. It begins by introducing traditional intrusion detection methods and their limitations. It then establishes a data mining-based model for intrusion detection that is designed to address these limitations. The model collects network data, preprocesses it, applies data mining algorithms to extract patterns, and uses the patterns to detect intrusions. This allows detection of both known and unknown intrusion types while reducing false alarms and missed detections compared to traditional methods. Key aspects of the proposed system include data acquisition, preprocessing, pattern extraction using various data mining algorithms, and intrusion detection based on the extracted patterns.
This document summarizes a research paper on identifying authorized users based on typing speed comparison. The paper proposes using a user's typing speed and patterns as a behavioral biometric for authentication. It analyzes keystroke dynamics data such as dwell times and flight times between keys. A neural network classifier is used to model users' typing behaviors based on monograph and digraph mappings. The proposed framework achieved reduced false positive and negative rates compared to existing password-based authentication methods. It provides a simple, low-cost way to increase computer security without additional hardware or training for users.
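The keystroke features the paper describes are straightforward to compute from a press/release event stream: dwell time is how long a key is held, flight time is the gap between releasing one key and pressing the next, and digraph latency is the press-to-press interval for key pairs. The event log below is a hypothetical example, not data from the paper.

```python
# Each event: (key, press_time_ms, release_time_ms) from a hypothetical logger
events = [("p", 0, 95), ("a", 130, 210), ("s", 260, 340), ("s", 400, 480)]

dwell = [release - press for _, press, release in events]   # key held down
flight = [events[i + 1][1] - events[i][2]                   # release -> next press
          for i in range(len(events) - 1)]
# Digraph timing: press-to-press latency for consecutive key pairs
digraph = {(events[i][0], events[i + 1][0]): events[i + 1][1] - events[i][1]
           for i in range(len(events) - 1)}
```

Vectors of such monograph (dwell) and digraph timings are what the neural network classifier in the paper is trained on to model each user's typing behaviour.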
New Hybrid Intrusion Detection System Based On Data Mining Technique to Enhan... (ijceronline)
International Journal of Computational Engineering Research (IJCER) is dedicated to protecting personal information and will make every reasonable effort to handle collected information appropriately. All information collected, as well as related requests, will be handled as carefully and efficiently as possible in accordance with IJCER standards for integrity and objectivity.
DETECTION OF APPLICATION LAYER DDOS ATTACKS USING INFORMATION THEORY BASED ME... (cscpconf)
Distributed Denial-of-Service (DDoS) attacks are a critical threat to the Internet. Recently, there has been an increasing number of DDoS attacks against online services and Web applications, targeting the application level. Detecting application-layer DDoS attacks is not an easy task; a more sophisticated mechanism is required to distinguish malicious flows from legitimate ones. This paper proposes a detection scheme based on information-theoretic metrics. The proposed scheme has two phases: behaviour monitoring and detection. In the first phase, Web users' browsing behaviour (HTTP request rate, page viewing time, and sequence of requested objects) is captured from the system log during non-attack cases. Based on these observations, the entropy of requests per session and a trust score for each user are calculated. In the detection phase, suspicious requests are identified based on the variation in entropy, and a rate limiter is introduced to downgrade services to malicious users. In addition, a scheduler is included to schedule sessions based on the user's trust score and the system workload.
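The per-session entropy metric can be sketched directly from its definition: Shannon entropy over the distribution of requested objects in a session. A flooding bot that hammers one URL produces near-zero entropy, while a human browsing several pages does not. The request paths and the threshold below are illustrative, not the paper's.

```python
import math
from collections import Counter

def session_entropy(requests):
    """Shannon entropy (bits) of the requested-object distribution in a session."""
    counts = Counter(requests)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

human = ["/index", "/news", "/article/3", "/css/site.css", "/img/logo.png"]
flood = ["/index"] * 50     # identical repeated requests, as in an HTTP flood

h_human = session_entropy(human)   # 5 distinct objects -> log2(5) bits
h_flood = session_entropy(flood)   # a single repeated object -> 0 bits
suspicious = h_flood < 1.0         # low entropy deviates from the normal profile
```

In the paper's scheme this deviation from the entropy profile learned in the monitoring phase is what feeds the rate limiter and the trust-score-based scheduler.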
On Improving the Performance of Data Leak Prevention using White-list Approach (Patrick Nguyen)
This document proposes improving data leak prevention performance using a white-list approach and Bloom filters. It summarizes previous related work using blacklists and keywords to detect leaks. The authors then improve upon prior work by Fang Hao et al. that used CRC to create fingerprints, by using hash functions with Bloom filters instead to generate fingerprints faster while maintaining accuracy. Experiments test five hash functions on a 9.3GB dataset to evaluate system throughput and percentage of leaked files.
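A Bloom filter as used for fingerprinting can be sketched in a few lines: k hash functions map each item to k bit positions; membership tests have no false negatives and a small, tunable false-positive rate. The class below derives its k hashes from SHA-256 for brevity; the paper's five hash functions and sizing are not reproduced here.

```python
import hashlib

class BloomFilter:
    def __init__(self, m=1024, k=5):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        # Derive k independent bit positions from a salted cryptographic hash.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def __contains__(self, item):
        return all(self.bits >> pos & 1 for pos in self._positions(item))

bf = BloomFilter()
for chunk in ["confidential report q3", "salary table 2024"]:
    bf.add(chunk)   # fingerprint each sensitive chunk

leaked = "salary table 2024" in bf       # True: definite leak candidate
clean = "weekly cafeteria menu" in bf    # almost surely False: no false negatives,
                                         # only a small false-positive probability
```

The throughput gain the authors report comes from replacing CRC fingerprint construction with cheaper hashing into such a filter, at the cost of a bounded false-positive rate.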
A Survey on Cloud-Based IP Trace Back Framework (IRJET Journal)
This document summarizes a survey of cloud-based IP traceback frameworks. It proposes a cloud-based traceback architecture with three layers: an intra-AS layer where traceback servers in each Autonomous System (AS) collect and store traffic flow data; a traceback as a service layer where ASes expose their traceback capabilities; and an inter-AS logical links layer to facilitate efficient traceback across ASes. It then focuses on access control to prevent unauthorized users from requesting traceback information. To address this, it proposes a temporal token-based authentication framework called FACT that embeds tokens in traffic flows and delivers them to end hosts to authenticate traceback queries. The framework aims to ensure only actual recipients of packets can initiate traceback for those packets.
A predictive model for mapping crime using big data analytics (eSAT Journals)
Abstract: Crime reduction and prevention challenges in today's world are becoming increasingly complex and need a new technique that can handle the vast amount of information being generated. Traditional police capabilities mostly fall short in depicting the true distribution of criminal activities, and thus contribute little to the suitable allocation of police services. In this paper, methods are described for crime event forecasting using Hadoop, by studying the geographical areas that are at greater risk and outside traditional policing limits. The developed method uses a geographical crime-mapping algorithm to identify areas that have relatively high incidences of crime; the term used for such places is hot spots. The identified hotspot clusters give valuable data that can be used to train an artificial neural network, which in turn can model crime trends. The artificial neural network specification and estimation approach is enhanced by the processing capability of the Hadoop platform.
Keywords: Crime forecasting; Cluster analysis; Artificial neural networks; Patrolling; Big data; Hadoop; Gamma test.
Web administrators should pay special attention and closely inspect web sessions that correspond to web robots; because the traffic of these autonomous systems occupies the bandwidth, reduces the performance of web servers and in some cases, threaten the security of human users. In this research, we propose a novel fuzzy algorithm based on the decision trees. In order to overcome the curse of dimensionality issue and facilitate the designing of the fuzzy inference system, we use a correlation analysis to eliminate some features. For converting each filtered attribute to a fuzzy variable, a C4.5 decision tree is used. It is worth mentioning that making a decision tree is based on choosing the best feature with the most information gain metric in each level of the tree. Therefore, we can reduce the number of attributes again. Finally, the fuzzy rules are extracted from the C4.5 decision tree and the fuzzy inference model is made.
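The information-gain criterion that drives both the C4.5 splits and the second round of attribute reduction described above can be sketched directly: gain is the drop in label entropy after partitioning the data on one attribute. The toy session features (request rate, whether images are fetched) and robot/human labels below are invented for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting the dataset on one attribute."""
    base = entropy(labels)
    split = {}
    for row, y in zip(rows, labels):
        split.setdefault(row[attr], []).append(y)
    remainder = sum(len(ys) / len(labels) * entropy(ys) for ys in split.values())
    return base - remainder

# Toy web-session features and robot/human labels
rows = [{"rate": "high", "images": "no"}, {"rate": "high", "images": "no"},
        {"rate": "low", "images": "yes"}, {"rate": "low", "images": "yes"},
        {"rate": "high", "images": "yes"}, {"rate": "low", "images": "no"}]
labels = ["robot", "robot", "human", "human", "robot", "human"]

gains = {a: information_gain(rows, labels, a) for a in ("rate", "images")}
best = max(gains, key=gains.get)   # attribute chosen for the next tree level
```

Attributes that never win a split under this criterion are exactly the ones the authors can drop before building the fuzzy inference system.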
Privacy Preserving Reputation Calculation in P2P Systems with Homomorphic Enc... (IJCNCJournal)
This document discusses a method for privacy-preserving reputation calculation in peer-to-peer systems using homomorphic encryption. Specifically, it proposes:
1) Extending the EigenTrust reputation system to calculate node reputations in a distributed manner while preserving evaluator privacy. It does this by successively updating encrypted reputation values through calculation to reflect trust values without disclosing the original values.
2) Improving calculation efficiency by offloading parts of the task to participating nodes and using different public keys during calculation to improve robustness against node churn.
3) Evaluating the performance of the proposed method, finding it reduces maximum circulation time for aggregating multiplication results by half, reducing computation time per round. The privacy preservation cost scales
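The additive-homomorphic property that lets encrypted reputation values be updated without decryption can be illustrated with a toy Paillier cryptosystem: multiplying two ciphertexts adds the underlying plaintexts. The primes below are tiny and insecure, chosen only for readability; this is a sketch of the primitive, not the paper's exact protocol.

```python
import math

# Toy Paillier keypair with tiny primes (insecure; illustration only)
p, q = 1789, 1867
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)   # valid because g = n + 1

def encrypt(m, r):
    # Enc(m) = g^m * r^n mod n^2
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # Dec(c) = L(c^lam mod n^2) * mu mod n, with L(x) = (x - 1) // n
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Two peers report trust values; an aggregator multiplies the ciphertexts,
# which adds the plaintexts without the aggregator ever seeing them.
c1 = encrypt(40, r=12345)
c2 = encrypt(17, r=67890)
total = decrypt((c1 * c2) % n2)
```

Passing such ciphertexts around the ring and folding in each node's trust value is what allows the EigenTrust-style reputation update to proceed while preserving evaluator privacy.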
International Journal of Engineering Inventions (IJEI) provides a multidisciplinary passage for researchers, managers, professionals, practitioners and students around the globe to publish high quality, peer-reviewed articles on all theoretical and empirical aspects of Engineering and Science.
Merkle Tree Based Authentication Protocol for Lifetime Enhancement in Wireles... (Eswar Publications)
Wireless sensor networks are self-organized, autonomous, highly scalable, reliable, infrastructure-less services with automatic discovery of services, mainly applicable in fields such as disaster response and healthcare. Cryptographic operations such as hash-based schemes are energy consuming (more bytes transmitted indirectly consumes more energy). To reduce the energy consumed by cryptographic operations, a Merkle Tree Based Authentication protocol for Lifetime enhancement in wireless sensor networks, called MALLI, is designed around the well-known hash-tree structure. The evaluation demonstrates that the hash tree and the security achieved by the proposed method are more effective than existing methodologies.
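The hash-tree structure underlying such a protocol can be sketched briefly: readings are hashed into leaves, adjacent hashes are combined up to a single root, and a node authenticates one reading with a logarithmic number of sibling hashes instead of retransmitting everything. The sensor readings below are hypothetical; the energy argument is that fewer transmitted bytes means less energy spent.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root hash of a Merkle tree (odd levels duplicate their last node)."""
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

readings = [b"node1:23.5C", b"node2:24.1C", b"node3:22.9C", b"node4:23.8C"]
root = merkle_root(readings)

# The sink stores only `root`; a node proves reading 0 with two sibling
# hashes rather than re-sending every reading -- fewer bytes, less energy.
proof = [h(readings[1]), h(h(readings[2]) + h(readings[3]))]
acc = h(readings[0])
for sibling in proof:          # leaf 0 is a left child at every level
    acc = h(acc + sibling)
valid = acc == root
```

For 2^d readings the proof carries only d hashes, which is the transmission saving the lifetime-enhancement argument rests on.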
This document discusses using a combined approach of K-medoids clustering and Naive Bayes classification for intrusion detection. It begins by introducing intrusion detection systems and common data mining techniques used, such as classification and clustering. It then proposes a hybrid approach where K-medoids clustering is first used to group similar data instances, followed by Naive Bayes classification to classify the resulting clusters. The approach aims to maintain high accuracy and detection rates while lowering false alarm rates. An experiment on a prepared dataset showed the proposed approach performed better in terms of accuracy and detection rate with a reasonable false alarm rate.
The document discusses optimizing bloom filters for peer-to-peer (P2P) networks to improve multi-keyword search efficiency. It proposes automatically updating bloom filters by polling peers for current service availability and using expiry dates. This reduces stale data and false positives. Evaluation shows the optimized bloom filter approach reduces search time for intersection and union queries by around 185 times compared to regular bloom filters.
Artificial Neural Content Techniques for Enhanced Intrusion Detection and Pre... (AM Publications)
This paper presents a novel approach for detecting network intrusions based on a competitive-training neural network. The performance of this approach is compared to that of the self-organizing map (SOM), a popular unsupervised training algorithm used in intrusion detection. While obtaining a similarly accurate detection rate to the SOM, the proposed approach uses only one fourth of the SOM's computation time. Furthermore, the clustering result of this method is independent of the number of initial neurons. The approach also exhibits the ability to detect both known and unknown network attacks. The experimental results obtained by applying this approach to the KDD-99 data set demonstrate that it performs exceptionally well in terms of both accuracy and computation time.
IRJET - Analysis and Detection of E-Mail Phishing using Pyspark (IRJET Journal)
This document proposes a method to analyze and detect phishing emails using PySpark. It involves using text analysis and link analysis techniques such as applying a Naive Bayes classifier to email text and checking links using the VirusTotal API. The method aims to provide more accurate phishing detection compared to existing approaches. It discusses related work on phishing email detection using machine learning and analyzes the proposed system's design and implementation steps, which include data preprocessing, feature extraction, classification using Naive Bayes, and link analysis.
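The Naive Bayes text-classification step can be sketched without any framework: multinomial Naive Bayes with Laplace smoothing over word counts, trained on labelled email texts. The four tiny training emails below are invented for illustration; the paper's PySpark pipeline and VirusTotal link analysis are not reproduced here.

```python
import math
from collections import Counter

def train(docs):
    """Multinomial Naive Bayes with Laplace smoothing over word counts."""
    classes = {y for _, y in docs}
    vocab = {w for text, _ in docs for w in text.split()}
    prior, word_counts, totals = {}, {}, {}
    for c in classes:
        texts = [t for t, y in docs if y == c]
        prior[c] = math.log(len(texts) / len(docs))
        word_counts[c] = Counter(w for t in texts for w in t.split())
        totals[c] = sum(word_counts[c].values())
    return prior, word_counts, totals, vocab

def classify(text, model):
    prior, wc, totals, vocab = model
    scores = {}
    for c in prior:
        s = prior[c]
        for w in text.split():
            if w in vocab:   # ignore words never seen in training
                s += math.log((wc[c][w] + 1) / (totals[c] + len(vocab)))
        scores[c] = s
    return max(scores, key=scores.get)

docs = [("verify your account password urgently", "phish"),
        ("click this link to claim your prize", "phish"),
        ("meeting notes attached for monday", "ham"),
        ("lunch at noon with the project team", "ham")]
model = train(docs)
label = classify("urgently verify your password", model)
```

In the proposed system this classifier runs over preprocessed email bodies, and its verdict is combined with the link-analysis check on any embedded URLs.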
A model to find the agent responsible for data leakage (eSAT Journals)
Abstract: In this research paper, we implement a model to find the agent responsible for data leakage. Data leakage is a type of risk: a distributor often sends important data to two or more agents, but at times the information is disclosed and found in an unauthorized place or with an unauthorized person. There are multiple ways to distribute important information, e.g., e-mail, web sites, FTP, databases, disks, and spreadsheets. For this reason, accessing information in a safe way has become a new topic of research, and detecting leakages has become a contested part of it. In this work we implement a system for distributing information to agents. In this method we add fake objects to the original data distributed to each agent, in such a way that improves the chances of finding a leakage. If an agent sends this sensitive data to an unauthorized person, the distributor receives a data-leak SMS, after which the distributor can find the guilty agent who leaked the data.
Keywords: Significant data, fake data, guilty agent.
This document presents a model for detecting the agent responsible for data leakage. It discusses adding fake objects to distributed data in order to identify the source if a leakage occurs. The model is implemented using C# and SQL Server. When an agent requests data, the distributor sends the original data along with randomly allocated fake objects. If the data is leaked, the distributor can analyze the fake objects to determine the guilty agent. An algorithm is provided and screenshots show modules for login, data sharing, and detecting the guilty agent using a probability calculation. The model aims to overcome limitations of existing watermarking techniques for data leakage detection.
This document describes a proposed crime prediction system that uses naive bayes classification. The system collects crime data and compares new crimes to historical data to predict the criminal. It generates sample crime data for attributes like name, location, weapon, date and crime type. For a new crime, it enters the details and the system uses naive bayes theorem to calculate the probability of each past criminal being the suspect, outputting the most likely criminal. The system is implemented with web pages for login, signup, viewing records, entering new crimes, and outputting predictions. The goal is to help law enforcement agencies identify suspects by predicting crimes based on past data.
Unsupervised Distance Based Detection of Outliers by using Anti-hubs (IRJET Journal)
This document summarizes research on using anti-hubs for unsupervised outlier detection in high-dimensional data. It discusses how existing distance-based outlier detection methods struggle with high-dimensional data as distances become less meaningful. Anti-hubs, which are points that are infrequently in the k-nearest neighbor lists of other points, have been used for outlier detection. However, calculating anti-hubs is computationally expensive for high-dimensional data. The document proposes applying feature selection before calculating anti-hubs to reduce dimensionality and computational cost, thereby extending anti-hub based outlier detection to high-dimensional data more efficiently.
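The anti-hub score described above can be sketched directly: for each point, count how often it appears in the k-nearest-neighbour lists of the other points (its reverse-neighbour count N_k); points with the lowest counts are anti-hubs and hence outlier candidates. The brute-force O(n^2) neighbour search and the toy 2-D points below are illustrative only; the paper's contribution is making this tractable in high dimensions via feature selection.

```python
import math

def knn(points, i, k):
    """Indices of the k nearest neighbours of point i (excluding itself)."""
    order = sorted((j for j in range(len(points)) if j != i),
                   key=lambda j: math.dist(points[i], points[j]))
    return order[:k]

def antihub_scores(points, k=3):
    # N_k(x): how often x appears in other points' k-NN lists.
    # Anti-hubs (rarely listed) are outlier candidates.
    counts = [0] * len(points)
    for i in range(len(points)):
        for j in knn(points, i, k):
            counts[j] += 1
    return counts

points = [(0, 0), (0, 1), (1, 0), (1, 1), (0.5, 0.5), (9, 9)]
scores = antihub_scores(points)
outlier = points[scores.index(min(scores))]   # never anyone's near neighbour
```

Reducing the dimensionality first shrinks both the cost of the distance computations and the distance-concentration effect that makes raw k-NN lists unreliable in high dimensions.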
IRJET - An Automated System for Detection of Social Engineering Phishing Atta... (IRJET Journal)
1) The document presents a machine learning approach to detect phishing URLs using logistic regression. It trains a logistic regression model on a dataset of 420,467 URLs that have been classified as either phishing or legitimate.
2) It preprocesses the URLs using tokenization before training the logistic regression model. The trained model is able to classify new URLs with 96% accuracy as either phishing or legitimate based on the URL features.
3) The proposed approach provides an automated way to detect phishing URLs in real-time and help prevent phishing attacks. Future work could involve developing a browser extension using this approach and increasing the dataset size for higher accuracy.
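The tokenize-then-classify pipeline in points 1 and 2 can be sketched end to end: split each URL on its delimiters, build bag-of-token features, and fit a logistic regression by gradient descent. The four training URLs and all hostnames below are fabricated for illustration; the paper's 420,467-URL dataset and reported 96% accuracy are obviously not reproduced by this toy.

```python
import math
import re

def tokens(url):
    return re.split(r"[/.\-?=:]+", url.lower())

def featurize(url, vocab):
    v = [0.0] * len(vocab)
    for t in tokens(url):
        if t in vocab:
            v[vocab[t]] = 1.0
    return v

urls = [("http://secure-login.paypa1-verify.example/update", 1),
        ("http://free-prize.claim-now.example/win", 1),
        ("https://github.com/python/cpython", 0),
        ("https://docs.python.org/3/library", 0)]
vocab = {t: i for i, t in enumerate(
    sorted({t for u, _ in urls for t in tokens(u) if t}))}
X = [featurize(u, vocab) for u, _ in urls]
y = [label for _, label in urls]

# Plain batch gradient descent on the logistic loss
w = [0.0] * len(vocab)
b = 0.0
for _ in range(500):
    for xi, yi in zip(X, y):
        z = sum(wj * xj for wj, xj in zip(w, xi)) + b
        p = 1 / (1 + math.exp(-z))
        g = p - yi                       # gradient of the log-loss wrt z
        w = [wj - 0.5 * g * xj for wj, xj in zip(w, xi)]
        b -= 0.5 * g

def predict(url):
    z = sum(wj * xj for wj, xj in zip(w, featurize(url, vocab))) + b
    return 1 / (1 + math.exp(-z)) > 0.5

is_phish = predict("http://paypa1-verify.example/login")
```

Wrapping such a `predict` call behind a browser extension is exactly the real-time deployment the paper proposes as future work.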
Network Forensic Investigation of HTTPS Protocol (IJMER)
International Journal of Modern Engineering Research (IJMER) is Peer reviewed, online Journal. It serves as an international archival forum of scholarly research related to engineering and science education.
Intrusion Detection System Using Machine Learning: An Overview (IRJET Journal)
This document provides an overview of machine learning approaches for intrusion detection systems (IDS). It discusses how IDS use data mining techniques like classification, clustering, and association rule mining to detect network intrusions based on patterns in data. The document reviews several papers applying methods like ant colony optimization, support vector machines, genetic algorithms, and convolutional neural networks to classify network activities as normal or intrusive. It compares the strengths and limitations of different machine learning algorithms for IDS and identifies areas for potential improvement in future research.
This document summarizes a research paper on identifying authorized users based on typing speed comparison. The paper proposes using a user's typing speed and patterns as a behavioral biometric for authentication. It analyzes keystroke dynamics data such as dwell times and flight times between keys. A neural network classifier is used to model users' typing behaviors based on monograph and digraph mappings. The proposed framework achieved reduced false positive and negative rates compared to existing password-based authentication methods. It provides a simple, low-cost way to increase computer security without additional hardware or training for users.
New Hybrid Intrusion Detection System Based On Data Mining Technique to Enhan...ijceronline
International Journal of Computational Engineering Research (IJCER) is dedicated to protecting personal information and will make every reasonable effort to handle collected information appropriately. All information collected, as well as related requests, will be handled as carefully and efficiently as possible in accordance with IJCER standards for integrity and objectivity.
DETECTION OF APPLICATION LAYER DDOS ATTACKS USING INFORMATION THEORY BASED ME...cscpconf
Distributed Denial-of-Service (DDoS) attacks are a critical threat to the Internet. Recently, there has been an increasing number of DDoS attacks against online services and Web applications, and these attacks target the application layer. Detecting an application-layer DDoS attack is not an easy task: a more sophisticated mechanism is required to distinguish malicious flows from legitimate ones. This paper proposes a detection scheme based on information-theoretic metrics. The proposed scheme has two phases: behaviour monitoring and detection. In the first phase, Web users' browsing behaviour (HTTP request rate, page viewing time, and sequence of requested objects) is captured from the system log during non-attack periods. Based on these observations, the entropy of requests per session and a trust score for each user are calculated. In the detection phase, suspicious requests are identified from variations in entropy, and a rate limiter is introduced to downgrade service to malicious users. In addition, a scheduler is included to schedule sessions based on the user's trust score and the system workload.
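As a rough illustration of the entropy signal the first phase monitors, the Python sketch below computes the Shannon entropy of each session's request distribution and flags sessions that deviate from a non-attack baseline. The session data, baseline value, and tolerance are all hypothetical; the paper's actual scheme also incorporates page viewing time, request sequences, and per-user trust scores.

```python
import math
from collections import Counter

def session_entropy(requests):
    """Shannon entropy (bits) of the distribution of objects requested in a session."""
    counts = Counter(requests)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_suspicious(sessions, baseline_mean, tolerance):
    """Flag sessions whose request entropy deviates from the non-attack baseline."""
    return {sid: abs(session_entropy(reqs) - baseline_mean) > tolerance
            for sid, reqs in sessions.items()}

# Non-attack browsing mixes many objects; a flooding bot hammers one URL.
sessions = {
    "human": ["/", "/css", "/img1", "/news", "/img2", "/about"],
    "bot":   ["/login"] * 50,
}
print(flag_suspicious(sessions, baseline_mean=2.5, tolerance=1.5))
# → {'human': False, 'bot': True}
```

A repeated single-object session has zero entropy, far below any realistic browsing baseline, which is what makes the deviation easy to threshold.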
On Improving the Performance of Data Leak Prevention using White-list ApproachPatrick Nguyen
This document proposes improving data leak prevention performance using a white-list approach and Bloom filters. It summarizes previous related work using blacklists and keywords to detect leaks. The authors then improve upon prior work by Fang Hao et al. that used CRC to create fingerprints, by using hash functions with Bloom filters instead to generate fingerprints faster while maintaining accuracy. Experiments test five hash functions on a 9.3GB dataset to evaluate system throughput and percentage of leaked files.
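The Bloom-filter fingerprinting idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: the bit-array size, the use of SHA-256 digest slices as the independent hash functions, and the example chunks are all assumptions.

```python
import hashlib

class BloomFilter:
    """Bloom filter deriving k hash positions from slices of one SHA-256 digest."""
    def __init__(self, size_bits=8192, num_hashes=5):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.k):
            # 4-byte slices of the digest act as independent hash values.
            chunk = int.from_bytes(digest[4 * i:4 * i + 4], "big")
            yield chunk % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Fingerprint sensitive document chunks; outbound chunks are checked against the filter.
sensitive = BloomFilter()
sensitive.add("salary table Q3")
print("salary table Q3" in sensitive)   # True
print("public newsletter" in sensitive) # False with overwhelming probability
```

Lookups never miss a fingerprinted chunk (no false negatives); the tunable cost is a small false-positive rate, which is the accuracy/throughput trade-off the paper's experiments measure.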
A Survey on Cloud-Based IP Trace Back FrameworkIRJET Journal
This document summarizes a survey of cloud-based IP traceback frameworks. It proposes a cloud-based traceback architecture with three layers: an intra-AS layer where traceback servers in each Autonomous System (AS) collect and store traffic flow data; a traceback as a service layer where ASes expose their traceback capabilities; and an inter-AS logical links layer to facilitate efficient traceback across ASes. It then focuses on access control to prevent unauthorized users from requesting traceback information. To address this, it proposes a temporal token-based authentication framework called FACT that embeds tokens in traffic flows and delivers them to end hosts to authenticate traceback queries. The framework aims to ensure only actual recipients of packets can initiate traceback for those packets.
A predictive model for mapping crime using big data analyticseSAT Journals
Abstract Crime reduction and prevention challenges in today's world are becoming increasingly complex and demand new techniques that can handle the vast amounts of information being generated. Traditional policing mostly falls short in depicting the true distribution of criminal activity, and thus contributes little to the suitable allocation of police services. This paper describes methods for crime event forecasting using Hadoop, by studying geographical areas that are at greater risk and lie outside traditional policing limits. The developed method uses a geographical crime-mapping algorithm to identify areas with relatively high incidences of crime, known as hot spots. The identified hotspot clusters provide valuable data for training an artificial neural network, which can then model crime trends. The neural network specification and estimation approach is enhanced by the processing capability of the Hadoop platform. Keywords— Crime forecasting; cluster analysis; artificial neural networks; patrolling; big data; Hadoop; Gamma test.
Web administrators should pay special attention to, and closely inspect, web sessions that correspond to web robots, because traffic from these autonomous agents occupies bandwidth, reduces the performance of web servers, and in some cases threatens the security of human users. In this research, we propose a novel fuzzy algorithm based on decision trees. To overcome the curse of dimensionality and facilitate the design of the fuzzy inference system, we use a correlation analysis to eliminate some features. A C4.5 decision tree is then used to convert each remaining attribute into a fuzzy variable; since building the tree selects, at each level, the feature with the highest information gain, this reduces the number of attributes further. Finally, the fuzzy rules are extracted from the C4.5 decision tree and the fuzzy inference model is constructed.
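The correlation-analysis step used to eliminate redundant features before building the fuzzy system might look like the sketch below; the feature names, sample values, and 0.95 threshold are illustrative assumptions, not the paper's settings.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric columns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def drop_redundant(features, threshold=0.95):
    """Keep a feature only if it is not highly correlated with an already-kept one."""
    kept = []
    for name, col in features.items():
        if all(abs(pearson(col, features[k])) < threshold for k in kept):
            kept.append(name)
    return kept

# Toy session features: requests/s and bytes/s move together, interval is independent.
features = {
    "req_rate":  [1.0, 2.0, 3.0, 4.0, 5.0],
    "byte_rate": [2.1, 4.0, 6.2, 8.1, 9.9],  # nearly perfectly correlated with req_rate
    "interval":  [5.0, 1.0, 4.0, 2.0, 3.0],
}
print(drop_redundant(features))  # ['req_rate', 'interval']
```

Here `byte_rate` is discarded because it carries almost the same information as `req_rate`, which is exactly the redundancy that would otherwise inflate the fuzzy rule base.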
Privacy Preserving Reputation Calculation in P2P Systems with Homomorphic Enc...IJCNCJournal
This document discusses a method for privacy-preserving reputation calculation in peer-to-peer systems using homomorphic encryption. Specifically, it proposes:
1) Extending the EigenTrust reputation system to calculate node reputations in a distributed manner while preserving evaluator privacy. It does this by successively updating encrypted reputation values through calculation to reflect trust values without disclosing the original values.
2) Improving calculation efficiency by offloading parts of the task to participating nodes and using different public keys during calculation to improve robustness against node churn.
3) Evaluating the performance of the proposed method, finding it reduces maximum circulation time for aggregating multiplication results by half, reducing computation time per round. The privacy preservation cost scales
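The additive property that lets encrypted reputation values be updated without disclosing the originals can be illustrated with a toy Paillier example. The tiny primes are purely for demonstration (hopelessly insecure), and the paper's key sizes and EigenTrust integration are not reproduced here.

```python
import math, random

# Toy Paillier keypair (tiny primes; illustration only, not secure).
p, q = 1009, 1013
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1                  # standard choice g = n + 1
mu = pow(lam, -1, n)       # since L(g^lam mod n^2) = lam mod n

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n   # L(x) = (x - 1) / n
    return (L * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts.
a, b = encrypt(42), encrypt(58)
print(decrypt((a * b) % n2))  # 100
```

This is the mechanism that allows a node to fold its trust value into another node's encrypted reputation total without ever seeing the running sum in the clear.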
International Journal of Engineering Inventions (IJEI) provides a multidisciplinary passage for researchers, managers, professionals, practitioners and students around the globe to publish high quality, peer-reviewed articles on all theoretical and empirical aspects of Engineering and Science.
Markle Tree Based Authentication Protocol for Lifetime Enhancement in Wireles...Eswar Publications
Wireless sensor networks are self-organized, autonomous, highly scalable, reliable, infrastructure-less services with automatic discovery of services, applicable mainly in fields such as disaster response and healthcare. Cryptographic operations such as hash-based schemes are energy-consuming (more bytes transmitted indirectly means more energy consumed). To reduce the energy cost of these cryptographic operations, a Merkle-tree-based authentication protocol for lifetime enhancement in wireless sensor networks, called MALLI, is designed using the well-known hash-tree structure. The evaluation demonstrates that, compared with the default hash tree, the security achieved by the proposed method is more effective than existing methodologies.
This document discusses using a combined approach of K-medoids clustering and Naive Bayes classification for intrusion detection. It begins by introducing intrusion detection systems and common data mining techniques used, such as classification and clustering. It then proposes a hybrid approach where K-medoids clustering is first used to group similar data instances, followed by Naive Bayes classification to classify the resulting clusters. The approach aims to maintain high accuracy and detection rates while lowering false alarm rates. An experiment on a prepared dataset showed the proposed approach performed better in terms of accuracy and detection rate with a reasonable false alarm rate.
The document discusses optimizing bloom filters for peer-to-peer (P2P) networks to improve multi-keyword search efficiency. It proposes automatically updating bloom filters by polling peers for current service availability and using expiry dates. This reduces stale data and false positives. Evaluation shows the optimized bloom filter approach reduces search time for intersection and union queries by around 185 times compared to regular bloom filters.
Artificial Neural Content Techniques for Enhanced Intrusion Detection and Pre...AM Publications
This paper presents a novel approach for detecting network intrusions based on a competitively trained neural network. The performance of this approach is compared to that of the self-organizing map (SOM), a popular unsupervised training algorithm used in intrusion detection. While obtaining a similarly accurate detection rate, the proposed approach uses only one-fourth of the SOM's computation time. Furthermore, the clustering result of this method is independent of the number of initial neurons, and the approach can detect both known and unknown network attacks. Experimental results obtained by applying it to the KDD-99 data set demonstrate that the proposed approach performs exceptionally well in terms of both accuracy and computation time.
IRJET- Analysis and Detection of E-Mail Phishing using PysparkIRJET Journal
This document proposes a method to analyze and detect phishing emails using PySpark. It involves using text analysis and link analysis techniques such as applying a Naive Bayes classifier to email text and checking links using the VirusTotal API. The method aims to provide more accurate phishing detection compared to existing approaches. It discusses related work on phishing email detection using machine learning and analyzes the proposed system's design and implementation steps, which include data preprocessing, feature extraction, classification using Naive Bayes, and link analysis.
A model to find the agent who responsible for data leakageeSAT Journals
Abstract In this research paper, we implement a model to find the agent responsible for data leakage. Data leakage is a form of risk: a distributor often sends important data to two or more agents, and several times the information is later disclosed and found in an unauthorized place or with an unauthorized person. Important information can be distributed in multiple ways, e.g. e-mail, web sites, FTP, databases, disks, and spreadsheets. Accessing information safely has therefore become a new topic of research, and detecting leakages a contested part of it. In this work we implement a system for distributing information to agents: fake objects are added to the original data distributed to each agent in a way that improves the chances of finding a leakage. If an agent passes this sensitive data to an unauthorized person, the distributor receives a "data leaked" SMS, after which the distributor can identify the guilty agent who leaked the data. Keywords: Significant data, fake data, guilty agent.
This document presents a model for detecting the agent responsible for data leakage. It discusses adding fake objects to distributed data in order to identify the source if a leakage occurs. The model is implemented using C# and SQL Server. When an agent requests data, the distributor sends the original data along with randomly allocated fake objects. If the data is leaked, the distributor can analyze the fake objects to determine the guilty agent. An algorithm is provided and screenshots show modules for login, data sharing, and detecting the guilty agent using a probability calculation. The model aims to overcome limitations of existing watermarking techniques for data leakage detection.
This document describes a proposed crime prediction system that uses naive bayes classification. The system collects crime data and compares new crimes to historical data to predict the criminal. It generates sample crime data for attributes like name, location, weapon, date and crime type. For a new crime, it enters the details and the system uses naive bayes theorem to calculate the probability of each past criminal being the suspect, outputting the most likely criminal. The system is implemented with web pages for login, signup, viewing records, entering new crimes, and outputting predictions. The goal is to help law enforcement agencies identify suspects by predicting crimes based on past data.
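A minimal sketch of the naive Bayes step, assuming hypothetical attribute names, Laplace smoothing, and a toy history of solved crimes, could look like this:

```python
from collections import Counter, defaultdict

def train(records):
    """records: list of (attributes_dict, criminal). Returns priors and likelihood counts."""
    priors = Counter(label for _, label in records)
    likelihoods = defaultdict(Counter)
    for attrs, label in records:
        for attr, value in attrs.items():
            likelihoods[label][(attr, value)] += 1
    return priors, likelihoods

def predict(priors, likelihoods, attrs, alpha=1.0):
    """Most probable past criminal under naive Bayes with simple add-alpha smoothing."""
    total = sum(priors.values())
    def score(label):
        s = priors[label] / total
        for attr, value in attrs.items():
            s *= (likelihoods[label][(attr, value)] + alpha) / (priors[label] + 2 * alpha)
        return s
    return max(priors, key=score)

history = [
    ({"location": "harbour", "weapon": "knife"}, "A"),
    ({"location": "harbour", "weapon": "knife"}, "A"),
    ({"location": "market",  "weapon": "gun"},   "B"),
]
print(predict(*train(history), {"location": "harbour", "weapon": "knife"}))  # A
```

Each past criminal's posterior is the class prior times the per-attribute likelihoods, exactly the independence assumption that lets the system score a new crime against every suspect cheaply.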
Unsupervised Distance Based Detection of Outliers by using Anti-hubsIRJET Journal
This document summarizes research on using anti-hubs for unsupervised outlier detection in high-dimensional data. It discusses how existing distance-based outlier detection methods struggle with high-dimensional data as distances become less meaningful. Anti-hubs, which are points that are infrequently in the k-nearest neighbor lists of other points, have been used for outlier detection. However, calculating anti-hubs is computationally expensive for high-dimensional data. The document proposes applying feature selection before calculating anti-hubs to reduce dimensionality and computational cost, thereby extending anti-hub based outlier detection to high-dimensional data more efficiently.
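The core anti-hub computation (before any feature selection) can be sketched as follows: count how often each point occurs in the k-nearest-neighbour lists of the other points, and treat points with the lowest counts as outliers. The data set and choice of k are illustrative.

```python
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def antihub_scores(points, k=2):
    """N_k(x): how often each point appears in other points' k-NN lists.
    Low counts indicate anti-hubs, reported as outliers."""
    counts = [0] * len(points)
    for i, p in enumerate(points):
        others = sorted((j for j in range(len(points)) if j != i),
                        key=lambda j: euclidean(p, points[j]))
        for j in others[:k]:
            counts[j] += 1
    return counts

# A tight cluster plus one far-away point: the isolated point is rarely anyone's neighbour.
points = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
scores = antihub_scores(points, k=2)
print(scores)                           # [2, 3, 2, 3, 0]
print(points[scores.index(min(scores))])  # (10, 10)
```

The brute-force neighbour search here is O(n² log n), which is exactly the cost the paper proposes to tame by reducing dimensionality before the anti-hub computation.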
IRJET - An Automated System for Detection of Social Engineering Phishing Atta...IRJET Journal
1) The document presents a machine learning approach to detect phishing URLs using logistic regression. It trains a logistic regression model on a dataset of 420,467 URLs that have been classified as either phishing or legitimate.
2) It preprocesses the URLs using tokenization before training the logistic regression model. The trained model is able to classify new URLs with 96% accuracy as either phishing or legitimate based on the URL features.
3) The proposed approach provides an automated way to detect phishing URLs in real-time and help prevent phishing attacks. Future work could involve developing a browser extension using this approach and increasing the dataset size for higher accuracy.
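A much-reduced sketch of the tokenize-then-train pipeline, using a hashing-trick bag of tokens and hand-rolled gradient descent rather than a library implementation, and a four-URL toy data set instead of the paper's 420,467 URLs:

```python
import math, re, zlib

def tokenize(url):
    """Split a URL into tokens on common delimiters (a simple preprocessing step)."""
    return [t for t in re.split(r"[/.\-?=_&]", url.lower()) if t]

def featurize(url, dim=64):
    """Hashing trick: map tokens into a fixed-size bag-of-tokens vector."""
    x = [0.0] * dim
    for tok in tokenize(url):
        x[zlib.crc32(tok.encode()) % dim] += 1.0
    return x

def train(urls, labels, dim=64, lr=0.5, epochs=200):
    """Plain logistic regression fitted by stochastic gradient descent on log-loss."""
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for url, y in zip(urls, labels):
            x = featurize(url, dim)
            p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            for i in range(dim):
                w[i] -= lr * (p - y) * x[i]
            b -= lr * (p - y)
    return w, b

def predict(w, b, url):
    p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, featurize(url))) + b)))
    return p > 0.5  # True = classified as phishing

# Toy data (1 = phishing, 0 = legitimate).
urls = ["paypal-secure-login.example.xyz/verify", "secure-bank-update.example.xyz/login",
        "github.com/torvalds/linux", "en.wikipedia.org/wiki/Phishing"]
labels = [1, 1, 0, 0]
w, b = train(urls, labels)
print(predict(w, b, "secure-account-verify.example.xyz/login"))
```

On realistic data the feature dimension, regularization, and tokenizer would need tuning; the point is only the shape of the pipeline: tokenize, vectorize, fit a logistic model, threshold the probability.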
Network Forensic Investigation of HTTPS ProtocolIJMER
Detection of Phishing Websites using machine Learning AlgorithmIRJET Journal
This document discusses the detection of phishing websites using machine learning algorithms. It begins with an abstract that defines phishing and explains why attackers use it. The introduction provides more details on phishing techniques and the need for anti-phishing detection methods. The document then reviews related work on phishing detection using machine learning features. It proposes using algorithms like artificial neural networks, k-nearest neighbors, support vector machines, and random forests. Features for these algorithms are discussed like URL-based, HTML/JavaScript-based, and domain-based features. The document concludes that machine learning classifiers can help detect phishing websites but future work is still needed to develop more effective detection systems.
Online stream mining approach for clustering network trafficeSAT Journals
Abstract A large amount of research has been proposed on intrusion detection systems, leading to implementations of agent-based intelligent IDS (IIDS), non-intelligent IDS (NIDS), signature-based IDS, etc. When building such IDS models, learning algorithms over flows of network traffic play a crucial role in the accuracy of the IDS. The proposed work focuses on implementing a novel method for clustering network traffic that eliminates the limitations of existing online clustering algorithms, and demonstrates its robustness and accuracy over large streams of network traffic arriving at extremely high rates. We compare the existing algorithm with the novel methods to analyse accuracy and complexity. Keywords— NIDS, Data Stream Mining, Online Clustering, RAH algorithm, Online Efficient Incremental Clustering algorithm
IJRET : International Journal of Research in Engineering and Technology is an international peer reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academician, Field Engineers, Scholars and Students of related fields of Engineering and Technology
Intrusion detection and anomaly detection system using sequential pattern miningeSAT Journals
Abstract
Nowadays, security methods ranging from password-protected access to firewalls are used to secure data and networks against attackers, but these methods are often not enough to protect data. Intrusion Detection Systems (IDS) are one way to secure the data on critical systems. Most research work addresses the effectiveness and exactness of intrusion detection, but these attempts detect intrusions at the operating-system and network level only; they are unable to detect unexpected system behaviour caused by malicious transactions in databases. The method used for spotting interference with information held in a database is known as database intrusion detection. It relies on recording the execution of transactions: if a recognized pattern deviates from the regular patterns, it is considered an intrusion. The identified problem with this process is that the pattern-mining algorithm may not identify all patterns, which can cause failures in two ways: 1) regular patterns missing from the database, and 2) the detection process neglecting some new patterns. We therefore propose a sequential data mining method using a new Modified Apriori Algorithm, which increases the accuracy and rate of pattern detection. The Apriori algorithm with modifications is used in the proposed model.
Keywords — Anomaly Detection, Modified Apriori Algorithm, Misuse detection, Sequential Pattern Mining
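The frequent-pattern mining underlying such an approach can be illustrated with a plain (unmodified) Apriori sketch over toy database-operation transactions; the paper's specific modifications are not reproduced here.

```python
def apriori(transactions, min_support=2):
    """Classic Apriori: grow frequent itemsets level by level, pruning by support."""
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    items = {frozenset([i]) for t in transactions for i in t}
    frequent = {s for s in items if support(s) >= min_support}
    result = {}
    k = 1
    while frequent:
        result.update({s: support(s) for s in frequent})
        # Candidate generation: join frequent k-itemsets into (k+1)-itemsets.
        candidates = {a | b for a in frequent for b in frequent if len(a | b) == k + 1}
        frequent = {c for c in candidates if support(c) >= min_support}
        k += 1
    return result

# Toy transaction log of database operations per session.
transactions = [frozenset(t) for t in
                [{"read", "update"}, {"read", "update", "delete"},
                 {"read", "insert"}, {"read", "update"}]]
freq = apriori(transactions, min_support=2)
print(freq[frozenset({"read", "update"})])  # 3
```

Transactions whose operation sets fall outside the mined frequent patterns would be the candidates flagged as anomalous in a database-IDS setting.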
IRJET- Genetic Algorithm based Intrusion Detection-SurveyIRJET Journal
This document summarizes research on genetic algorithm based intrusion detection systems. It begins with an abstract describing using a two-step hybrid method combining binary classification and k-nearest neighbors (k-NN) algorithms to improve intrusion detection performance. It then reviews 10 related works on intrusion detection using various techniques including anomaly detection, neural networks, clustering, rule-based approaches, and ensemble methods. The document aims to develop a framework that can process network connections and identify intrusion risks using different classifiers and aggregation.
WLI-FCM and Artificial Neural Network Based Cloud Intrusion Detection SystemEswar Publications
Security and performance are the major aspects that have to be addressed in cloud computing, and intrusion is one basic and imperative security problem. Consequently, it is essential to create an Intrusion Detection System (IDS) that detects both inside and outside attacks with high detection precision in a cloud environment. In this paper, a cloud intrusion detection system at the hypervisor layer is developed and assessed to detect malicious activities in the cloud computing environment. The cloud intrusion detection system uses a hybrid algorithm, a fusion of the WLI-FCM clustering algorithm and a back-propagation artificial neural network, to improve detection accuracy. The proposed system is implemented and compared with K-means and classic FCM, using DARPA's KDD Cup 1999 dataset for simulation. From the detailed performance analysis, it is clear that the proposed system is able to detect anomalies with high detection accuracy and a low false-alarm rate.
Enhanced Authentication in Wireless Sensor Networks for Effective Lifetime En...Eswar Publications
Wireless sensor networks are self-organized, autonomous, highly scalable, reliable, infrastructure-less services with automatic discovery of services, applicable mainly in fields such as disaster response and healthcare. Secure dissemination of code updates in a wireless sensor network is a challenging task. Software updates in wireless sensor networks have received significant attention recently, as they extend the lifetime of the network. Because of the large number of nodes, it is impractical to update the software manually, given the scale of such deployments and the physical inaccessibility of certain sensor nodes. For certain WSN applications, securing the process of remote reprogramming is essential. For instance, code updates in military applications must be authenticated to avoid the download of malicious code into deployed sensor nodes. Likewise, applications that require privacy and anonymity should not admit code updates that could reprogram the WSN to snoop on targets without authorization. Given the assumption that more bytes transmitted indirectly means more energy wasted in dissemination, energy efficiency is achieved by reducing the rate of overhead hash bytes per page.
Machine Learning Techniques Used for the Detection and Analysis of Modern Typ...IRJET Journal
This document summarizes research on using machine learning techniques to detect distributed denial-of-service (DDoS) attacks. It discusses how DDoS attacks have become more sophisticated over time. The document examines using naïve Bayes, multilayer perceptron, and other machine learning classifiers on a new dataset containing modern DDoS attack types like HTTP floods and SQL injection DDoS. According to the experimental results, multilayer perceptron achieved the highest accuracy for detecting these modern DDoS attacks.
“ALERT SYSTEM FOR NEW USER TO CREATE SAFE AREA USING BLOCK CHAIN”IRJET Journal
This document summarizes a research paper on developing an alert system to create safe areas using blockchain technology. The system aims to provide security for both crime data and humans by encrypting crime reports on the blockchain to prevent hackers from accessing or modifying the data. It involves building a registration and login system for both new users and the crime branch. The crime branch would upload area-wise crime reports that are securely stored on the blockchain network. New users could then access reports for a given area to stay informed on local crime. The system aims to securely store data, reduce paperwork, allow data access from anywhere, lower administrative burdens, and help prevent cybercrime through blockchain data encryption and security.
A Survey on Privacy-Preserving Data Aggregation Without Secure ChannelIRJET Journal
This document summarizes a research paper on privacy-preserving data aggregation without a secure channel. It discusses two models for aggregating private data from multiple participants: one with an external aggregator and one where participants calculate the aggregation jointly. The paper proposes protocols for the aggregator or participants to calculate the sum and product of the private data in a way that preserves the privacy of each participant's data, without requiring secure pairwise channels between participants. The protocols are based on the computational hardness of solving certain cryptographic problems like the discrete logarithm problem.
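The joint-calculation model can be illustrated with a simple additive secret-sharing sum. Note that this sketch assumes shares can be delivered to the intended participant, whereas the surveyed protocols achieve privacy without secure pairwise channels by relying on cryptographic hardness assumptions such as the discrete logarithm problem.

```python
import random

P = 2**31 - 1  # public modulus; all arithmetic is mod P

def make_shares(value, n):
    """Split value into n additive shares: any n-1 of them look uniformly random."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def private_sum(values):
    n = len(values)
    # Step 1: participant i sends its j-th share to participant j.
    shares = [make_shares(v, n) for v in values]
    # Step 2: each participant publishes the sum of the shares it received.
    partials = [sum(shares[i][j] for i in range(n)) % P for j in range(n)]
    # Step 3: anyone can recover the total from the public partial sums.
    return sum(partials) % P

print(private_sum([12, 7, 30]))  # 49
```

No single participant or aggregator ever sees another's raw value, only shares and partial sums, yet the published partials always reconstruct the exact total.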
Cybercrime is increasing at a fast pace and sometimes causes billions of dollars of business losses, so investigating attackers after the fact is of the utmost importance and has become one of the main concerns of network managers. Network forensics, the process of collecting, identifying, extracting, and analyzing data while systematically monitoring network traffic, is one of the main requirements for detecting and tracking criminals. In this paper, we propose an architecture for a network forensic system. Our proposed architecture consists of five main components: collection and indexing, database management, the analysis component, the SOC communication component, and the database.
The main difference between our proposed architecture and other systems lies in the analysis component, which is composed of four parts: the analysis and investigation subsystem, the reporting subsystem, the alert and visualization subsystem, and the malware analysis subsystem. The most important factors differentiating the proposed system from existing systems are: clustering and ranking of malware, dynamic analysis of malware, collection and analysis of network flows, and anomalous-behaviour analysis.
IRJET - Network Traffic Monitoring and Botnet Detection using K-ANN AlgorithmIRJET Journal
This document discusses a system for network traffic monitoring and botnet detection using the K-ANN (Kohonen Artificial Neural Network) algorithm. The system aims to address challenges in analyzing large amounts of network traffic data in real-time to accurately detect abnormal network activities and security threats. It involves collecting network packets using packet capture libraries, filtering the packets using K-ANN to detect botnet behavior, and notifying administrators of any detected threats. The system architecture and K-ANN algorithm are described. Results show the system was able to detect a SYN flood attack based on analyzing packet attributes. Future work will involve detecting additional types of network threats.
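The SYN-flood signal that the system detects can be illustrated with a simple rule-based sketch. This is not the K-ANN classifier: the thresholds and the synthetic packet trace are assumptions, and the point is only the underlying signal, namely a source whose SYNs vastly outnumber its completed handshakes.

```python
from collections import defaultdict

def detect_syn_flood(packets, ratio_threshold=3.0, min_syns=10):
    """Flag source IPs whose bare SYNs vastly outnumber their ACKs."""
    syns = defaultdict(int)
    acks = defaultdict(int)
    for src, flags in packets:
        if "SYN" in flags and "ACK" not in flags:
            syns[src] += 1
        elif "ACK" in flags:
            acks[src] += 1
    return [src for src in syns
            if syns[src] >= min_syns and syns[src] > ratio_threshold * (acks[src] + 1)]

# Synthetic capture: a normal client completes handshakes, the attacker only sends SYNs.
packets = ([("10.0.0.5", {"SYN"})] * 100 +
           [("192.168.1.2", {"SYN"}), ("192.168.1.2", {"ACK"})] * 5)
print(detect_syn_flood(packets))  # ['10.0.0.5']
```

A learned model such as the K-ANN replaces these hand-picked thresholds with boundaries fitted to many packet attributes at once, but the half-open-connection imbalance remains the tell-tale feature.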
Intrusion Detection Systems By Anamoly-Based Using Neural NetworkIOSR Journals
As the size and importance of networks increase day by day, different steps have been taken to improve network security, because the chances of network attacks also increase. Networks are mainly attacked by intrusions, which are identified by a network intrusion detection system. These intrusions are mainly present in data packets, and each packet has to be scanned for detection. This paper develops an intrusion detection system that utilizes the identity and signature of an intrusion to identify different kinds of intrusions. A network intrusion detection system needs to be efficient enough that the chance of false alarm generation is low, i.e. identifying something as an intrusion when it actually is not. The results obtained after analyzing this system are good: nearly 90% of the alarms generated are true alarms. It detects intrusions for various services such as DoS and SSH using a neural network.
IRJET- Privacy Enhancing Routing Algorithm using Backbone Flooding SchemesIRJET Journal
The document proposes a novel privacy enhancing routing algorithm called Optimal Privacy Enhancing Routing Algorithm (OPERA) for wireless networks. OPERA uses a statistical game-theoretic framework to optimize privacy given a utility function. It considers a global adversary that can observe transmissions in the whole network. OPERA formulates the privacy-utility tradeoff problem as a linear program that can be efficiently solved. Simulation results show that OPERA reduces the adversary's identification probability compared to random and greedy heuristics and the information-theoretic mutual information approach. The algorithm provides improved privacy protection while balancing overhead in wireless networks.
IRJET- A Review of the Concept of Smart GridIRJET Journal
A Comparative Study for Source Privacy Preserving and Message Authentication ...AM Publications
Source-node privacy and message authentication are among the most important issues to be addressed in wireless sensor networks. Many schemes have been proposed to deal with message authentication; however, some of them suffer from limitations such as a lack of scalability and high communication and computational overhead. These issues were later addressed by a polynomial-based scheme, which nevertheless fails to transmit messages beyond its threshold. ECC and RSA algorithms have been used to overcome that limitation. To fix all of these issues, this paper proposes source-node-privacy-based message authentication using a greedy random-walk algorithm. A comparative study is done for the work, which is implemented using ns2 and MATLAB.
Similar to Outlier Detection in Secure Shell Honeypot using Particle Swarm Optimization Technique
Content-Based Image Retrieval (CBIR) systems have been used to search for relevant images in various research areas. CBIR systems use features such as shape, texture, and color, and feature extraction is the main step on which the retrieval results depend. Color features in CBIR take forms such as the color histogram, color moments, and the conventional color correlogram; color-space selection is used to represent the color information of the pixels of the query image. Shape is the basic characteristic of the segmented regions of an image, and different methods have been introduced for better retrieval using different shape-representation techniques: earlier work used global shape representations, but over time the field moved towards local shape representations, which relate more to expressing the result than to the method. Local shape features may be derived from texture properties and color derivatives. Texture features have been used for document images, segmentation-based recognition, and satellite images, and appear in different CBIR systems along with color, shape, geometrical-structure, and SIFT features.
This document discusses clickjacking attacks, which hijack users' clicks to perform unintended actions. It provides an overview of clickjacking, describes different types of attacks, and analyzes vulnerabilities that make websites susceptible. Experiments are conducted on a sample social networking site, applying various clickjacking techniques. Potential defenses are tested, including X-Frame-Options headers and frame busting code. A proposed solution detects transparent iframes to warn users and check for hidden mouse pointers to mitigate cursorjacking. Analysis of top Jammu and Kashmir websites found most were vulnerable, while browser behavior studies showed varying support for defenses.
Performance Analysis of Audio and Video Synchronization using Spreaded Code D...Eswar Publications
Audio and video synchronization plays an important role in speech recognition and multimedia communication. Audio-video sync is a significant problem in live video conferencing, caused by hardware components that introduce variable delay and by the software environment. The objective of synchronization is to preserve the temporal alignment between the audio and video signals. This paper proposes audio-video synchronization using a spreading-code delay measurement technique. Evaluated on a home database, the proposed method achieves 99% synchronization efficiency. The audio-visual signature technique provides a significant reduction in audio-video sync problems and enables effective performance analysis of audio and video synchronization. The paper also implements an audio-video synchronizer and analyzes its performance in terms of synchronization efficiency, audio-video time drift, and audio-video delay. Simulations are carried out using MATLAB and Simulink; the system automatically estimates and corrects the timing relationship between the audio and video signals while maintaining quality of service.
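The time-drift estimation can be illustrated in simplified form. The sketch below is not the paper's spreading-code technique; it only shows the underlying idea of finding the lag that maximizes the cross-correlation between two one-dimensional signature sequences extracted from the audio and video streams.

```python
def estimate_delay(audio_sig, video_sig, max_lag=10):
    """Return the lag (in frames) that best aligns two signature
    sequences by maximizing their cross-correlation score."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i, a in enumerate(audio_sig):
            j = i + lag
            if 0 <= j < len(video_sig):
                score += a * video_sig[j]  # dot product at this offset
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

A positive lag means the video signature trails the audio signature; a synchronizer would then delay the audio by that many frames.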
Due to the availability of complicated devices in industry, models are being developed for consumers at a lower resource cost. Home automation systems have been developed by several researchers, but their limitations include architectural complexity, high equipment costs, and inflexible interfaces. This paper proposes a working protocol built on the PIC 16F72 microcontroller that is secure, cost-efficient, and flexible, leading to an efficient home automation system. The system can control various home appliances such as fans, bulbs, and tube lights. The paper describes the components used, how they are connected, and how they operate. The home automation system uses an Android app entitled "Home App", which provides flexibility and an easy-to-use GUI.
Semantically Enchanced Personalised Adaptive E-Learning for General and Dysle...Eswar Publications
E-learning plays an important role in providing required and well-formed knowledge to a learner. The medium of e-learning has advanced in various directions, such as adaptive e-learning systems, and enhancing e-learning semantically can improve the retrieval and adaptability of the learning curriculum. This paper presents a semantically enhanced, module-based e-learning system for a computer science programme from a learner-centric perspective. Learners are categorized by proficiency to provide a personalized learning environment. Learning disorders on e-learning platforms still require substantial research; therefore, the paper also provides a personalized assessment theoretical model for alphabet learning with learning objects for children who face dyslexia.
Agriculture plays an important role in the economy of our country: over 58 percent of rural households depend on the agriculture sector as their means of livelihood, and agriculture is one of the major contributors to Gross Domestic Product (GDP). Seeds are the soul of agriculture. This application reduces the time researchers and farmers need to determine seedling parameters. It tells farmers what percentage of seedlings will grow, which is essential for estimating the yield of a particular crop; manual calculation may introduce errors, and the developed app minimizes them. Scientists and farmers can use the app to obtain physiological seed quality parameters and to make decisions about their farming activities. In this article, a desktop app for calculating seed germination percentage and vigour index is developed in the PHP scripting language.
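Assuming the app uses the standard definitions (germination percentage as the share of seeds that germinate, and a vigour index as germination percentage times mean seedling length, in the style of Abdul-Baki and Anderson), the calculations can be sketched as follows; the paper implements them in PHP, but the arithmetic is the same.

```python
def germination_percentage(germinated, total):
    """Share of seeds that germinated, as a percentage."""
    return 100.0 * germinated / total

def vigour_index(germinated, total, mean_seedling_length_cm):
    """Seedling vigour index: germination percentage multiplied by
    mean seedling length (assumed formula, Abdul-Baki & Anderson style)."""
    return germination_percentage(germinated, total) * mean_seedling_length_cm
```

For example, 80 germinated seeds out of 100 with a mean seedling length of 12.5 cm gives a germination percentage of 80% and a vigour index of 1000.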
What happens when adaptive video streaming players compete in time-varying ba...Eswar Publications
Competition among adaptive video streaming players severely diminishes user-QoE. When players compete at a bottleneck link many do not obtain adequate resources. This imbalance eventually causes ill effects such as screen flickering and video stalling. There have been many attempts in recent years to overcome some of these problems. However, added to the competition at the bottleneck link there is also the possibility of varying network bandwidth which can make the situation even worse. This work focuses on such a situation. It evaluates current heuristic adaptive video players at a bottleneck link with time-varying bandwidth conditions. Experimental setup includes the TAPAS player and emulated network conditions. The results show PANDA outperforms FESTIVE, ELASTIC and the Conventional players.
Spreading Trade Union Activities through Cyberspace: A Case StudyEswar Publications
This report presents the outcome of investigative research conducted to examine the modus operandi of the Academic Staff Union of Polytechnics (ASUP), YabaTech. The investigation covered the logistics and cost implications of spreading union activities among members. It was discovered that the cost of managing and disseminating information to members was high, and that logistics problems caused loss of information in transit, cutting some members off from union activities. To curtail the problems identified, we proposed the design of a secure and dynamic website for spreading union activities among members and the public. The proposed system was implemented using HTML5, with interface frameworks such as Bootstrap and jQuery enabling the responsive features of the application interface; the backend was designed using PHP and MySQL. Evaluation of the new system showed that the cost of managing information was reduced considerably and that the logistics problems identified in the old system were eliminated.
Identifying an Appropriate Model for Information Systems Integration in the O...Eswar Publications
Nowadays, organizations use information systems to optimize processes and increase coordination and interoperability. Since the oil and gas industry is one of the largest industries in the world, its information systems (IS), which comprise three categories (field IS, plant IS, and enterprise IS), must be made compatible to achieve interoperability and, as a result, optimized processes. In this paper we introduce different models of information systems integration, identify the types of information systems used in the upstream and downstream sectors of the petroleum industry, and finally, based on experts' opinions, identify a suitable model for information systems integration in this industry.
Link-and Node-Disjoint Evaluation of the Ad Hoc on Demand Multi-path Distance...Eswar Publications
This work illustrates the AOMDV routing protocol, along with its ancestor, the AODV routing protocol. The tutorial demonstrates how AOMDV creates forward and reverse paths, and describes the formulation of loop-free paths as well as node- and link-disjoint paths. Finally, the performance of AOMDV is investigated along link- and node-disjoint paths: in terms of energy consumption, a WSN running AOMDV with link-disjoint paths outperforms one running AOMDV with node-disjoint paths.
Bridging Centrality: Identifying Bridging Nodes in Transportation NetworkEswar Publications
Several centrality measures are used to identify the importance of a node in a network, but the majority are dominated by node degree because they consider only the network's topology. We propose a node identification centrality, bridging centrality, based on both information flow and topological aspects. We apply bridging centrality to real-world networks, including a transportation network, and show that the nodes it distinguishes are well located at the connecting positions between highly connected regions. Bridging centrality can discriminate bridging nodes, i.e., nodes with more information flowing through them that sit between highly connected regions, while other centrality measures cannot.
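A common formulation in the literature defines bridging centrality as a node's betweenness centrality multiplied by a bridging coefficient (the node's inverse degree normalized by the sum of its neighbours' inverse degrees); whether this abstract uses exactly that definition is an assumption. A stdlib-only sketch for unweighted, undirected graphs:

```python
from collections import deque

def betweenness(graph):
    """Brandes' algorithm for node betweenness centrality (unweighted)."""
    cb = {v: 0.0 for v in graph}
    for s in graph:
        stack, pred = [], {v: [] for v in graph}
        sigma = {v: 0 for v in graph}; sigma[s] = 1
        dist = {v: -1 for v in graph}; dist[s] = 0
        q = deque([s])
        while q:                      # BFS from s, counting shortest paths
            v = q.popleft(); stack.append(v)
            for w in graph[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; pred[w].append(v)
        delta = {v: 0.0 for v in graph}
        while stack:                  # back-propagate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                cb[w] += delta[w]
    # each undirected pair was counted from both endpoints
    return {v: c / 2 for v, c in cb.items()}

def bridging_coefficient(graph, v):
    """Inverse degree of v over the sum of its neighbours' inverse degrees."""
    return (1 / len(graph[v])) / sum(1 / len(graph[u]) for u in graph[v])

def bridging_centrality(graph):
    bc = betweenness(graph)
    return {v: bc[v] * bridging_coefficient(graph, v) for v in graph}
```

On a graph of two dense clusters joined by a single connector node, the connector scores highest, matching the intuition of a bridging node between highly connected regions.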
Nowadays we are living in an era of information technology in which almost every person is an IT user, intentionally or not. Technology has played a vital role in our day-to-day lives for the last few decades, and we all depend on it for benefit and comfort. The latest advent of this era is the Internet of Things (IoT), a domain covering real-world scenarios in which each object can perform tasks while communicating with other objects: a world full of devices, sensors, and other objects that communicate and make human life better and easier than ever. This paper provides an overview of current IoT research in terms of architecture, technologies used, and applications, and, after a literature review, highlights the issues related to the technologies used for IoT. The main purpose of this survey is to present the latest technologies and their corresponding trends in the IoT field in a systematic manner, as a basis for further research.
Automatic Monitoring of Soil Moisture and Controlling of Irrigation SystemEswar Publications
In the past couple of decades, agricultural technology has grown rapidly. Drip irrigation, used properly, is an economical and efficient method, but many of the drip irrigation systems proposed so far have proven expensive and cumbersome to use. In a conventional drip irrigation system the farmer has to watch the irrigation schedule, which differs from crop to crop. Remotely monitored embedded irrigation systems have therefore become essential, saving the farmer energy, time, and money by irrigating only when water is actually required. In this approach, soil-test data on chemical constituents, water content, salinity, and fertilizer requirements are collected wirelessly and processed to produce a better drip irrigation plan. This paper reviews different monitoring systems and proposes an automatic monitoring system model using a Wireless Sensor Network (WSN) that helps the farmer improve yield.
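The control loop such a system typically runs can be sketched as a simple threshold rule with hysteresis, so the pump does not chatter when readings hover near a single cutoff. The moisture thresholds below are illustrative assumptions, not values from the paper.

```python
class IrrigationController:
    """Toggle the pump with hysteresis: turn on below `low`,
    off above `high`, and hold state in between."""

    def __init__(self, low=30.0, high=45.0):
        self.low, self.high = low, high   # soil-moisture %, assumed thresholds
        self.pump_on = False

    def update(self, moisture):
        """Feed in the latest sensor reading; returns pump state."""
        if moisture < self.low:
            self.pump_on = True
        elif moisture > self.high:
            self.pump_on = False
        return self.pump_on
```

In a WSN deployment, `update` would be called with each reading reported by the field sensor nodes, and the returned state would drive the valve actuator.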
Multi- Level Data Security Model for Big Data on Public Cloud: A New ModelEswar Publications
With the advent of cloud computing, big data has emerged as a crucial technology. Some clouds provide consumers with free services such as storage and computational power. This paper makes use of Infrastructure as a Service, where storage from public cloud providers is leveraged by an individual or organization, and emphasizes a model that anyone can use at no cost. Users can store confidential data without security concerns, because the data is altered in such a way that an intruder cannot understand it, yet the user can retrieve the original data in no time. The proposed security model aims to provide robust security effectively and efficiently, both while data resides on the cloud infrastructure and while it is being migrated to or from it.
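The paper does not specify how the data is altered. As one purely illustrative sketch of a reversible transformation (not the authors' model, and not a substitute for vetted cryptography such as AES), data can be XOR-ed with a key-derived stream before upload, so that applying the same transform on download restores the original.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from key by chained SHA-256.
    Illustrative only; real systems should use a standard cipher."""
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def transform(data: bytes, key: bytes) -> bytes:
    """XOR with a key-derived stream; applying it twice restores the data."""
    ks = keystream(key, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))
```

The same function serves for both directions: `transform(transform(data, key), key)` yields the original bytes, which matches the paper's requirement of quick retrieval.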
Impact of Technology on E-Banking; Cameroon PerspectivesEswar Publications
The financial services industry is experiencing rapid change in service delivery and channel usage. Financial companies and users of financial services are watching new technologies as they emerge and deciding whether to embrace them and the opportunities they bring to save and manage enormous amounts of time, cost, and stress.
There is no doubt about the favourable and manifold impact of technology on e-banking, as pictured in this review paper: almost all banks offer at least the most accessible e-banking technological equipment, such as ATMs and cards. On the other hand, cheap and readily available technology has opened favourable competition in the e-banking services business, with a wide range of competitors challenging commercial banks in Cameroon in providing digital financial services.
Classification Algorithms with Attribute Selection: an evaluation study using...Eswar Publications
Attribute or feature selection plays an important role in the process of data mining. In general, a data set contains many attributes, but not all of them are relevant for effective classification.
Attribute selection is a technique used to rank attributes. This paper therefore presents a comparative evaluation of classification algorithms before and after attribute selection using the Waikato Environment for Knowledge Analysis (WEKA). The study concludes that the performance metrics of the classification algorithms improve after attribute selection, which also reduces the work of processing irrelevant attributes.
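One criterion WEKA offers for ranking attributes is information gain (its InfoGainAttributeEval evaluator); the minimal pure-Python sketch below illustrates that criterion, with the row layout (one dict per instance) as an assumption for the example.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting the data on one attribute."""
    base, n = entropy(labels), len(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    return base - sum(len(g) / n * entropy(g) for g in groups.values())

def rank_attributes(rows, labels):
    """Rank attributes best-first by information gain."""
    return sorted(rows[0], key=lambda a: information_gain(rows, labels, a),
                  reverse=True)
```

An attribute that perfectly separates the classes gets the maximum gain and ranks first; a constant attribute gets zero gain and can be dropped before classification.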
Mining Frequent Patterns and Associations from the Smart meters using Bayesia...Eswar Publications
In today’s world, migration from rural to urban areas is common, and health care services are among the most challenging needs for people in poor health. Advances in technology have led to smart homes containing sensor or smart meter devices that automate other electronic devices. These smart meters can also capture the daily activities of patients and monitor their health conditions by mining the frequent patterns and association rules generated from the meter data. In this work we propose a model that monitors the activities of patients at home and sends the daily activities to the corresponding doctor. We extract frequent patterns and association rules from the log data, predict the health condition of the patient, and give suggestions according to the prediction. Our work is divided into three stages. First, we record the daily activities of the patient over a specific time period at three regular intervals. Second, we apply frequent-pattern growth to extract association rules from the log file. Finally, we apply k-means clustering to the input and a Bayesian network model to predict the health behaviour of the patient, with precautions given accordingly.
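The pattern-mining stage can be illustrated with a deliberately naive sketch. The paper uses FP-growth, which is far more efficient; the brute-force version below produces the same frequent itemsets and confidence-based association rules on small activity logs, and the example transactions are invented for illustration.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Naive frequent-itemset mining: count every candidate subset.
    `transactions` is a list of sets of activity labels."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    freq = {}
    for k in range(1, len(items) + 1):
        found = False
        for cand in combinations(items, k):
            support = sum(1 for t in transactions if set(cand) <= t) / n
            if support >= min_support:
                freq[cand] = support
                found = True
        if not found:       # no frequent sets at this size: stop growing
            break
    return freq

def rules(freq, min_conf):
    """Association rules X -> Y with confidence = support(X∪Y)/support(X)."""
    out = []
    for itemset, sup in freq.items():
        if len(itemset) < 2:
            continue
        for i in range(1, len(itemset)):
            for lhs in combinations(itemset, i):
                conf = sup / freq[lhs]
                if conf >= min_conf:
                    rhs = tuple(x for x in itemset if x not in lhs)
                    out.append((lhs, rhs, conf))
    return out
```

A rule like ("tea",) -> ("wake",) with high confidence would mean the patient who makes tea almost always woke up first, and a day violating such habitual rules could be flagged for the doctor.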
Network as a Service Model in Cloud Authentication by HMAC AlgorithmEswar Publications
Cloud computing is the pay-per-use, Internet-based technology for resource pooling that now rules the IT field. Today every organization trusts the web, but while information must flow, the cloud should not expose the data it holds; the cloud must therefore protect information in transit with security protocols, guarding against third-party observation and against situations where packets flowing through, or stored in, a virtual private cloud become directly exposed. Global security statistics for 2017 put the rate of sensitive-information hacking in the cloud at roughly 75.35%, and security analysts warn that this figure could approach 100%. For this reason, the proposed research work concentrates on an authentication message-digest key for authenticating the routing of Network-as-a-Service packets with OSPF (Open Shortest Path First), implemented in a cloud and tested with GNS3 to secure it from attackers.
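A message-digest key scheme of the kind mentioned above can be sketched with Python's standard hmac module. This uses HMAC-SHA256 rather than OSPF's actual cryptographic authentication format, and the shared key and payload are placeholders; it shows only the general idea of appending a keyed digest that receivers verify.

```python
import hmac
import hashlib

SECRET = b"area0-shared-key"   # assumed shared routing key, for illustration

def sign_packet(payload: bytes, key: bytes = SECRET) -> bytes:
    """Append an HMAC-SHA256 digest so receivers can verify the sender."""
    return payload + hmac.new(key, payload, hashlib.sha256).digest()

def verify_packet(packet: bytes, key: bytes = SECRET):
    """Return the payload if the digest checks out, else None."""
    payload, digest = packet[:-32], packet[-32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    # compare_digest avoids timing side channels during verification
    return payload if hmac.compare_digest(digest, expected) else None
```

A router receiving an update without the correct key, or with a tampered payload, gets `None` and drops the packet, which is the property the proposed scheme relies on to keep attackers out of the routing exchange.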
Microstrip patch antennas are widely used in wireless detection applications due to their low power consumption, low cost, versatility, field excitation, and ease of fabrication. Microstrip patch antennas, also called printed antennas, suffer from narrow bandwidth and the limitations of antenna array elements. To overcome these drawbacks, flame-retardant FR-4 material is used as the substrate; a rectangular microstrip patch antenna (RMPA) with an FR-4 substrate is well suited to explosive detection applications. The proposed printed antenna was designed with dimensions of 60 x 60 mm2. The FR-4 material has a dielectric constant of 4.3, a thickness of 1.56 mm, and a length and width of 60 mm each. One side of the substrate carries the copper ground plane of 60 x 60 mm2; the other side carries the copper patch of 34 x 29 mm2 with a thickness of 0.03 mm. RMPA structures without a slot and with a vertical slot, a double horizontal slot, and a centre slot were designed, and the performance of the antennas was analysed using parameters such as gain, directivity, E-field, VSWR, and return loss. The analysis shows that the double horizontal slot RMPA provides the best result, with maximum gain (8.61 dB) and minimum return loss (-33.918 dB). Based on the E-field excitation value, the SEMTEX explosive material is detected; the design was simulated using CST software.
Bandwidth Estimation Techniques for Relative ‘Fair’ Sharing in DASHEswar Publications
In the adaptive video streaming (AVS) literature, the term fair sharing has been used to describe equal amounts of bandwidth allocated to adaptive client players. However, we argue that even though equal bandwidth sharing matters in some problems, it does not apply to AVS; the term relative 'fair' sharing is more applicable. The reason is that videos have different quality levels and require differing amounts of bandwidth to satisfy their needs. A 90%-to-10% split may be sufficient for two players, one with high demands and the other with low demands, whereas a 50% split may leave the player with the high bandwidth demand too little of the resource it needs. In addition, channel conditions may lead players to require different amounts of bandwidth; again, the concept of fair sharing has to be extended to relative 'fair' sharing for such scenarios. Hence, the bandwidth estimation techniques players use are very important in segment selection: a player uses one of many techniques to determine what share of the bandwidth it can utilize among competing players. In this paper we explore some of the techniques used in state-of-the-art players in their attempt to obtain a 'fair' share of the network bandwidth.
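One widely used estimation technique (employed, for example, by FESTIVE) is the harmonic mean of recent per-segment throughputs, which damps transient spikes more than an arithmetic mean would. The sketch below pairs it with a simple bitrate pick; the safety margin and the bitrate ladder are assumed values, not figures from any particular player.

```python
def harmonic_mean_throughput(samples):
    """Harmonic mean of recent per-segment throughput samples (kbps).
    A single fast segment cannot drag the estimate up much."""
    return len(samples) / sum(1.0 / s for s in samples)

def pick_bitrate(samples, ladder, safety=0.85):
    """Choose the highest ladder rung not exceeding a safety
    fraction of the estimated throughput."""
    budget = safety * harmonic_mean_throughput(samples)
    feasible = [b for b in ladder if b <= budget]
    return max(feasible) if feasible else min(ladder)
```

With samples of 4000, 4000, and 1000 kbps, the harmonic mean is 2000 kbps, so with an 85% safety margin the player selects 1600 kbps from a 400/800/1600/3200 ladder; an arithmetic mean (3000 kbps) would have risked the 3200 rung and a stall.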
Taking AI to the Next Level in Manufacturing.pdfssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
5. Ideas and approaches to help build your organization's AI strategy.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar, with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able to lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
AppSec PNW: Android and iOS Application Security with MobSFAjin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
Introduction of Cybersecurity with OSS at Code Europe 2024Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor IvaniukFwdays
At this talk we will discuss DDoS protection tools and best practices, network architectures, and what AWS has to offer. We will also look into one of the largest DDoS attacks on Ukrainian infrastructure, which happened in February 2022, and see what techniques helped keep web resources available for Ukrainians and how AWS improved DDoS protection for all customers based on the Ukraine experience.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfChart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
What is an RPA CoE? Session 1 – CoE VisionDianaGray10
In the first session, we will review the organization's vision and how this has an impact on the COE Structure.
Topics covered:
• The role of a steering committee
• How do the organization’s priorities determine CoE Structure?
Speaker:
Chris Bolin, Senior Intelligent Automation Architect Anika Systems
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...Jason Yip
The typical problem in product engineering is not bad strategy, so much as “no strategy”. This leads to confusion, lack of motivation, and incoherent action. The next time you look for a strategy and find an empty space, instead of waiting for it to be filled, I will show you how to fill it in yourself. If you’re wrong, it forces a correction. If you’re right, it helps create focus. I’ll share how I’ve approached this in the past, both what works and lessons for what didn’t work so well.
Digital Banking in the Cloud: How Citizens Bank Unlocked Their MainframePrecisely
Inconsistent user experience and siloed data, high costs, and changing customer expectations – Citizens Bank was experiencing these challenges while it was attempting to deliver a superior digital banking experience for its clients. Its core banking applications run on the mainframe and Citizens was using legacy utilities to get the critical mainframe data to feed customer-facing channels, like call centers, web, and mobile. Ultimately, this led to higher operating costs (MIPS), delayed response times, and longer time to market.
Ever-changing customer expectations demand more modern digital experiences, and the bank needed to find a solution that could provide real-time data to its customer channels with low latency and operating costs. Join this session to learn how Citizens is leveraging Precisely to replicate mainframe data to its customer channels and deliver on their “modern digital bank” experiences.
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
5th LF Energy Power Grid Model Meet-up SlidesDanBrown980551
5th Power Grid Model Meet-up
It is with great pleasure that we extend to you an invitation to the 5th Power Grid Model Meet-up, scheduled for 6th June 2024. This event will adopt a hybrid format, allowing participants to join us either through an online Microsoft Teams session or in person at TU/e, located at Den Dolech 2, Eindhoven, Netherlands. The meet-up will be hosted by Eindhoven University of Technology (TU/e), a research university specializing in engineering science & technology.
Power Grid Model
The global energy transition is placing new and unprecedented demands on Distribution System Operators (DSOs). Alongside upgrades to grid capacity, processes such as digitization, capacity optimization, and congestion management are becoming vital for delivering reliable services.
Power Grid Model is an open source project from Linux Foundation Energy and provides a calculation engine that is increasingly essential for DSOs. It offers a standards-based foundation enabling real-time power systems analysis, simulations of electrical power grids, and sophisticated what-if analysis. In addition, it enables in-depth studies and analysis of the electrical power grid’s behavior and performance. This comprehensive model incorporates essential factors such as power generation capacity, electrical losses, voltage levels, power flows, and system stability.
Power Grid Model is currently being applied in a wide variety of use cases, including grid planning, expansion, reliability, and congestion studies. It can also help in analyzing the impact of renewable energy integration, assessing the effects of disturbances or faults, and developing strategies for grid control and optimization.
What to expect
For the upcoming meetup we are organizing, we have an exciting lineup of activities planned:
-Insightful presentations covering two practical applications of the Power Grid Model.
-An update on the latest advancements in Power Grid -Model technology during the first and second quarters of 2024.
-An interactive brainstorming session to discuss and propose new feature requests.
-An opportunity to connect with fellow Power Grid Model enthusiasts and users.
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
Int. J. Advanced Networking and Applications, Volume: 09, Issue: 03, Pages: 3443-3450 (2017), ISSN: 0975-0290
Outlier Detection in Secure Shell Honeypot using Particle Swarm Optimization Technique
M. Sithara
Department of Computer Science, SRMV College of Arts and Science, Coimbatore-20
Email: sitharamohammed@gmail.com
M. Chandran
Department of Computer Applications, SRMV College of Arts and Science, Coimbatore-20
Email: onchandran@gmail.com
G. Padmavathi
Department of Computer Science, Avinashilingam Institute for Home Science and Higher Education for Women University, Coimbatore-43
Email: ganapathi.padmavathi@gmail.com
ABSTRACT

With current trends and technologies, developments and deployments, network communication has become vital and inevitable for human beings. At the same time, network communication without security is powerless. Many technologies have been developed to provide secure and efficient means of communication over networks. In parallel, network threats and attacks have also grown trendy and highly technologized. To detect such threats and attacks, this research work proposes honeypot technology. A honeypot is a supplemented active defense system for network security: it traps attacks, records intrusion information about the tools and activities of the hacking process, and prevents attacks outbound from the compromised system. This research work implements a kind of honeypot called a Secure Shell (SSH) honeypot. SSH provides a secure communication channel which allows users to remotely control computer systems. With the SSH honeypot implemented, this research work collects the incoming and outgoing traffic data in a network. The collected traffic data can then be analyzed to detect outliers in order to find abnormal or malicious traffic. This research work detects outliers in the collected SSH honeypot data using the Particle Swarm Optimization technique, which belongs to the category of clustering-based outlier detection methods. Experiments and results show that Particle Swarm Optimization performs best at detecting outliers and has the best cost function when compared to other clustering-based algorithms such as the Genetic Algorithm and the Differential Evolution algorithm.

Keywords - Differential Evolution, Genetic Algorithm, Honeypots, Particle Swarm Optimization, Secure Shell

Date of Submission: Nov 13, 2017    Date of Acceptance: Nov 24, 2017
1. INTRODUCTION
A honeypot is a supplemented active defense system for network security. It traps attacks, records intrusion information about the tools and activities of the hacking process, and prevents attacks outbound from the compromised system [1]. It appears to be an ordinary working system, but all of its data and transactions are phony. Honeypots actively give way to attackers in order to gain information about new intrusions. Honeypots can be classified into three types based on their level of interaction: low-interaction, medium-interaction and high-interaction honeypots [3]. This research work implements a Secure Shell (SSH) honeypot, which belongs to both the low- and medium-interaction categories.

Secure Shell (SSH) is a protocol for secure remote login and secure network services over an insecure network, using a client/server architecture. It provides a secure communication channel which allows users to remotely control computer systems [4]. Remote monitoring can be accomplished by deploying SSH on a remote server system, which can then be monitored from an SSH client system. The data traffic, data transfers, logs, failures, transfer speed, network performance and other criteria on the SSH server can be monitored remotely through the SSH client. Once the SSH protocol is deployed on the server system, the SSH client can take over control of the SSH server. The idea of deploying honeypots is to lure attackers and to log the network data: the SSH server logs all incoming and outgoing network traffic, and this traffic can be monitored and controlled through the SSH client system. Authentication of the data log and data transmission between the SSH client and server follows one of three main methods: passwords, keys, or a hybrid of the two. The scenario deployed in this research work is authenticated through a password, and the SSH server plays the role of the honeypot by logging all incoming and outgoing network traffic. Once the SSH client/server connection is established and authenticated, the network traffic is logged on the SSH server and can then be analyzed for detection of outliers.
An outlier is a data point which is very different from the rest of the data points based on some measure. Such a data point often contains useful information about abnormal behavior of the system described by the data. Abnormal behavior in the network includes malicious traffic, malicious data, extrusions and intrusions such as malware. Outlier detection methods try to find this anomalous traffic among the normal traffic data. There are several families of outlier detection methods: statistical-based, distance-based, deviation-based, distribution-based, depth-based, density-based and clustering-based. This research work detects outliers in Secure Shell (SSH) honeypot data using clustering-based outlier detection methods, namely the Particle Swarm Optimization (PSO), Genetic Algorithm (GA) and Differential Evolution (DE) algorithms. The results of Particle Swarm Optimization are then compared with those of the Genetic Algorithm and the Differential Evolution algorithm.

This section discussed the concepts required for the study. Section 2 reviews the literature on honeypots, honeypot mechanisms, outlier detection techniques and clustering techniques. Section 3 explains the implementation of the proposed methodology. Section 4 discusses the results and outcomes of the experiments, and Section 5 concludes the research with future work.
2. RELATED WORKS
A number of security schemes have been proposed and implemented in the literature. Honeypots provide security to the network by logging traffic data; a honeypot acts as a normal user system and gives no indication of its presence to attackers. Recent works in honeypot technology use different schemes for implementation and different methods for outlier detection. Some of them are listed below.
Feng Zhang et al. [1] proposed key components of data capture and data control in honeypots, and gave a classification of honeypots according to security and application goals. The paper introduces honeypots and related technologies from the viewpoint of network security management, and discusses detection methods, reaction methods, and data capture and storage methods. The deployment of virtual honeypots is also discussed.
Ioannis Koniaris et al. [2] presented two distinct types of honeypots. The first acts as a malware collector, a device usually deployed to capture self-propagating malware and monitor its activity. The second acts as a decoy server that logs every malicious connection attempt. They also demonstrated the use of honeypots for malware monitoring and attack logging.
Aaditya Jain et al. [3] discussed honeypot technology and its classification based on various factors: level of interaction, purpose and physical presence. Two honeypots were deployed, one for inbound traffic and one for outbound traffic. The paper also discusses SSH honeypots.
Shaik Bhanu et al. [4] presented the results of SSH honeypot operations that trapped attackers targeting the SSH service to gain illegal access. They collected and analyzed data from SSH honeypots, with a focus on brute-force and dictionary attacks. The data came from a large number of SSH attacks against a Virtual Private Server (VPS) that was set up as a honeypot to log all malicious activity.
Abdallah Ghourabi et al. [5] proposed a data analysis tool for a honeypot router based on data mining clustering. Useful features are extracted from the data captured by the honeypot router and clustered using the DBSCAN algorithm in order to classify the captured packets and extract suspicious data. Suspicious packets are then verified by a human expert.
Ren Hui Gong et al. [6] presented a genetic algorithm (GA) based approach to network intrusion detection. The genetic algorithm derives a set of classification rules from network audit data, with a fitness function used to judge the quality of each rule. The generated rules are then used to detect or classify network intrusions in a real-time environment.
Table 1. Comparison of Methods

| Authors | Techniques | Observations | Year Published |
|---|---|---|---|
| Feng Zhang et al. | Typical honeypot solutions are proposed and the deployment of virtual honeypots is discussed. | Honeypot is not a solution to network security but a good tool that supplements other security technologies. | 2003 |
| Ioannis Koniaris et al. | Presented an open source visualization tool developed to help security professionals analyze data. | Using honeypots with other defense systems such as firewalls and IPS can be a good step. | 2014 |
| Aaditya Jain et al. | Proposed honeypots that have multilayer data storing capacity with firewalls and help in analyzing vulnerable activities. | A strong control mechanism such as IPS is required. | 2015 |
| Shaik Bhanu et al. | Presented SSH as a medium-interaction honeypot, with the focus on detecting brute-force and dictionary attacks. | Detection results are very low in percentage. | 2014 |
| Abdallah Ghourabi et al. | Proposed a data analysis tool for a honeypot router based on the DBSCAN clustering algorithm. | A large number of false positives are generated. | 2015 |
| Ren Hui Gong et al. | Presented a genetic algorithm (GA) based approach to network intrusion detection. | The generated rules can sometimes be biased toward the training data set. | 2005 |
It is observed that the Secure Shell (SSH) honeypot logs all incoming and outgoing traffic, and clustering algorithms are applied to detect outliers. This work implements the PSO, GA and DE clustering techniques and compares their results based on metrics such as cost function, number of clusters and number of iterations.
3. METHODOLOGY
Different types of outlier detection mechanisms have been used and studied by researchers to detect abnormal traffic data. This proposed research work implements a clustering-based outlier detection method.
3.1. METHODOLOGY OVERVIEW
The implementation starts by creating the network in Cisco Packet Tracer 7.0. The incoming and outgoing traffic data of the SSH honeypot is then collected, and a subset of this data is created. The subset is clustered using the Genetic Algorithm (GA), Differential Evolution (DE) and Particle Swarm Optimization (PSO) algorithms in MATLAB. Finally, the performance of the three algorithms is compared and the best algorithm is identified from the experimental results. The methodology overview is given in Figure 1.
3.2. CREATION OF NETWORK
The network scenario is created and simulated using Cisco Packet Tracer 7.0. The network is wired and consists of a DHCP server, user systems, routers and switches. The DHCP server is deployed to provide network connectivity between the user systems, allowing them to establish communication with each other through web pages.
Fig 1. Methodology Overview
3.3. DATA COLLECTION IN SSH THROUGH NETFLOW
One of the major objectives of the proposed work is to collect data in the network using honeypots. Of the various honeypot types available, this work implements a Secure Shell (SSH) honeypot, which can be implemented and monitored remotely. The SSH honeypot is implemented on a router, and this router acts as the SSH server. The SSH server collects traffic logs, which are monitored remotely by an SSH client implemented on one of the systems in the created network. Logging and monitoring on the SSH client in Cisco Packet Tracer is achieved using NetFlow, a service that provides access to information about IP flows within the network. The traffic data logs on the SSH server are thus monitored and collected remotely by the SSH client. The collected data includes IPv4 source address, IPv4 destination address, IPv4 source mask, IPv4 destination mask, next hop address, source interface, source IP address, destination IP address, destination interface, protocol, source protocol, destination protocol, number of packets, flow sampler ID, flow direction, source mask, destination mask, counter packets and timestamp.
3.4. CREATION OF SUBSET DATA
A subset is created to extract only the needed fields from the whole set of collected data, which is stored and accessed using Microsoft Access. The extracted fields include protocol, source and destination protocol, number of packets, flow sampler ID, source mask, destination mask, counter bytes and counter packets. The subset is then clustered to detect the outliers.
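Once cluster centroids have been found, a clustering-based method flags as outliers the records that lie far from every centroid. The paper does not state its exact outlier rule, so the distance-threshold test below is an illustrative assumption:

```python
import math

# Hypothetical illustration: flag records whose distance to the nearest
# cluster centroid exceeds a cutoff. The threshold value is an assumption,
# not the paper's exact criterion.
def flag_outliers(points, centroids, threshold):
    """Return True for each point that lies far from every centroid."""
    flags = []
    for p in points:
        nearest = min(math.dist(p, c) for c in centroids)  # closest centroid
        flags.append(nearest > threshold)
    return flags

points = [(1.0, 1.0), (1.2, 0.9), (5.0, 5.0), (20.0, 20.0)]
centroids = [(1.0, 1.0), (5.0, 5.0)]
print(flag_outliers(points, centroids, threshold=3.0))
# -> [False, False, False, True]  (only the far-away record is flagged)
```

In the paper's setting, each point would be a feature vector built from the extracted NetFlow fields rather than a 2-D coordinate.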
3.5. DATA CLUSTERING USING GENETIC ALGORITHM
The Genetic Algorithm is an efficient evolutionary algorithm for detecting outliers using a clustering technique, and it can detect misuse and outlier data in the network. It employs an evolutionary metaphor to iteratively evolve a population of high-quality individuals, where each individual represents a solution to the problem. The Genetic Algorithm starts from an initial population of randomly generated individuals. The population is then evolved for a number of generations, gradually improving the quality of the individuals. During each generation, three basic genetic operators (selection, crossover and mutation) are sequentially applied to each individual with certain probabilities. First, a number of best-fit individuals are selected based on a user-defined fitness function, and the remaining individuals are discarded. Next, a number of individuals are selected and paired with each other; each pair produces one offspring by partially exchanging their genes around one or more randomly selected crossing points. Finally, a certain number of individuals are selected and the mutation operator is applied to them [6].
3.6. DATA CLUSTERING USING DIFFERENTIAL EVOLUTION
Differential Evolution is a heuristic algorithm, or evolution strategy, for global optimization that operates on decision variables in real-number form: the individuals in this algorithm are represented by real-number strings, and its search space must be continuous. By computing the difference between two individuals chosen randomly from the population, the Differential Evolution algorithm estimates a function gradient within a given area (rather than at a single point); this helps the solution avoid getting stuck at a local extreme of the optimized function. Another important property of this algorithm is that the selection operator is limited locally to only two individuals, the parent (xi) and the child (ui); owing to this property, the selection operator is more effective and faster. Also, to accelerate convergence of the algorithm, the index r1 is assumed to point to the best individual in the population [7].
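The scheme described above can be sketched compactly: a trial vector is built from the best individual (the r1 role) plus a scaled difference of two random individuals, and selection keeps the better of parent and trial. The parameter values (F, CR, population size) are illustrative assumptions:

```python
import random

random.seed(1)

def de_minimize(f, bounds, pop_size=15, gens=60, F=0.7, CR=0.9):
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        best = min(pop, key=f)                        # r1 points to the best individual
        for i, parent in enumerate(pop):
            r2, r3 = random.sample([p for p in pop if p is not parent], 2)
            # Mutation + crossover: mix the difference vector into the parent
            trial = [b + F * (x2 - x3) if random.random() < CR else px
                     for b, x2, x3, px in zip(best, r2, r3, parent)]
            if f(trial) <= f(parent):                 # local selection: parent vs child
                pop[i] = trial
    return min(pop, key=f)

# Minimize a simple quadratic to show convergence toward (2, -1)
sol = de_minimize(lambda v: (v[0] - 2) ** 2 + (v[1] + 1) ** 2,
                  bounds=[(-5, 5), (-5, 5)])
print([round(x, 2) for x in sol])
# -> approximately [2.0, -1.0]
```

For the clustering task, the objective `f` would instead be the clustering cost of a centroid set encoded as a flat real-valued vector.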
3.7. DATA CLUSTERING USING PARTICLE SWARM OPTIMIZATION
Particle Swarm Optimization is another evolutionary algorithm for detecting outliers using a clustering technique. Like the Genetic Algorithm, the system is initialized with a population of random solutions. Unlike the Genetic Algorithm, each potential solution is also assigned a random velocity, and the potential solutions, called particles, are then flown through the problem space. Each particle keeps track of the coordinates in the problem space associated with the best solution (fitness) it has achieved so far; this value is called pbest. Another best value, tracked by the global version of the particle swarm optimizer, is the overall best value, and its location, obtained so far by any particle in the population; this location is called gbest. At each time step, Particle Swarm Optimization changes the velocity of (accelerates) each particle towards its pbest and gbest locations. Acceleration is weighted by a random term, with separate random numbers generated for acceleration towards the pbest and gbest locations. There is also a local version of Particle Swarm Optimization in which, in addition to pbest, each particle keeps track of the best solution, called lbest, attained within a local topological neighborhood of particles [8].
Table 2. Pseudocode of Particle Swarm Optimization

begin
  initialize population
  while stopping condition not satisfied do
    for j = 1 to number of particles
      evaluate fitness of particle j
  set i = 1
  for g = 1 to number of groups
    for k = 1 to number of agents in group g
      for n = 1 to number of dimensions
        randomly initialize x[k,g,n]
        randomly initialize v[k,g,n]
      next n
    next k
  next g
  do while (sub-boundary case)
    flag set global best = FALSE
    for g = 1 to number of groups
      for k = 1 to number of agents in group g
        evaluate fitness[k,g](i), the fitness of agent k in group g at instant i
      next k
    next g
    for g = 1 to number of groups
      rank the fitness values of all agents included only in group g
    next g
    for g = 1 to number of groups
      for k = 1 to number of agents in group g
        if fitness[k,g](i) is the best value ever found by agent k in group g then
          pbest[k,g,n](i) = x[k,g,n](i)
        end if
        if fitness[k,g](i) is the best value ever found by all agents then
          gbest[g,n](i) = x[k,g,n](i)
        end if
      next k
    next g
    i = i + 1
    if (i >= sub-boundary iterations) then
      sub-boundary case = FALSE
    end if
  end do
  if (flag set global best = FALSE) then
    flag set global best = TRUE
    rank all gbest[g,n](i) and set the actual gbest[n](i)
  end if
end
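A minimal, runnable sketch of the PSO update described above, applied to the clustering task: each particle encodes k candidate centroids, and velocities are pulled toward the particle's pbest and the swarm's gbest. This is a single-swarm simplification of the grouped pseudocode, and the coefficients (w, c1, c2) and all parameter values are illustrative assumptions:

```python
import math
import random

random.seed(2)

def clustering_cost(pos, points, k):
    """Cost of a flat position vector encoding k 2-D centroids."""
    cents = [pos[2 * j:2 * j + 2] for j in range(k)]
    return sum(min(math.dist(p, c) for c in cents) for p in points)

def pso_cluster(points, k=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    dim = 2 * k
    pos = [[random.uniform(0, 6) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=lambda p: clustering_cost(p, points, k))
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + pull toward pbest + pull toward gbest
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if clustering_cost(pos[i], points, k) < clustering_cost(pbest[i], points, k):
                pbest[i] = pos[i][:]                  # personal best improved
        gbest = min(pbest, key=lambda p: clustering_cost(p, points, k))
    return gbest

points = [(1, 1), (1.1, 0.9), (0.8, 1.2), (5, 5), (5.1, 4.8), (4.9, 5.2)]
best = pso_cluster(points)
print(round(clustering_cost(best, points, 2), 3))
```

The swarm settles with one centroid near each of the two data clusters, which is the behavior the cost-function comparisons in Section 4 measure.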
The overall performance of the Genetic Algorithm, Differential Evolution and Particle Swarm Optimization algorithms is measured in terms of cost function, number of clusters and number of iterations.
4. RESULTS AND DISCUSSIONS
4.1. EXPERIMENTAL SETUP
The experimental setup of the proposed research work uses Cisco Packet Tracer 7.0, a powerful network simulation program that allows users to experiment with network behavior and provides simulation, visualization, authoring, assessment and collaboration capabilities. The experimental setup begins by placing user nodes, a server, routers and switches. A wired channel is chosen as the transmission medium between devices, and protocols such as TCP/IP, UDP and ICMP are used to perform the network simulation. The other simulation parameters used in the experimental setup are given in Table 3.
Table 3. Simulation Parameters

| Simulation Parameter | Value |
|---|---|
| Channel Type | Wired Channel |
| Medium Type | Fast Ethernet, Serial DCE |
| IP Address Type | IPv4 |
| Number of Nodes and Type | 100, Generic-PC-PT |
| Number of Servers and Type | 1, Dynamic Host Configuration Protocol (DHCP) |
| Number of Routers and Type | 3, Generic-PT-Empty, 829 |
| Number of Switches | 4, 2960-24TT |
| Protocols | TCP/IP, UDP, ICMP |
| IP Addresses | 192.168.1.1, 10 to 34; 192.168.2.1, 10 to 34; 192.168.3.1, 2; 192.168.4.1, 10 to 59; 192.168.5.1, 2 |
| Server Address | 192.168.5.2 |
| Gateway Address | 192.168.5.1 |
| Honeypot Node (SSH Server) | 192.168.1.1 |
| SSH Client | 192.168.1.34 |
| SSH Password | 123 |
| NetFlow Node | 192.168.1.33, 34 |
MATLAB is used to perform clustering and to detect outliers in the collected honeypot data. MATLAB (matrix laboratory) is a fourth-generation high-level programming language and interactive environment for numerical computation, visualization and programming.
4.2. PERFORMANCE EVALUATION
The performance of the algorithms is measured using the metrics cost function, number of clusters and number of iterations.
4.2.1. COST FUNCTION
The cost function of a clustering algorithm assigns a low cost to the best clustering. It depends on whether all clusters have the same size, whether the content inside each cluster is homogeneous, and whether consecutive clusters have distinct content.
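The paper does not give its cost formula explicitly; a common choice for clustering-based methods, assumed here for illustration, is the total distance of every point to its nearest cluster centroid, so that tighter, better-separated clusters score lower:

```python
import math

# Assumed cost: sum over all points of the distance to the nearest centroid.
# Lower is better; this is an illustrative formula, not the paper's exact one.
def clustering_cost(points, centroids):
    return sum(min(math.dist(p, c) for c in centroids) for p in points)

points = [(0, 0), (0, 1), (10, 10), (10, 11)]
good = [(0, 0.5), (10, 10.5)]   # centroids placed at the cluster centres
bad = [(5, 5), (6, 6)]          # centroids placed between the clusters
print(clustering_cost(points, good) < clustering_cost(points, bad))  # -> True
```

Under a cost of this kind, the values in Tables 4 and 7 can be compared directly: the algorithm with the lowest cost placed its centroids closest to the data.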
Table 4. Comparison of Cost Function

| Algorithm | Cost Function |
|---|---|
| Genetic Algorithm | 3972.172 |
| Differential Evolution | 4027.1142 |
| Particle Swarm Optimization | 3957.6313 |
4.2.2. NUMBER OF CLUSTERS
The number of clusters is the total number of distinct clusters, each holding a different group of data, obtained from the given data.

Table 5. Comparison of Number of Clusters

| Algorithm | Number of Clusters |
|---|---|
| Genetic Algorithm | 3 |
| Differential Evolution | 3 |
| Particle Swarm Optimization | 3 |
4.2.3. NUMBER OF ITERATIONS
An iteration is one application of a process or function composed with itself; the iterative process applies the same function repeatedly, a certain number of times.

Table 6. Comparison of Number of Iterations

| Algorithm | Number of Iterations |
|---|---|
| Genetic Algorithm | 200 |
| Differential Evolution | 200 |
| Particle Swarm Optimization | 200 |
The overall comparison of the Genetic Algorithm, Differential Evolution and Particle Swarm Optimization is given in Table 7.

Table 7. Overall Comparison of GA, DE and PSO

| Metric | GA | DE | PSO |
|---|---|---|---|
| Cost Functions | 3972.172; 3985.2668; 3817.2913; 3993.4905; 3979.6956 | 4027.1142; 3970.3906; 3945.7321; 3983.8924; 3960.6388 | 3957.6313; 3957.6313; 3954.1534; 3959.6528; 3951.7585 |
| No. of Clusters | 3 | 3 | 3 |
| No. of Iterations | 200 | 200 | 200 |
From the comparisons it is clear that Particle Swarm Optimization has the lowest cost function values, showing better costs than the Genetic Algorithm and Differential Evolution algorithms. Hence, Particle Swarm Optimization outperforms the other two algorithms on this metric.
4.3. RESULTS
Sample results for 100 nodes with the Secure Shell (SSH) honeypot and the performance graphs of PSO are given in Figures 2 to 7.
Fig 2. Scenario Setup for 100 Nodes in the Network
Fig 3. Traffic Collected in SSH Client from SSH Server
Fig 4. Cost Function of PSO
Fig 5. Number of Clusters for PSO
Fig 6. Number of Iterations for PSO
Fig 7. Detected Outlier Data from the Collected Honeypot Data
5. CONCLUSION AND SCOPE FOR FUTURE ENHANCEMENT
An efficient network is measured not only by its performance but also by its security, which plays a key role in a good, efficient and secure network. Many security techniques have been implemented to date. This research work has proposed SSH honeypot technology, and the observations show that the SSH honeypot is efficient in collecting network traffic. From the collected traffic, the research work implemented the GA, DE and PSO clustering algorithms to detect outliers. Once the outlier data is detected and analyzed, finding the source of the abnormal traffic becomes an easy task. The proposed clustering algorithms detect outliers from the collected SSH honeypot data. From the experiments conducted and the results obtained, PSO performed better than GA and DE: it has a good detection mechanism and a better cost function than the other two algorithms.

In future, high-interaction honeypots can be used to obtain a higher level of interaction with attackers. Honeypots can be made more effective by combining them with other security mechanisms such as firewalls, Intrusion Detection Systems (IDS) and Intrusion Prevention Systems (IPS). Other outlier detection methods, such as statistical-, distance-, deviation-, distribution-, depth- and density-based methods, can also be used.
REFERENCES
[1] Feng Zhang, Shijie Zhou, Zhiguang Qin, Jinde Liu, Honeypot: A Supplemented Active Defense System for Network Security, IEEE, 2003, 231-235.
[2] Ioannis Koniaris, Georgios Papadimitriou, Petros Nicopolitidis, Mohammad Obaidat, Honeypots Deployment for the Analysis and Visualization of Malware Activity and Malicious Connections, IEEE, 2014, 1825-1830.
[3] Aaditya Jain, Bala Buksh, Advance Trends in Network Security with Honeypot and its Comparative Study with other Techniques, International Journal of Engineering Trends and Technology, 29, 2015, 304-312.
[4] Shaik Bhanu, Girish Khilari, Varun Kumar, Analysis of SSH Attacks of Darknet using Honeypots, International Journal of Engineering Development and Research, 3, 2014, 348-350.
[5] Abdallah Ghourabi, Adel Bouhoula, Data Analyzer Based on Data Mining for Honeypot Router, IEEE, 2015, 1-7.
[6] Ren Hui Gong, Mohammad Zulkernine, Purang Abolmaesumi, A Software Implementation of a Genetic Algorithm Based Approach to Network Intrusion Detection, IEEE, 2005, 1-5.
[7] Adam Slowik, Application of an Adaptive Differential Evolution Algorithm with Multiple Trial Vectors to Artificial Neural Network Training, IEEE, 58, 2011, 3160-3167.
[8] Russell C. Eberhart, Yuhui Shi, Particle Swarm Optimization: Developments, Applications and Resources, IEEE, 2001, 81-86.
[9] Enrique Alba, Marco Tomassini, Parallelism and Evolutionary Algorithms, IEEE, 6, 2002, 443-462.
[10] P. García-Teodoro, J. Díaz-Verdejo, G. Maciá-Fernández, E. Vázquez, Anomaly-Based Network Intrusion Detection: Techniques, Systems and Challenges, Elsevier, 2009, 18-28.
[11] Roshan Chitrakar, Huang Chuanhe, Anomaly Based Intrusion Detection using Hybrid Learning Approach of Combining k-Medoids Clustering and Naïve Bayes Classification, IEEE, 2015, 1-5.
[12] Robin Berthier, Michel Cukier, Profiling Attacker Behaviour Following SSH Compromises, IEEE, 2007, 1-7.
[13] A. M. Riad, Ibrahim Elhenawy, Ahmed Hassan, Nancy Awadallah, Visualize Network Anomaly Detection by Using K-Means Clustering Algorithm, International Journal of Computer Networks & Communications (IJCNC), 5, 2013, 195-208.
[14] Naila Belhadj Aissa, Mohamed Guerroumi, Semi-Supervised Statistical Approach for Network Anomaly Detection, Elsevier, 2016, 1090-1095.
[15] Amandeep Singh, Navdeep Singh, Review of Implementing a Working Honeypot System, International Journal of Advanced Research in Computer Science and Software Engineering, 3(6), 2013, 1007-1011.
Biographies and Photographs
M. Sithara received her M.Sc. in Computer Science in 2013 from Avinashilingam Institute for Home Science and Higher Education for Women University, Coimbatore. She is pursuing her M.Phil. at Sri Ramakrishna Mission Vidyalaya College of Arts and Science, Coimbatore. Her areas of interest are Network Security, Cryptography and Mobile Technology.
M. Chandran is an Assistant Professor in the Department of Computer Applications at Sri Ramakrishna Mission Vidyalaya College of Arts and Science, Coimbatore. He has 11 years of teaching experience. His areas of interest include Java and Software Engineering.
G. Padmavathi is a Professor in the Department of Computer Science at Avinashilingam Institute for Home Science and Higher Education for Women University, Coimbatore. She has 29 years of teaching experience and one year of industrial experience. Her areas of interest include Real Time Communication, Wireless Communication, Network Security and Cryptography. She has a significant number of publications in peer-reviewed international and national journals. She is a life member of CSI, ISTE, WSEAS, AACE and ACRS.