Information Assurance and Security (IAS) is a crucial component of the corporate environment, ensuring that the secrecy of sensitive data is protected, the integrity of important data is not violated, and the availability of critical systems is guaranteed. The advancement of information and communication technology into new domains such as mobility and the Internet of Things, its ever-growing user base, and increasingly sophisticated cyber-attacks force organizations to deploy automated and robust defense mechanisms to manage the resulting digital security incidents in real time. Digital forensics is a scientific process that facilitates the detection of illegal activities and inappropriate behavior using scientific tools, techniques, and investigation frameworks. This research aims to identify processes that facilitate and improve the digital forensic investigation process. Existing digital forensic frameworks will be reviewed and the analysis compiled to derive a network forensic investigation framework that includes evidence collection, preservation, and analysis at the sensor level and in real time. The aim is to discover, with optimal performance, the complete relationships among known and unseen/new alerts generated by multiple network sensors in order to improve alert quality and recognize attack strategies.
Use of network forensic mechanisms to formulate network security (IJMIT JOURNAL)
Network forensics is a fairly new area of research, applied after an intrusion in organizations ranging from small and mid-size private companies and government corporations to the defence secretariat of a country. During an investigation, valuable information may be mishandled, leading to difficulties in the examination and wasted time. Additionally, the intruder could obliterate traces such as the point of entry, the vulnerabilities exploited, the damage caused, and, most importantly, the intruder's identity. The aim of this research was to map the correlation between network security and network forensic mechanisms. Three sub-research questions were studied: identifying network security issues, the network forensic investigations used in an incident, and the use of network forensic mechanisms to eliminate network security issues. A literature review was the research strategy used to study these sub-research questions. Literature such as journal papers, PhD theses, ISO standards, and other official research papers was evaluated and forms the basis of this research. The output of this research is a report on how network forensics assists in aligning network security in the event of an intrusion. This research is not specific to a single organization but gives a general overview of the industry. The Embedding Digital Forensics Framework, the Network Forensic Development Life Cycle, and the Enhanced Network Forensic Cycle could be used to develop a secure network. Through these frameworks and cycles, the author recommends implementing the 4R Strategy (Resistance, Recognition, Recovery, Redress) with the assistance of a number of tools. This research would be of interest to network administrators, network managers, network security personnel, and anyone interested in securing communication devices and infrastructure. It provides a framework that an organization can use to eliminate digital anomalies through network forensics, helps the above-mentioned personnel prepare their infrastructure for threats, and enables further research in the fields of computer, database, mobile, video, and audio forensics.
A Novel and Advanced Data Mining Model Based Hybrid Intrusion Detection Frame... (Radita Apriana)
The document proposes a hybrid intrusion detection framework that uses two classifiers: Tree Augmented Naive Bayes (TAN) as the base classifier and Reduced Error Pruning (REP) as the meta classifier. The TAN classifier performs initial classification on the KDD Cup 99 dataset and the results are then used as input for the REP meta classifier, which reclassifies the instances to improve overall classification performance. The framework is evaluated using a testing dataset, with the results analyzed to assess the performance of the hybrid approach.
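As a rough illustration of the base/meta two-stage pattern described above (not the authors' exact implementation), the sketch below stacks two scikit-learn classifiers, substituting GaussianNB for TAN and a cost-complexity-pruned DecisionTreeClassifier for REP, since neither TAN nor REP ships with scikit-learn; the data is a random placeholder standing in for KDD Cup 99.

```python
# Hypothetical sketch of a base/meta hybrid IDS classifier.
# GaussianNB stands in for TAN, a pruned decision tree for REP;
# neither original algorithm is available in scikit-learn.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Placeholder for KDD Cup 99 features/labels (load your own copy).
X, y = np.random.rand(1000, 41), np.random.randint(0, 2, 1000)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)

# Stage 1: the base classifier produces class-probability estimates.
base = GaussianNB().fit(X_train, y_train)
train_meta = np.hstack([X_train, base.predict_proba(X_train)])
test_meta = np.hstack([X_test, base.predict_proba(X_test)])

# Stage 2: the meta classifier reclassifies instances using the
# original features plus the base classifier's output.
meta = DecisionTreeClassifier(ccp_alpha=0.001).fit(train_meta, y_train)
print("hybrid accuracy:", meta.score(test_meta, y_test))
```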
The mobile device is one of the fastest-growing technologies and is widely used across diverse sectors. Mobile devices are used in everyday life for personal information exchange such as chatting, email, shopping, and mobile banking, which contributes to information security threats. Users' behavior can influence information security threats, and more research is needed to understand users' threat avoidance behavior and motivation. Using Technology Threat Avoidance Theory (TTAT), this study assessed factors that influenced mobile device users' threat avoidance motivations and behaviors as they relate to phishing attacks.
Cyber warfare is the single greatest emerging threat to national security, and network security has become an essential component of any computer network. As computer networks and systems become ever more fundamental to modern society, concerns about security have become increasingly important. A multitude of applications, open source and proprietary, are available for protection; for a system administrator, deciding on the most suitable option requires knowledge of the available safety measures, their features, how they affect quality of service, and the kind of data they will allow through unflagged. A majority of the methods currently used to ensure the quality of a network's service are signature based. From this information, and from details on the specifics of popular applications and their implementation methods, we have carried these ideas through, incorporating our own opinions, to formulate suggestions on how this could be done at a general level.

The main objective was to design and develop an intrusion detection system. The minor objectives were to design a port scanner to determine potential threats and mitigation techniques to withstand these attacks, to implement the system on a host, and to run and test the designed IDS. In this project we set out to develop a honeypot IDS, which makes it easy to listen on a range of ports and emulate a network protocol to track and identify anyone trying to connect to your system. The IDS uses the following design approaches: event correlation, log analysis, alerting, and policy enforcement.

Intrusion detection systems (IDSs) attempt to identify unauthorized use, misuse, and abuse of computer systems. In response to the growth in the use and development of IDSs, we have developed a methodology for testing IDSs. The methodology consists of techniques from the field of software testing that we have adapted for the specific purpose of testing IDSs. In this paper, we identify a set of general IDS performance objectives that form the basis of the methodology. We present the details of the methodology, including strategies for test-case selection and specific testing procedures, and include quantitative results from testing experiments on the Network Security Monitor (NSM), an IDS developed at UC Davis. We present an overview of the software platform we used to create user-simulation scripts for the testing experiments. The platform consists of the UNIX tool expect and enhancements we developed, including mechanisms for concurrent scripts and a record-and-replay feature. We also provide background information on intrusions and IDSs to motivate our work.
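A minimal sketch of the honeypot idea described above, assuming a simple TCP listener that logs connection attempts on a hypothetical set of decoy ports; this is an illustration of the pattern, not the project's actual implementation.

```python
# Hypothetical honeypot sketch: listen on several TCP ports, log
# every connection attempt, and answer with a fake protocol banner.
import socket
import threading
import datetime

PORTS = [2121, 2323, 8080]  # assumed high decoy ports (ports below
                            # 1024 would need elevated privileges)

def listen(port: int) -> None:
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen(5)
    while True:
        conn, addr = srv.accept()
        # Record who connected and when, for later correlation.
        print(f"{datetime.datetime.now()} connection on {port} from {addr[0]}")
        conn.sendall(b"220 service ready\r\n")  # fake banner
        conn.close()

for p in PORTS:
    threading.Thread(target=listen, args=(p,), daemon=True).start()
threading.Event().wait()  # keep the main thread alive
```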
Intrusion Detection System (IDS): Anomaly Detection using Outlier Detection A... (Drjabez)
This document describes a proposed approach for anomaly detection in intrusion detection systems using outlier detection. It begins with background on intrusion detection systems and issues with existing approaches. It then presents the proposed two-stage approach using outlier detection: 1) Training with large normal datasets in a distributed storage environment, and 2) Testing intrusion datasets to compute an error value compared to the trained model. If the error value exceeds a threshold, the test data is flagged as anomalous. Experimental results on network packet datasets demonstrate the approach can effectively identify anomalies.
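A toy sketch of the train-on-normal, threshold-on-error idea summarized above (a generic illustration, not the paper's method): fit a Gaussian profile to normal traffic features and flag test points whose Mahalanobis distance exceeds an assumed threshold.

```python
# Hypothetical sketch: profile normal traffic, flag outliers by error value.
import numpy as np

# Assumed feature matrices; replace with real network packet features.
normal = np.random.randn(5000, 8)              # training data (normal only)
test = np.vstack([np.random.randn(20, 8),      # normal-looking test points
                  np.random.randn(5, 8) + 6])  # injected anomalies

# "Training": estimate mean and covariance of normal behavior.
mu = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

# "Testing": the Mahalanobis distance serves as the error value.
diff = test - mu
err = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

THRESHOLD = 5.0  # assumed; tune on validation data
for i, e in enumerate(err):
    if e > THRESHOLD:
        print(f"test record {i}: anomalous (error={e:.2f})")
```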
VPN usage across the world has increased due to the COVID-19 pandemic. With companies trying to chart a course through this unfamiliar situation, corporations had to implement business continuity plans that included several elements to maintain a scalable and robust VPN connection. During this time of uncertainty, best practices need to be deployed by corporations and government entities more than ever. The purpose of this study is to highlight the path SD Telecom would take to ensure a secure, reliable network during a global traffic surge. Specific VPN solutions, access needs, and eligibility requirements vary based on the end user.
Intrusion detection and anomaly detection system using sequential pattern mining (eSAT Journals)
Abstract
Security methods ranging from password-protected access to firewalls are used to secure data and networks against attackers, but these methods are often not enough to protect data. Intrusion detection systems (IDS) are one way to secure the data on critical systems. Most research work focuses on the effectiveness and accuracy of intrusion detection, but these attempts address intrusions at the operating system and network level only; they cannot detect unexpected system behavior caused by malicious transactions in databases. The method for spotting interference with information held in a database is known as database intrusion detection. It relies on recording the execution of transactions: if a recognized pattern deviates from the established regular patterns, it is considered an intrusion. The identified problem with this process is that the mining algorithm used may not identify all patterns, which can affect detection in two ways: 1) regular patterns may be missing from the database, and 2) the detection process may neglect some new patterns. We therefore propose a sequential data mining method using a new Modified Apriori Algorithm, which improves the accuracy and rate of pattern detection. The Apriori algorithm with modifications is used in the proposed model.
Keywords — Anomaly Detection, Modified Apriori Algorithm, Misuse detection, Sequential Pattern Mining
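For orientation, here is a toy sketch of the classic Apriori pruning idea over database transaction logs; it is not the paper's Modified Apriori, and the transactions and support threshold are invented for illustration.

```python
# Toy Apriori sketch: mine frequent itemsets from transaction logs by
# joining frequent (k-1)-itemsets and pruning candidates by support.
transactions = [  # assumed example sets of database operations
    {"read_A", "read_B", "write_C"},
    {"read_A", "write_C"},
    {"read_A", "read_B"},
    {"read_B", "write_C"},
]
MIN_SUPPORT = 2  # assumed threshold

def support(itemset: frozenset) -> int:
    return sum(1 for t in transactions if itemset <= t)

# Level 1: frequent single items.
items = {i for t in transactions for i in t}
frequent = [{frozenset([i]) for i in items
             if support(frozenset([i])) >= MIN_SUPPORT}]

# Level k: join frequent (k-1)-itemsets, keep those meeting support.
k = 2
while frequent[-1]:
    candidates = {a | b for a in frequent[-1] for b in frequent[-1]
                  if len(a | b) == k}
    frequent.append({c for c in candidates if support(c) >= MIN_SUPPORT})
    k += 1

for level in frequent:
    for itemset in level:
        print(sorted(itemset), "support =", support(itemset))
```

A transaction whose operations fall outside all mined frequent patterns would then be a candidate intrusion under the scheme the abstract describes.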
Detection and Prevention of security vulnerabilities associated with mobile b... (Clinton DSouza)
The document summarizes a team's research on detecting and preventing security vulnerabilities in mobile banking applications. It outlines their objective to analyze current exploitation techniques and intrusion detection methods, and propose an efficient authentication methodology. It provides background on the growth of electronic and mobile banking. It then describes common mobile application attacks like information disclosure, logical attacks, phishing and sniffing. It discusses related work on intrusion detection techniques and their approach using profile-based detection and two-factor authentication. The results section addresses the pros and cons of their prevention and detection methods. In conclusion, the team addressed key attacks and current intrusion detection systems, and proposed an authentication mechanism to help secure mobile banking.
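As a hedged illustration of the two-factor authentication component mentioned above (not the team's proposed mechanism), the sketch below generates a time-based one-time password following the standard RFC 6238 TOTP construction; the shared secret is a made-up example.

```python
# Standard RFC 6238 TOTP: HMAC-SHA1 over a time-step counter,
# dynamically truncated to a short numeric code.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period          # current time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0]
            & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Hypothetical shared secret provisioned to the user's device.
print(totp("JBSWY3DPEHPK3PXP"))
```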
Analytical survey of active intrusion detection techniques in mobile ad hoc n... (eSAT Publishing House)
The document discusses a proposed intrusion detection framework for mobile database systems. It introduces a unique profiling method using carefully selected database objects and data concerning the location of database requests. Experiments implementing the system achieved promising detection rates with low false alarm rates. The document reviews existing literature on intrusion detection systems, location-aware IDS, and IDS at the database level. It identifies gaps in current approaches, including high false positive/negative rates. The proposed framework aims to provide a more robust detection method for insider threats in mobile environments.
The document summarizes 7 papers related to digital forensics research. It outlines the objectives, methodologies, challenges, and directions for future research discussed in the papers. The papers examine topics such as digital forensic models, challenges faced by researchers and practitioners, mobile device forensics, and the need for standardization and new tools to address growing data volumes, encryption, and other issues created by emerging technologies.
An efficient control of virus propagation (UltraUploader)
This document discusses the development of an Efficient Control of Virus Propagation (ECOVP) system using case-based reasoning and object-oriented methodology. It conducted a questionnaire survey that found many computer users in Malaysia lack awareness of computer viruses and there is a need for an effective system to guide users in handling virus incidents. The ECOVP system was developed to educate users and help control virus propagation by providing customized solutions based on the symptoms users describe. It was tested for accuracy and usability and found to successfully help users clean infected machines and prevent future infections.
Automatic Insider Threat Detection in E-mail System using N-gram Technique (IRJET Journal)
This document summarizes research on automatic insider threat detection in email systems using n-gram techniques. It discusses how n-grams can be used to classify documents and detect potential data leakage through email. The system would verify emails sent outside the network using SHA, n-grams and thresholds. If a threat is detected, the user would be blocked. The document also provides a literature review on 10 other papers related to insider threat detection using methods like user profiling, activity logs, data mining and visualization techniques. It describes how n-grams work by breaking words into character sequences and creating profiles based on frequency to classify documents.
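A small sketch of the character n-gram profiling idea (a generic illustration, not the paper's system): break text into character trigrams, build frequency profiles, and compare an outgoing document to a sensitive-content profile with cosine similarity against an assumed threshold.

```python
# Hypothetical n-gram profiling sketch for leakage detection.
import math
from collections import Counter

def ngrams(text: str, n: int = 3) -> Counter:
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Assumed profile built from sensitive internal documents.
profile = ngrams("quarterly revenue forecast confidential draft")
outgoing = ngrams("please find the confidential revenue forecast attached")

THRESHOLD = 0.3  # assumed; tune on labelled examples
if cosine(profile, outgoing) > THRESHOLD:
    print("potential data leakage: block email and alert admin")
```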
1) The document discusses security issues in computer networks and proposes contemporary solutions. It covers topics like cryptography, secure data access, intrusion detection, and secure routing.
2) The literature review discusses previous research on wireless sensor network security including common attacks, requirements, and defenses. It also examines security issues that arise from the unique characteristics of wireless networks.
3) The document proposes that more research is still needed on topics like quantifying security costs and benefits, data integrity, survivability, and security for data-centric wireless sensor networks. A holistic security model is needed that integrates solutions at each network layer.
The spread of information networks in communities and organizations has led to a huge daily volume of information exchange between different networks, which has of course created new threats to national organizations. Information security has become one of today's most challenging areas; in other words, defects and weaknesses in computer network security cause irreparable damage to enterprises. Identifying security threats and ways of dealing with them is therefore essential. The question raised in this regard is: what strategies and policies must be adopted to deal with security threats and ensure the security of computer networks? In this context, the present study reviews the literature, drawing on earlier research and a library-based approach, to provide security solutions in the face of threats to computer networks. The results of this research can lead to a better understanding of security threats and ways of dealing with them, and can help in implementing a secure information platform.
This document summarizes and characterizes trends in data mining techniques for intrusion detection systems. It discusses how data mining can help address limitations in traditional intrusion detection by identifying anomalies, false alarms, and patterns across data. The document then characterizes several data mining techniques that have been applied for intrusion detection, including feature selection, statistical techniques, machine learning approaches like classification and clustering, and specific algorithms like decision trees, genetic algorithms, fuzzy logic, neural networks, and support vector machines.
The Practical Data Mining Model for Efficient IDS through Relational Databases (IJRES Journal)
An enterprise network information system is not only a platform for information sharing and exchange, but also a platform on which enterprise production automation and enterprise management systems work together. As a result, the security defense of an enterprise network information system includes not only network security and data security, but also the security of the network business running on it: the confidentiality, integrity, continuity, and real-time performance of that business. Network security technology has become crucial in protecting government and industry computing infrastructure. Modern intrusion detection applications face complex requirements: they need to be reliable, extensible, easy to manage, and have low maintenance cost. In recent years, data mining-based intrusion detection systems (IDSs) have demonstrated high accuracy, good generalization to novel types of intrusion, and robust behavior in a changing environment. Still, significant challenges exist in the design and implementation of production-quality IDSs. Implementing components such as data transformations, model deployment, and cooperative distributed detection remains a labor-intensive and complex engineering endeavor.

This paper describes DAID, a database-centric architecture that leverages data mining within the relational DBMS to address these challenges. DAID also offers numerous advantages in terms of scheduling capabilities, alert infrastructure, data analysis tools, security, scalability, and reliability. DAID is illustrated with an Intrusion Detection Center application prototype that leverages existing functionality in Relational Database 10g.

Intrusion detection systems work at many levels of the network fabric and are taking the concept of security into a whole new sphere by incorporating intelligence as a tool to protect networks against unauthorized intrusions and newer forms of attack. We describe a formal model for constructing a network security situation measurement based on Dempster-Shafer (D-S) evidence theory: frequent-pattern and sequence models are extracted from network security situation data using knowledge discovery methods, the patterns are converted into rules about the network security situation, and the network security situation is generated automatically.
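Since the abstract invokes D-S evidence theory, here is a minimal sketch of Dempster's rule of combination, fusing two hypothetical sensors' belief masses over the frame {attack, normal}; this is the textbook rule, not the paper's model, and the mass values are invented.

```python
# Dempster's rule of combination over a tiny frame of discernment.
# Keys are subsets of the frame; EITHER carries the uncertainty mass.
ATTACK, NORMAL = frozenset({"attack"}), frozenset({"normal"})
EITHER = ATTACK | NORMAL

m1 = {ATTACK: 0.6, NORMAL: 0.1, EITHER: 0.3}  # sensor 1 (assumed)
m2 = {ATTACK: 0.5, NORMAL: 0.2, EITHER: 0.3}  # sensor 2 (assumed)

combined, conflict = {}, 0.0
for a, wa in m1.items():
    for b, wb in m2.items():
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty set

# Normalize by 1 - K, where K is the total conflicting mass.
combined = {k: v / (1.0 - conflict) for k, v in combined.items()}
print({",".join(sorted(k)): round(v, 3) for k, v in combined.items()})
```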
This document provides guidance for lawyers on data security issues and how to help clients meet data security standards. It discusses how lack of security knowledge is common among both personal and enterprise computer users. Various threats like viruses, worms, Trojans, bots, and spyware/adware are described. Examples of data security risks include loss of portable devices containing personal information, insecure home networks that employees access for work, and insecure disposal of physical documents and digital media. The document advises evaluating security controls and investing in tools to detect breaches and audit compliance.
Fusion of data from multiple sources generates new information from existing data. Users can now access information from inside or outside the organization very easily, which increases user productivity and the knowledge shared within the organization. But this leads to a new class of network security threat: the insider threat. Users can now share an organization's critical information outside the organization if they have access to it, and current network security tools cannot prevent this new threat. In this paper, we address this issue by building a real-time anomaly detection system based on users' current and previous behavior.
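A toy sketch of comparing current behavior against a per-user baseline (a generic illustration, not the paper's system): keep a running history of an activity metric per user and flag values that deviate by more than an assumed z-score limit.

```python
# Hypothetical per-user behavioral baseline with z-score alerting.
import statistics

# Assumed history of a per-user metric, e.g. daily file-export counts.
history = {"alice": [12, 15, 11, 14, 13, 12]}

def check(user: str, today: int, z_limit: float = 3.0) -> None:
    past = history[user]
    mu, sigma = statistics.mean(past), statistics.stdev(past)
    z = (today - mu) / sigma if sigma else 0.0
    if z > z_limit:
        print(f"{user}: anomalous activity (today={today}, z={z:.1f})")
    history[user].append(today)  # fold today into the baseline

check("alice", 14)  # consistent with previous behavior
check("alice", 80)  # flagged as anomalous
```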
Optimised malware detection in digital forensics (IJNSA Journal)
On the Internet, malware is one of the most serious threats to system security. Most complex issues and problems on any system are caused by malware and spam. Networks and systems can be accessed and compromised by malware known as botnets, which compromise other systems through coordinated attacks. Such malware uses anti-forensic techniques to avoid detection and investigation. To protect systems from the malicious activity of this malware, a new framework is required that aims to develop an optimised technique for malware detection. Hence, this paper demonstrates new approaches to performing malware analysis in forensic investigations and discusses how such a framework may be developed.
IRJET- 3 Juncture based Issuer Driven Pull Out System using Distributed Servers (IRJET Journal)
This document discusses network security visualization and proposes a classification system for network security visualization systems. It begins by introducing the importance of visualizing network security data due to the large quantities of data produced. It then reviews existing network security visualization systems and outlines key aspects they monitor like host/server monitoring, port activity, and intrusion detection. The document proposes a taxonomy to classify network security visualization systems based on their data sources and techniques. It concludes by stating papers were selected for review based on their relevance to network security, novelty of techniques, and inclusion of evaluations.
Biometric Security Poses Huge Privacy Risks in Kenya (Pauline Wamere)
1. This paper investigates the personal and informational privacy concerns regarding biometric technology in Kenya. It examines legal reviews and identifies potential risks with biometric data storage and use.
2. Biometrics are increasingly used for authentication in Kenya without fully considering privacy risks. Key risks include spoofing biometric data, intercepting biometric data transmitted over networks, system errors from power outages, and residual biometric traces allowing unauthorized access.
3. The paper analyzes Kenya's legal framework for privacy and guidelines for implementing biometric systems. While biometric technology promises better security and convenience, its widespread use also risks privacy violations that must be mitigated through legal reforms and technical controls.
PRIVACY-PRESERVING MACHINE AUTHENTICATED KEY AGREEMENT FOR INTERNET OF THINGS (IJCNC Journal)
This document proposes a new privacy-preserving machine authenticated key agreement (AKA) protocol for the Internet of Things (IoT) called IoTMAKA. The protocol aims to eliminate human factors in IoT authentication by using machine biometrics like machine fingerprints. It also prioritizes privacy over security to protect the anonymity and untraceability of communicating entities. IoTMAKA is designed to reduce computational and communication overheads to improve efficiency compared to previous related works. The protocol involves three entities - an IoT device, a central server, and a service server. The central server authenticates the IoT device and service server and facilitates secure communication and key agreement between them.
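For flavor, here is a minimal key agreement sketch using X25519 ECDH plus HKDF from the Python cryptography package; it illustrates generic building blocks of authenticated key agreement between a device and a server, not the IoTMAKA protocol itself, and the context label is invented.

```python
# Generic ECDH key agreement sketch (not IoTMAKA): each party derives
# the same session key from its private key and the peer's public key.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

device_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()

# Only public keys cross the network; shared secrets match.
shared_dev = device_priv.exchange(server_priv.public_key())
shared_srv = server_priv.exchange(device_priv.public_key())
assert shared_dev == shared_srv

session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None,
    info=b"iot-session",  # hypothetical protocol context label
).derive(shared_dev)
print("derived 256-bit session key:", session_key.hex())
```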
IoT Network Attack Detection using Supervised Machine Learning (CSCJournals)
The use of supervised learning algorithms to detect malicious traffic can be valuable in designing intrusion detection systems and ascertaining security risks. The Internet of things (IoT) refers to the billions of physical, electronic devices around the world that are often connected over the Internet. The growth of IoT systems comes at the risk of network attacks such as denial of service (DoS) and spoofing. In this research, we perform various supervised feature selection methods and employ three classifiers on IoT network data. The classifiers predict with high accuracy if the network traffic against the IoT device was malicious or benign. We compare the feature selection methods to arrive at the best that can be used for network intrusion prediction.
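A compact sketch of the workflow described, under assumed synthetic data: univariate feature selection followed by a comparison of three scikit-learn classifiers; the dataset, feature count, and classifier choices are placeholders, not those of the study.

```python
# Hypothetical feature-selection + classifier comparison for IoT traffic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Stand-in for labelled IoT network traffic (malicious vs benign).
X, y = make_classification(n_samples=2000, n_features=30, n_informative=8)

# Keep the 8 features most correlated with the label.
X_sel = SelectKBest(f_classif, k=8).fit_transform(X, y)

for name, clf in [("logreg", LogisticRegression(max_iter=1000)),
                  ("forest", RandomForestClassifier()),
                  ("knn", KNeighborsClassifier())]:
    acc = cross_val_score(clf, X_sel, y, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy {acc:.3f}")
```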
This document discusses cryptography and security implementations for Internet of Things (IoT) devices. It begins with an introduction to IoT and the need for security protocols as IoT devices collect and transmit large amounts of sensitive data. Challenges to IoT security include the diversity of devices which makes vulnerabilities complex, and limited computational resources. The document then explores using symmetric and public key cryptography algorithms as well as proposed lightweight cryptography solutions for IoT security. It concludes that while traditional security solutions are inadequate, lightweight cryptography protocols have the potential to help secure IoT communications and address current challenges if standardized for diverse IoT hardware.
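As a hedged example of symmetric cryptography on a constrained device, the sketch below uses AES-GCM authenticated encryption from the Python cryptography package as a widely available stand-in; a genuinely lightweight cipher such as ASCON would need a dedicated library, and the sensor payload and associated data are invented.

```python
# AES-GCM authenticated encryption sketch for an IoT sensor reading.
# A stand-in for lightweight ciphers on truly constrained hardware.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)  # pre-shared in this sketch
nonce = os.urandom(12)                     # must never repeat per key

reading = b'{"temp": 21.4, "unit": "C"}'
ct = AESGCM(key).encrypt(nonce, reading, b"sensor-42")  # associated data

# The receiver verifies integrity and decrypts in one call.
pt = AESGCM(key).decrypt(nonce, ct, b"sensor-42")
assert pt == reading
print("round-trip ok,", len(ct), "bytes on the wire")
```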
Abstract: With the heightened reliance on information technology in recent times, it has become more relevant to find measures to secure every online device, datum, and piece of information. A Network Intrusion Detection System (NIDS) is one of the security options to consider to help protect such devices, data, and information. However, an IDS needs to be kept up to date to mitigate current threats. A critical issue in developing the right IDS is the scarcity of current data sets for training, and the impact this has on system performance. This paper presents the On-demand Network Data Set Creation Application (ONDaSCA), graphical user interface software capable of generating labelled network intrusion data sets. ONDaSCA gives IDS users and researchers the option to choose a raw data set and process it into an output data set, supports real-time packet capture as well as offline upload of existing PCAP files, and offers two different packet capturing methods (Tshark and Dumpcap). ONDaSCA is highly customisable, and an IDS user or researcher can leverage its capabilities to suit their needs. The abilities of this software are compared with other similar products that generate data sets for use by IDS models.
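A small sketch of driving a capture from code with tshark, one of the two capture methods the abstract names, assuming tshark is on the PATH; the interface name, duration, and output file are placeholders.

```python
# Hypothetical scripted packet capture using tshark (part of Wireshark).
import subprocess

INTERFACE = "eth0"       # assumed capture interface
OUTPUT = "capture.pcap"  # destination file for raw packets

# -i: interface, -a duration:N: stop after N seconds, -w: write pcap.
subprocess.run(
    ["tshark", "-i", INTERFACE, "-a", "duration:60", "-w", OUTPUT],
    check=True,
)
print(f"wrote {OUTPUT}")
```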
This document discusses reasons for disliking digital forensics and identifies areas for improvement. It begins by introducing the author's background and motivation. The document then examines issues with naming conventions, tools/practices, standards/definitions, training/certification, and subfields. Key problems highlighted include a lack of standardization, compatibility issues between tools, outdated mindsets, and insufficient computing foundations in training. The author advocates treating digital forensics as an engineering science and applying best computing practices. Overall, the document critically analyzes challenges currently facing the field and questions how these issues may impact the future if not addressed.
Digital media refers to on-demand access to content from any device, along with interactive user feedback. It includes any media that is machine-readable, such as films, graphics, videos, games, and web pages. When creating a horror trailer, a digital camcorder will be used to film footage that can be edited in Adobe Premiere to add sound, cut scenes, and adjust speeds. The completed trailer will then be uploaded to YouTube to share it worldwide.
Digital technology refers to on-demand access to content from any digital device as well as interactive user feedback. It includes internet technologies like websites and social media that are digital, networkable, dense, compressible and allow for interactivity. When creating a horror movie trailer, different digital technologies must be used for filming, producing, burning to disc, packaging, and uploading to YouTube to make the trailer available worldwide.
Digital technology refers to on-demand access to content from any device anywhere and allows for interactive engagement and community formation around media. New media is often digital, with characteristics of manipulation, connectivity, density, compression and interactivity like the internet, websites and computer games. When creating a horror trailer, a variety of digital technologies must be used such as filming, editing in PremierePro, burning to disc, designing packaging in Photoshop, and uploading to YouTube for worldwide access.
Digital technology refers to digitized content that can be transmitted over computer networks and the internet, and includes websites, multimedia, video games, and DVDs. When creating a horror film, digital technology is used for filming, editing using software like Premiere Pro, and designing marketing materials in Photoshop. Finally, distributing the horror trailer online through YouTube or Vimeo makes it accessible digitally to a wide audience.
Cybercrime, Digital Investigation and Public Private Partnership by Francesca... (Tech and Law Center)
The document discusses cybercrime and digital investigation. It begins with defining cybercrime and listing its common forms. It then discusses the underground economy of cybercrime, describing how criminal networks operate similarly to legitimate businesses. Several specific cybercrimes are examined in depth, including malware, data theft, identity theft, phishing, and botnets. The document also profiles some case studies of major cybercriminal groups and hacking incidents to illustrate how crimes are committed. It aims to outline the scope and techniques of cybercrime threats.
This document discusses the nature of computer-based electronic evidence and the devices and considerations involved in digital investigation. It covers topics such as latent evidence stored on computers, fragility of electronic evidence, devices that may contain evidence like computers, networks, and other digital devices. It also summarizes laws and guidelines related to digital investigation in the UK.
The paper emphasizes the human aspects of cyber incidents concerning protecting information and
technology assets by addressing behavioral analytics in cybersecurity for digital forensics applications.
The paper demonstrates the human vulnerabilities associated with information systems technologies and
components. This assessment is based on past literature assessments done in this area. This study also
includes analyses of various frameworks that have led to the adoption of behavioral analysis in digital
forensics. The study's findings indicate that behavioral evidence analysis should be included as part of the
digital forensics examination. The provision of standardized investigation methods and the inclusion of
human factors such as motives and behavioral tendencies are some of the factors attached to the use of
behavioral digital forensic frameworks. However, the study also appreciates the need for a more
generalizable digital forensic method.
Review on effectiveness of deep learning approach in digital forensics (IJECEIAES)
Cyber forensics is the use of scientific methods for the definite description of cybercrime activities. It deals with collecting, processing and interpreting digital evidence for cybercrime analysis, and plays a very important role in criminal investigations. Although a lot of research has been done in cyber forensics, the field can still expect to face new challenges in the near future. Analysis of digital media, specifically photographic images and audio and video recordings, is crucial in forensics. This paper focuses specifically on digital forensics. There are several methods for digital forensic analysis; currently deep learning (DL), mainly the convolutional neural network (CNN), has proved very promising in the classification of digital images and in sound analysis techniques. This paper presents a compendious study of recent research and methods in forensic areas based on CNNs, with a view to guiding researchers working in this area. We first define and explain preliminary DL models. In the next section, out of several DL models, we focus on the CNN and its usage in areas of digital forensics. Finally, conclusions and future work are discussed. The review shows that the CNN has proved good in most forensic domains and still promises to be better.
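For readers new to the model family under review, here is a minimal PyTorch sketch of the kind of CNN classifier discussed, e.g. for labelling an image as original versus tampered; the two-class setup, input size and layer sizes are illustrative assumptions, not taken from any surveyed paper.

```python
# A small CNN that maps a 64x64 RGB image to two class logits.
import torch
import torch.nn as nn

class ForensicCNN(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):                      # x: (batch, 3, 64, 64)
        x = self.features(x)                   # -> (batch, 32, 16, 16)
        return self.classifier(x.flatten(1))

model = ForensicCNN()
logits = model(torch.randn(4, 3, 64, 64))      # 4 dummy 64x64 RGB images
print(logits.shape)                            # torch.Size([4, 2])
```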
Enhancements in the world of digital forensics (IAESIJAI)
Currently, the rapid advancement of computer systems and mobile phones has resulted in their utilization in unlawful acts. Ensuring adequate and effective security measures poses a difficult task due to the intricate nature of these devices, thereby exacerbating the challenges associated with investigating crimes involving them. Digital forensics, which involves investigating cyber crimes, plays a crucial role in this realm. Extensive research has been conducted in this field to aid forensic investigations in addressing contemporary obstacles. This paper aims to explore the progress made in the applications of digital forensics and security, encompassing various aspects, and provide insights into the evolution of digital forensics over the past five years.
Forensics denotes detective work: searching for, and attempting to discover, information. The search is mainly carried out to collect evidence for investigation, useful in criminal, civil or corporate matters, and an investigation applies only in the presence of legal rules.
As criminals get smarter at committing crime, using data-hiding techniques such as encryption and steganography, forensic departments have become alert and introduced a new concept called digital forensics, which handles sensitive and confidential data responsibly.
Optimised Malware Detection in Digital Forensics (IJNSA Journal)
This summarizes a research paper that proposes developing a new framework to optimize malware detection in digital forensics investigations. The paper discusses challenges with existing detection methods, such as signature-based approaches requiring extensive manual analysis. Through a market research survey of forensics professionals, the paper finds weaknesses in current skills, tools, and accuracy rates. Most respondents agreed a new customized detection tool is needed that employs both dynamic and static analysis methods. The proposed framework aims to address these issues to more effectively detect and analyze malware.
Proposed Effective Solution for Cybercrime Investigation in Myanmar (theijes)
The rapid growth of ICT creates new attack surfaces for cybercrime forensics. In society, information is the new frontier for security, privacy, and cybercrime challenges. In this paper, an applicable framework is proposed for cybercrime forensic investigation in Myanmar, known as CCFIM. Using standard cyber laws and policies, cybercrime forensic investigation can provide an ethical, secure and monitored computing environment. This framework provides secure analysis of both logical and physical data extractions. Acceptable evidence can be obtained by examining sensible clues from digital devices such as computers, mobile smartphones, tablets, GPS and IoT devices, whether traditional or cloud-based. The most important part of a forensic investigation is to gather "relevant" and "acceptable" information to serve as cyber evidence in court. Therefore, forensic investigators need to understand how file system timestamps work. This paper emphasizes the comparative timestamps of various file systems and Windows operating systems.
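As a small illustration of the timestamp evidence the paper stresses, the sketch below reads a file's access, modification and change/creation times; the path is hypothetical, and note that st_ctime means creation time on Windows but inode-change time on Unix-like systems.

```python
# Inspect file system timestamps for a (hypothetical) exhibit file.
import os
from datetime import datetime, timezone

st = os.stat("evidence/report.docx")       # hypothetical path
for name, ts in (("accessed", st.st_atime),
                 ("modified", st.st_mtime),
                 ("created (Windows) / changed (Unix)", st.st_ctime)):
    print(name, datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
```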
A Novel Methodology for Offline Forensics Triage in Windows Systems (IRJET Journal)
This document presents a novel methodology for offline cyber forensics triage of Windows systems. The methodology analyzes key artifacts like prefetch files, registry files, $USNJrnl files, browser files, and event logs to retrieve crucial evidence. This evidence includes lists of recently deleted/modified/accessed files, suspicious programs used, network information, timestomped files, web browser activity, and user account activity. Analyzing select artifacts can guide investigators and prioritize files to analyze, aiding more efficient investigations. The methodology was tested through experiments that could detect recent deleted activities and traces left by anti-forensics tools.
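A minimal sketch of one such triage step, assuming an offline Windows image mounted at a hypothetical path: listing Prefetch entries by modification time, which roughly tracks when each program last ran.

```python
# List the ten most recently modified Prefetch files from a mounted image.
from pathlib import Path
from datetime import datetime, timezone

prefetch_dir = Path("mounted_image/Windows/Prefetch")   # hypothetical mount point
entries = sorted(prefetch_dir.glob("*.pf"),
                 key=lambda p: p.stat().st_mtime, reverse=True)
for pf in entries[:10]:                                  # recent program executions
    when = datetime.fromtimestamp(pf.stat().st_mtime, tz=timezone.utc)
    print(when.isoformat(), pf.name)                     # e.g. CMD.EXE-89305D47.pf
```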
Network Forensic Investigation of HTTPS Protocol (IJMER)
Systematic Digital Forensic Investigation Model (CSCJournals)
Law practitioners are in an uninterrupted battle with criminals over the application of digital/computer technologies, and require the development of a proper methodology to systematically search digital devices for significant evidence. Computer fraud and digital crimes are growing day by day and, unfortunately, less than two percent of reported cases result in a conviction. This paper explores the development of the digital forensics process model, compares digital forensic methodologies, and finally proposes a systematic model of the digital forensic procedure. This model attempts to address some of the shortcomings of previous methodologies, and provides the following advantages: a consistent, standardized and systematic framework for the digital forensic investigation process; a framework which a team can work through systematically according to the captured evidence; a mechanism for adapting the framework to a country's digital forensic investigation technologies; and a generalized methodology that judicial members can use to relate technology to non-technical observers. The paper presents a brief overview of previous forensic models, proposes a new model inspired by the DFRWS Digital Investigation Model, and finally compares it with previous models to show its relevance. The proposed model explores the different processes involved in the investigation of cyber crime and cyber fraud in the form of an eleven-stage model. The Systematic Digital Forensic Investigation Model (SDFIM) has been developed with the aim of helping forensic practitioners and organizations set up appropriate policies and procedures in a systematic manner.
Applying Data Mining Principles in the Extraction of Digital Evidence (Dr. Richard Otieno)
This document summarizes research on applying data mining principles to digital evidence extraction. It begins with an introduction to data mining and its increasing role in criminal forensics. The paper then reviews digital forensic investigation models and how data mining fits within the analysis phase. Various data mining techniques are described that could support forensic investigations, such as association rule mining, outlier analysis, and support vector machines. The document concludes that integrating data mining principles can help boost the performance and reliability of digital investigations.
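To ground one of the named techniques, the sketch below runs outlier analysis with scikit-learn's IsolationForest on synthetic stand-in data; a real investigation would substitute features extracted from the evidence.

```python
# Flag anomalous records in a feature matrix via Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))              # 500 "normal" records
X[:5] += 6                                 # plant a few extreme outliers

clf = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = clf.predict(X)                     # -1 marks outliers, 1 marks inliers
print("flagged rows:", np.where(flags == -1)[0])
```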
In this paper, we introduce a technique of digital forensics for the reconstruction of events or evidence after the commission of a crime through any digital device. It draws a clear distinction between computer forensics and digital forensics and gives a brief description of the classification of digital forensics. It also describes how the emergence of various digital forensic models helps digital forensic practitioners and examiners carry out digital forensics. Further, it discusses the merits and demerits of the models concerned and reviews every major model.
This document discusses a case study involving a small-medium enterprise (SME) that has experienced anomalies in its accounting and product records. The SME has hired a digital forensic investigator to determine if any malicious activity has occurred and ensure its systems are free of malware. The investigator will conduct a malware investigation and digital forensic investigation following the four principles of the Association of Chief Police Officers (ACPO) guidelines. The investigator will use the Four Step Forensics Process (FSFP) model to identify, preserve, extract, and analyze evidence from the SME's systems to determine the cause of problems and make recommendations.
The document provides an overview of cyber forensics. It discusses how cyber forensics has become important for investigations due to increasing internet crimes. It outlines the typical phases of cyber forensics investigations - identification, acquisition, analysis, and reporting. The identification phase deals with identifying evidence and preserving the chain of custody. The acquisition phase involves creating copies of digital evidence like hard disk images. The analysis phase examines the acquired evidence to find relevant pieces. Finally, the reporting phase documents the findings and conclusions. A variety of cyber forensics tools are also mentioned.
Intrusion Detection System using Data Mining (IRJET Journal)
This document presents a proposed intrusion detection system using data mining techniques. It begins with an abstract that describes how internal intrusions are difficult to detect as internal users know the organization's information. It then discusses how anomaly detection can be used to create behavior profiles for each user and detect anomalous activities. The introduction provides background on intrusion detection systems and the need for more efficient and effective detection methods. It describes the proposed system which will use data mining techniques like k-means clustering to separate normal and abnormal network activities in order to detect internal attacks. It discusses the hardware and software requirements and specifications. Finally, it concludes that the proposed system can better detect anomalies in the network compared to other machine learning approaches.
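A minimal sketch of the k-means idea the proposal rests on, using scikit-learn on synthetic per-connection features; treating the smaller, distant cluster as anomalous is a simplifying assumption for illustration.

```python
# Cluster activity features and treat the small cluster as anomalous.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
normal = rng.normal(loc=0.0, scale=1.0, size=(300, 3))     # routine activity
abnormal = rng.normal(loc=6.0, scale=0.5, size=(12, 3))    # internal misuse
X = np.vstack([normal, abnormal])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
sizes = np.bincount(km.labels_)
anomalous_cluster = int(np.argmin(sizes))                   # the smaller cluster
print("suspected anomalies:", np.where(km.labels_ == anomalous_cluster)[0])
```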
Design for A Network Centric Enterprise Forensic System (CSCJournals)
The increased profitability and exposure of enterprise information incite more attackers to attempt exploitation of enterprise networks while striving not to leave any evidence. Although the area of digital forensic analysis is maturing in modern criminology, the scope of network and computer forensics in the large-scale commercial environment is still vague. Conventional forensic techniques, consisting of a large proportion of manual operations and isolated processes, are not adequately compatible with the modern enterprise context: enterprise data volumes are usually overwhelming, and interference with business operations during an investigation is unwelcome. To evidence and monitor these increasing and evolving cyber offences and criminals, forensic investigators are calling for a more comprehensive forensic methodology. To understand current insufficiencies, this paper starts by probing the weaknesses of various preliminary forensic techniques. It then proposes an approach to designing an enhanced forensic system that integrates the network distributed system concept and information fusion theory as a remedy to the drawbacks of existing forensic techniques.
The Anti-Forensics Challenge Kamal Dahbur [email pro.docx (mehek4)
The Anti-Forensics Challenge
Kamal Dahbur
[email protected]
Bassil Mohammad
[email protected]
School of Engineering and Computing Sciences
New York Institute of Technology
Amman, Jordan
ABSTRACT
Computer and Network Forensics has emerged as a new field in
IT that is aimed at acquiring and analyzing digital evidence for
the purpose of solving cases that involve the use, or more
accurately misuse, of computer systems. Many scientific
techniques, procedures, and technological tools have been
evolved and effectively applied in this field. On the opposite
side, Anti-Forensics has recently surfaced as a field that aims at
circumventing the efforts and objectives of the field of computer
and network forensics. The purpose of this paper is to highlight
the challenges introduced by Anti-Forensics, explore the various
Anti-Forensics mechanisms, tools and techniques, provide a
coherent classification for them, and discuss thoroughly their
effectiveness. Moreover, this paper will highlight the challenges
seen in implementing effective countermeasures against these
techniques. Finally, a set of recommendations are presented with
further seen research opportunities.
Categories and Subject Descriptors
K.6.1 [Management of Computing and Information
Systems]: Projects and People Management – System Analysis
and Design, System Development.
General Terms
Management, Security, Standardization.
Keywords
Computer Forensics (CF), Computer Anti-Forensics (CAF),
Digital Evidence, Data Hiding.
1. INTRODUCTION
The use of technology is increasingly spreading
covering various aspects of our daily lives. An equal increase, if
not even more, is realized in the methods and techniques created
with the intention to misuse the technologies serving varying
objectives being political, personal or anything else. This has
clearly been reflected in our terminology as well, where new
terms like cyber warfare, cyber security, and cyber crime,
amongst others, were introduced. It is also noticeable that such
attacks are getting increasingly more sophisticated, and are
utilizing novel methodologies and techniques. Fortunately, these
attacks leave traces on the victim systems that, if successfully
recovered and analyzed, might help identify the offenders and
consequently resolve the case(s) justly and in accordance with
applicable laws. For this purpose, new areas of research emerged
addressing Network Forensics and Computer Forensics in order
to define the foundation, practices and acceptable frameworks
for scientifically acquiring and analyzing digital evidence to
be presented in support of filed cases. In response to forensics
efforts, Anti-Forensics tools and techniques were created with
the main objective of frustrating forensics efforts and tainting
their credibility and reliability.
This paper attempts to provide a clear definition for Computer
Anti-Forensics and consolidates various aspects of the topi ...
Similar to A Proactive Approach in Network Forensic Investigation Process (20)
Text Mining in Digital Libraries using OKAPI BM25 Model (Editor IJCATR)
The emergence of the internet has made vast amounts of information available and easily accessible online. As a result, most libraries have digitized their content in order to remain relevant to their users and to keep pace with the advancement of the internet. However, these digital libraries have been criticized for using inefficient information retrieval models that do not apply relevance ranking to retrieved results. This paper proposes the use of the Okapi BM25 model in text mining as a means of improving the relevance ranking of digital libraries. Okapi BM25 was selected because it is a probability-based relevance ranking algorithm. A case study was conducted and the model design was based on information retrieval processes. The performance of the Boolean, vector space, and Okapi BM25 models was compared for data retrieval; relevant ranked documents were retrieved and displayed on the OPAC framework search page. The results revealed that Okapi BM25 outperformed the Boolean and vector space models. Therefore, this paper proposes the use of the Okapi BM25 model, which rewards terms according to their relative frequencies in a document, to improve the performance of text mining in digital libraries.
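For concreteness, here is a minimal sketch of the Okapi BM25 score on a toy corpus; k1 = 1.5 and b = 0.75 are the conventional defaults, not values taken from the paper.

```python
# Okapi BM25: score a document against a query over a small corpus.
import math
from collections import Counter

def bm25_score(query, doc, corpus, k1=1.5, b=0.75):
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N          # average document length
    tf = Counter(doc)
    score = 0.0
    for term in query:
        df = sum(1 for d in corpus if term in d)     # document frequency
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1)
        f = tf[term]
        score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(doc) / avgdl))
    return score

corpus = [["digital", "library", "search"],
          ["library", "catalogue"],
          ["text", "mining", "digital", "library"]]
print(bm25_score(["digital", "library"], corpus[2], corpus))
```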
Green Computing, eco trends, climate change, e-waste and eco-friendly (Editor IJCATR)
This document discusses green computing practices and sustainable IT services. It provides an overview of factors driving adoption of green computing to reduce costs and environmental impact of data centers, such as rising energy costs and density. Green strategies discussed include improving infrastructure efficiency, power management, thermal management, efficient product design, and virtualization to optimize resource utilization. The document examines how green computing aims to lower costs and environmental footprint, and how sustainable IT services take a broader approach considering economic, environmental and social impacts.
Policies for Green Computing and E-Waste in Nigeria (Editor IJCATR)
Computers today are an integral part of individuals' lives all around the world, but unfortunately these devices are toxic to the environment, given the materials used, their limited battery life and technological obsolescence. Individuals are concerned about the hazardous materials ever present in computers, even if the importance they attach to various attributes differs, and a more environment-friendly attitude can be fostered through exposure to educational materials. In this paper, we aim to delineate the problem of e-waste in Nigeria, highlight a series of measures and the advantages they herald for our country, and propose a series of action steps to develop these areas further. It is possible for Nigeria to achieve an immediate economic stimulus and job creation while moving quickly to abide by the requirements of climate change legislation and energy efficiency directives. The costs of implementing energy efficiency and renewable energy measures are minimal, as they are not cash expenditures but rather investments paid back by future, continuous energy savings.
Performance Evaluation of VANETs for Evaluating Node Stability in Dynamic Sce... (Editor IJCATR)
Vehicular ad hoc networks (VANETs) are a promising area of research enabling interconnection among moving vehicles and between mobile units (vehicles) and road side units (RSUs). In VANETs, mobile vehicles can be organized into clusters to promote interconnection links, and the cluster arrangement, according to size and geographical extent, has a serious influence on the quality of communication. VANETs are a subclass of mobile ad hoc networks involving more complex mobility patterns; because of mobility, the topology changes very frequently, which raises a number of technical challenges including the stability of the network. There is a need for a cluster configuration leading to a more stable, realistic network. The paper investigates various simulation scenarios in which clusters are generated using the k-means algorithm and their numbers are varied to find the more stable configuration in a real road scenario.
Optimum Location of DG Units Considering Operation Conditions (Editor IJCATR)
The optimal sizing and placement of Distributed Generation units (DG) are becoming very attractive to researchers these days. In this paper a two stage approach has been used for allocation and sizing of DGs in distribution system with time varying load model. The strategic placement of DGs can help in reducing energy losses and improving voltage profile. The proposed work discusses time varying loads that can be useful for selecting the location and optimizing DG operation. The method has the potential to be used for integrating the available DGs by identifying the best locations in a power system. The proposed method has been demonstrated on 9-bus test system.
Analysis of Comparison of Fuzzy Knn, C4.5 Algorithm, and Naïve Bayes Classifi... (Editor IJCATR)
Early detection of diabetes mellitus (DM) can prevent or inhibit complications. Several laboratory tests must be done to detect DM, and the results of these tests are then converted into training data. The training data used in this study were generated from the UCI Pima database, with 6 attributes used to classify diabetes as positive or negative. Of the various classification methods in common use, three were compared in this study on one identical case: fuzzy KNN, the C4.5 algorithm and the Naïve Bayes Classifier (NBC). The objective was to create software to classify DM using the tested methods and to compare the three on accuracy, precision, and recall. The results showed that the best method was fuzzy KNN, with average and maximum accuracy reaching 96% and 98%, respectively. In second place, NBC had respective average and maximum accuracies of 87.5% and 90%. Lastly, the C4.5 algorithm had average and maximum accuracies of 79.5% and 86%, respectively.
Web Scraping for Estimating new Record from Source Site (Editor IJCATR)
Studies in the field of competitive intelligence and studies in the field of web scraping have a mutualistic, symbiotic relationship. In today's information age, the website serves as a main data source. The research focuses on how to get data from websites and how to slow down the download intensity. One problem is that website sources are autonomous, so the structure of their content is vulnerable to change at any time; another is that the Snort intrusion detection system installed on the server can detect crawler bots. The researchers therefore propose the Mining Data Records (MDR) method together with exponential smoothing, so that the crawler adapts to changes in content structure and fetches automatically, following the pattern of news occurrences. In tests with a threshold of 0.3 for MDR and a similarity threshold score of 0.65 for STM, recall and precision values produced an average f-measure of 92.6%. The exponential smoothing estimate with α = 0.5 produced an MAE of 18.2 duplicate data records, reducing duplicates from 21.8 under a fixed download/fetch schedule to 3.6 by following the average time of news occurrence.
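The smoothing step here is simple exponential smoothing, s_t = α·x_t + (1 − α)·s_{t−1}; the sketch below applies it with the paper's α = 0.5 to a toy series of per-fetch record counts.

```python
# Simple exponential smoothing over a series of observed record counts.
def exp_smooth(xs, alpha=0.5):
    s = xs[0]
    out = [s]
    for x in xs[1:]:
        s = alpha * x + (1 - alpha) * s   # s_t = a*x_t + (1-a)*s_{t-1}
        out.append(s)
    return out

new_records = [21, 25, 18, 22, 30, 19]    # records observed per fetch (toy data)
print(exp_smooth(new_records))            # the estimate paces the next fetch
```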
Evaluating Semantic Similarity between Biomedical Concepts/Classes through S... (Editor IJCATR)
Most existing semantic similarity measures that use ontology structure as their primary source can measure semantic similarity between concepts/classes using a single ontology. Ontology-based semantic similarity techniques, including structure-based techniques (the Path Length measure, Wu and Palmer's measure, and Leacock and Chodorow's measure), information-content-based techniques (Resnik's measure, Lin's measure), and biomedical domain ontology techniques (Al-Mubaid and Nguyen's measure, SemDist), were evaluated relative to human experts' ratings and compared on sets of concepts using the ICD-10 "V1.0" terminology within the UMLS. The experimental results validate the efficiency of the SemDist technique in a single ontology and demonstrate that, compared with the existing techniques, it gives the best overall correlation with experts' ratings.
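The study itself works on ICD-10 within the UMLS, which requires licensed terminology data; as a stand-in, the sketch below computes the same three structure-based measures on WordNet through NLTK, and the synset names are illustrative choices.

```python
# Structure-based similarity measures on WordNet (stand-in for UMLS).
from nltk.corpus import wordnet as wn   # requires: nltk.download("wordnet")

a, b = wn.synset("fever.n.01"), wn.synset("infection.n.01")
print("Path Length:", a.path_similarity(b))        # 1 / shortest path length
print("Wu & Palmer:", a.wup_similarity(b))         # LCS depth vs. concept depths
print("Leacock-Chodorow:", a.lch_similarity(b))    # -log(path / (2 * max depth))
```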
Semantic Similarity Measures between Terms in the Biomedical Domain within f... (Editor IJCATR)
Techniques and tests are tools used to define how to measure the goodness of an ontology or its resources. Measuring the similarity between biomedical classes/concepts is an important task for biomedical information extraction and knowledge discovery, and most semantic similarity techniques can be adapted for use in the biomedical domain (UMLS). Many experiments have been conducted to check the applicability of these measures. In this paper, we investigate measuring semantic similarity between two terms within a single ontology or across multiple ontologies in ICD-10 "V1.0" as the primary source, and compare the results to human experts' scores using the correlation coefficient.
A Strategy for Improving the Performance of Small Files in Openstack Swift (Editor IJCATR)
Adding an aggregated storage module is an effective way to improve the storage access performance of small files in Openstack Swift. Because Swift incurs heavy disk operations when querying metadata, its transfer performance for large numbers of small files is low. In this paper, we propose an aggregated storage strategy (ASS) and implement it in Swift. ASS comprises two parts: merge storage and index storage. In the first stage, ASS arranges the write request queue in chronological order and stores objects in volumes, which are large files actually stored in Swift. In the second stage, the object-to-volume mapping information is stored in a key-value store. The experimental results show that ASS can effectively improve Swift's small file transfer performance.
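A minimal sketch of the two ASS stages as described, not of Swift's actual code: small objects are appended into a large volume file (merge storage) while a key-value index records the object-to-volume mapping (index storage); all names here are illustrative.

```python
# Merge storage + index storage for small objects, in miniature.
index = {}                                   # object name -> (volume, offset, size)

def put(volume_path, name, data):
    with open(volume_path, "ab") as vol:     # merge storage: append to a big file
        vol.seek(0, 2)                       # position at end of file
        offset = vol.tell()
        vol.write(data)
    index[name] = (volume_path, offset, len(data))

def get(name):
    volume_path, offset, size = index[name]  # index storage: one lookup, one read
    with open(volume_path, "rb") as vol:
        vol.seek(offset)
        return vol.read(size)

put("volume_0001.dat", "thumb_42.jpg", b"...jpeg bytes...")
print(get("thumb_42.jpg"))
```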
Integrated System for Vehicle Clearance and Registration (Editor IJCATR)
Efficient management and control of a government's cash resources rely on government banking arrangements. Nigeria, like many low-income countries, employed fragmented systems in handling government receipts and payments. In 2016, Nigeria implemented a unified structure as recommended by the IMF, in which all government funds are collected in one account, reducing borrowing costs, extending credit and improving the government's fiscal policy, among other benefits. This situation motivated us to design and implement an integrated system for vehicle clearance and registration. The system complies with the new Treasury Single Account policy to enable proper interaction and collaboration among the five agencies (NCS, FRSC, SBIR, VIO and NPF) saddled with vehicular administration in Nigeria. Since the system is web based, the Object Oriented Hypermedia Design Methodology (OOHDM) is used, with tools such as PHP, JavaScript, CSS, HTML, AJAX and other web development technologies. The result is a web based system that gives proper information about a vehicle, from the exact date of importation to registration and licence renewal. Vehicle owner information, customs duty information, plate number registration details, and so on can be efficiently retrieved from the system by any of the agencies without contacting another agency. Also, the number plate will no longer be the only means of vehicle identification, as is presently the case in Nigeria: the unified system automatically generates and assigns a Unique Vehicle Identification Pin Number (UVIPN) on payment of duty, and the UVIPN is linked to the various agencies in the management information system.
Assessment of the Efficiency of Customer Order Management System: A Case Stu... (Editor IJCATR)
The Supermarket Management System deals with the automation of buying and selling of goods and services, covering both sales and purchases of items. The project is to be developed with the objective of making the system reliable, easier to use, faster, and more informative.
Energy-Aware Routing in Wireless Sensor Network Using Modified Bi-Directional A* (Editor IJCATR)
Energy is a key component in a Wireless Sensor Network (WSN) [1]: the system cannot run as intended without adequate power, and limited energy is one of the defining characteristics of wireless sensor networks [2]. Much research has been done to develop strategies to overcome this problem, one of which is clustering. A popular clustering technique is Low Energy Adaptive Clustering Hierarchy (LEACH) [3], in which clustering determines the Cluster Head (CH) that forwards packets to the Base Station (BS). In this research, we propose another clustering technique, which uses Betweenness Centrality (BC) from social network analysis and is implemented in the setup phase; in the steady-state phase, a heuristic search algorithm, Modified Bi-Directional A* (MBDA*), is implemented. The experiment statically deploys 100 nodes in a 100x100 area with one Base Station at coordinates (50,50) and runs for 5000 rounds to gauge the system's reliability. The designed routing protocol is evaluated on network lifetime, throughput, and residual energy. The results show that BC-MBDA* outperforms LEACH. This is driven by LEACH's dynamic way of determining the CH, which changes in every data transmission round and thus consumes energy on repeated computation; in contrast, BC-MBDA* determines the CH statically, decreasing energy usage.
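A minimal sketch of the setup-phase idea, assuming networkx and a random geometric graph as a stand-in for the 100x100 deployment: rank nodes by betweenness centrality and pick the most central ones as static cluster heads.

```python
# Select static cluster heads by betweenness centrality.
import networkx as nx

G = nx.random_geometric_graph(100, radius=0.2, seed=42)   # 100 sensor nodes
bc = nx.betweenness_centrality(G)
cluster_heads = sorted(bc, key=bc.get, reverse=True)[:5]  # 5 most central nodes
print("cluster heads:", cluster_heads)
```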
Security in Software Defined Networks (SDN): Challenges and Research Opportun... (Editor IJCATR)
In networks, the rapidly changing traffic patterns of search engines, Internet of Things (IoT) devices, Big Data and data centers have thrown up new challenges for legacy networks and prompted the need for a more intelligent and innovative way to dynamically manage traffic and allocate limited network resources. Software Defined Networking (SDN), which decouples the control plane from the data plane through network virtualization, aims to address these challenges. This paper explores the SDN architecture and its implementation with the OpenFlow protocol. It also assesses some of SDN's benefits over traditional network architectures, its security concerns and how they can be addressed in future research, along with related work in emerging economies such as Nigeria.
Measure the Similarity of Complaint Document Using Cosine Similarity Based on... (Editor IJCATR)
Report handling on the "LAPOR!" (Laporan, Aspirasi dan Pengaduan Online Rakyat) system depends on system administrators who manually read every incoming report [3]. Manual reading can lead to errors in handling complaints [4]; when the data flow is huge and grows rapidly, at least three days are needed to prepare a confirmation, and the process is sensitive to inconsistencies [3]. In this study, the authors propose a model that measures the similarity of an incoming query against archived documents. The authors employ a class-based indexing term weighting scheme and cosine similarity to analyse document similarities. The CoSimTFIDF, CoSimTFICF and CoSimTFIDFICF values are used as classification features for a K-Nearest Neighbour (K-NN) classifier. The optimum result is obtained with a 75%/25% training/test data split and the CoSimTFIDF feature, delivering a high accuracy of 84%; k = 5 obtains an accuracy of 84.12%.
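A minimal sketch of the CoSimTFIDF-style feature with scikit-learn: TF-IDF vectors for an incoming report and the archive, compared by cosine similarity; the report texts are invented, and the class-based (ICF) weighting is omitted for brevity.

```python
# Cosine similarity between an incoming report and archived reports.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

archive = ["road damage near the market", "street light broken again",
           "request to repair road surface"]
query = ["the road by the market is damaged"]

vec = TfidfVectorizer()
doc_vecs = vec.fit_transform(archive)
sims = cosine_similarity(vec.transform(query), doc_vecs)[0]
print(sims)     # similarity to each archived report; usable as a K-NN feature
```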
Hangul Recognition Using Support Vector Machine (Editor IJCATR)
Recognizing Hangul images is more difficult than recognizing Latin script because of the structural arrangement: Hangul is arranged in two dimensions, while Latin runs only from left to right. This research creates a system to convert Hangul images into Latin text for use as learning material for reading Hangul. In general, the image recognition system is divided into three steps. The first is preprocessing, which includes binarization, segmentation through the connected-component labeling method, and thinning with the Zhang-Suen algorithm to reduce pattern information. The second is extracting features from every single image, identified through the chain code method. The third is recognition using a Support Vector Machine (SVM) with several kernels, applied to both letter images and Hangul word recognition. The letter data consist of 34 letters, each with 15 different patterns, for 510 patterns in total, divided into 3 data scenarios. The highest result achieved is 94.7% using the SVM polynomial and radial basis function kernels; the recognition rate is influenced by the amount of training data. Hangul word recognition applies to type 2 Hangul words with 6 different patterns, arising from changes in font type. The fonts chosen for training are Batang, Dotum, Gaeul, Gulim and Malgun Gothic, with Arial Unicode MS used for testing. The lowest accuracy, 69%, is achieved with the SVM radial basis function kernel, while the linear and polynomial kernels both give 72%.
Application of 3D Printing in Education (Editor IJCATR)
This paper provides a review of literature concerning the application of 3D printing in the education system. The review identifies that 3D Printing is being applied across the Educational levels [1] as well as in Libraries, Laboratories, and Distance education systems. The review also finds that 3D Printing is being used to teach both students and trainers about 3D Printing and to develop 3D Printing skills.
Survey on Energy-Efficient Routing Algorithms for Underwater Wireless Sensor ... (Editor IJCATR)
In the underwater environment, a routing mechanism is used for the retrieval of information. The mechanism uses three to four types of nodes: sink nodes deployed on the water surface that collect information; courier/super/AUV (or dolphin) powerful nodes deployed in the middle of the water for forwarding packets; ordinary forwarder nodes deployed from the bottom to the surface; and source nodes deployed at the seabed that extract valuable information from the bottom of the sea. Underwater, the battery power of nodes is limited, and it can be conserved through better selection of the routing algorithm. This paper surveys energy-efficient routing algorithms and their routing mechanisms for prolonging node battery power, and analyses their performance to determine which route selection mechanism best prolongs it.
Comparative analysis on Void Node Removal Routing algorithms for Underwater W... (Editor IJCATR)
The design of routing algorithms faces many challenges in the underwater environment: propagation delay, acoustic channel behaviour, limited bandwidth, high bit error rates, limited battery power, underwater pressure, node mobility, 3D localization and deployment, and underwater obstacles (voids). This paper focuses on underwater voids, which affect the overall performance of the entire network. Most researchers have addressed void removal through alternate path selection mechanisms, but the research still needs improvement. The paper examines the architecture and operation of existing algorithms through their merits and demerits, and further presents an analytical performance comparison through which the better approach for the removal of voids is identified.
Decay Property for Solutions to Plate Type Equations with Variable Coefficients (Editor IJCATR)
In this paper we consider the initial value problem for a plate type equation with variable coefficients and memory in $\mathbb{R}^n$ ($n \ge 1$), which is of regularity-loss property. By using spectral resolution, we study the pointwise estimates in the spectral space of the fundamental solution to the corresponding linear problem. Appealing to these pointwise estimates, we obtain the global existence and the decay estimates of solutions to the semilinear problem by employing the fixed point theorem.
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Building Production Ready Search Pipelines with Spark and Milvus
A Proactive Approach in Network Forensic Investigation Process
International Journal of Computer Applications Technology and Research
Volume 5, Issue 5, 304-311, 2016, ISSN: 2319-8656
Joseph Mbugua Chahira, Garissa University College, Garissa, Kenya
Jane Kinanu Kiruki, Chuka University, Chuka, Kenya
Peter Kiprono Kemei, Egerton University, Nakuru, Kenya
Abstract
Information Assurance and Security (IAS) is a crucial component of the corporate environment, ensuring that the secrecy of sensitive data is protected, the integrity of important data is not violated, and the availability of critical systems is guaranteed. The advancement of information and communication technology into new eras and domains such as mobility and the Internet of Things, its ever-growing user base, and increasingly sophisticated cyber-attacks force organizations to deploy automated and robust defense mechanisms that manage the resulting digital security incidents in real time. Digital forensics is a scientific process that facilitates the detection of illegal activities and inappropriate behaviors using scientific tools, techniques, and investigation frameworks. This research aims at identifying processes that facilitate and improve the digital forensic investigation process. Existing digital forensic frameworks are reviewed, and the analysis is compiled to derive a network forensic investigation framework that includes evidence collection, preservation, and analysis at the sensor level and in real time. The aim is to discover complete relationships, with optimal performance, among known and unseen/new alerts generated by multiple network sensors in order to improve alert quality and recognize attack strategies.
Keywords: digital forensics, cybercrime, proactive network forensics, attack prediction, attack strategy.
1.0 Introduction
The modern enterprise relies heavily on electronic information systems to improve productivity and speed up processes, allowing new services, product development, and new business models. As a result, large amounts of information are generated, processed, distributed, and stored electronically via digital devices and computer networks. However, their vulnerabilities create opportunities for hostile users to perform malicious activities, exposing the underlying critical information to cyber threats and attacks (Healy et al., 2008; Alharbi et al., 2011).
Currently, finding the most effective way to secure information systems, networks, and sensitive data is a challenging task experienced by many organizations. The number of potential attackers targeting a given system has increased drastically, and the effects of successful attacks have become more serious: for instance, loss of funds, loss of client confidence, legal implications, and denial of service (Healy et al., 2008; PandaLabs, 2011). Skilled attackers frequently change their attack strategies and devise new methodologies to negatively affect the existence, amount, and quality of evidence generated for analysis, in order to defeat the implemented security mechanisms (Garfinkel et al., 2007; Will et al., 2011).
Information Assurance and Security is a crucial component in the corporate environment, ensuring that the secrecy of sensitive data is protected, the integrity of important data is not violated, and the availability of critical systems is guaranteed. It plays a key role in national health, the economy, and public security, and hence continues to be a research area in the pursuit of efficient, scalable, and intelligent systems that provide a comprehensive security management domain.
Digital forensics is a scientific process that facilitates the detection of illegal activities and inappropriate behaviors using scientific tools, techniques, and investigation frameworks, and involves diverse digital devices such as computer systems, networks, mobile devices, and storage devices (Pilli et al., 2010; Rahayu et al., 2008). It comprises a series of steps followed by security experts to obtain accurate and complete evidence that is forensically sound and acceptable in a court of law. The advancement of the Internet into new eras and domains such as mobility and the Internet of Things, its ever-growing user base, and sophisticated cyber-attacks demonstrate the need to deploy advanced IT security infrastructure to handle the current demands of network security (Wang et al., 2010; Maheyzah et al., 2015; Rahayu et al., 2009). Therefore, it is essential to develop a framework that provides tools, techniques, and procedures for collecting, preserving, and analyzing large heterogeneous datasets and system information in a structured way, and for supplying detailed and complete information to IT security management in real time.
This work proposes a network forensic investigation framework for detecting, predicting, and managing cyber-security incidents in a real-time, multiple-sensor environment. The objective is achieved through a series of steps, first by examining existing digital forensic investigation frameworks; this study allowed us to identify the missing parts and the drawbacks of those systems. The next section provides the proposed design for an effective framework to improve the whole forensic investigation process. Lastly, we conclude the paper and present potential future work.
2.0 Existing Digital Forensic Investigation Frameworks
Digital forensic approaches are generally categorized into three sections: the Integrated Digital Investigation Process (IDIP) framework, general network forensics approaches, and proactive approaches (Carrier et al., 2003).
2.1 Integrated Digital Investigation Process (IDIP) Framework
The IDIP framework (Carrier et al., 2003) is based on the investigation process of a physical crime scene. The framework has seventeen phases: readiness (operations and infrastructure) phases, deployment (detection and notification, confirmation and authorization) phases, physical crime scene investigation (preservation, survey, documentation, search and collection, reconstruction, and presentation) phases, digital crime scene investigation (preservation, survey, documentation, search and collection, reconstruction, and presentation) phases, and a review phase. The main limitation of the IDIP-based framework lies in the deployment phase, which treats confirmation of the incident as independent of the physical and digital investigation phases. In practice, it seems impossible to confirm a digital or computer crime unless and until some preliminary physical and digital investigation is carried out. The framework also does not offer sufficient specificity and does not draw a clear distinction between investigations at the victim's (secondary crime) scene and those at the scene where the first criminal act occurred (primary scene), nor does it reflect the process of arriving at the latter. Since a computer can be used both as a tool and as a victim, it is common for investigations to be carried out at both ends so that accurate reflections are made. The process of tracing back suspects is also very challenging when dealing with larger networks.
The End-to-End Digital Investigation Process (Carrier et al., 2004) contains nine phases: evidence collection, analysis of individual events, preliminary correlation, event normalization, event deconfliction, second-level correlation, timeline analysis, chain-of-evidence construction, and corroboration. It combines the tools of traditional investigative methods. The focus of the model is on the analysis process, particularly the correlation, normalization, and deconfliction of events reported from different locations. While the model differs from the other models in the attention it gives to analysis, it does not give enough consideration to evidence searching and finding, which is a complex and time-consuming process. The model was an advancement in that it permits formal verification, unlike the preceding models: any state changes that occurred during the course of the event are clearly represented without providing technical details of the incident.
An incident-response framework to help organizations investigate cybercrimes in a simple manner was developed by Mandia et al. (2003). The framework consists of seven components: pre-incident preparation, detection of incidents, initial response, formulation of a response strategy, investigation of the incident, reporting, and resolution. The analysis phase is included in the investigation component. The framework is limited in that the investigation component begins only after data has been collected from the same components.
The Enhanced Integrated Digital Investigation Process framework (Baryamureeba et al., 2006) consists of five major phases that include sub-phases: readiness (operation and infrastructure readiness), deployment (detection and notification, physical crime scene investigation, digital crime scene investigation, confirmation, and submission), traceback (digital crime scene investigation and authorization), dynamite (physical crime scene investigation, digital crime scene investigation, reconstruction, and communication), and a review phase. The framework separates the investigation processes into two phases, traceback and dynamite, which distinguish the investigations conducted at the primary and physical crime scenes and depict the other phases as iterative instead of linear.
The event-based digital forensic investigation framework (Carrier et al., 2003) is based on the physical crime scene. The framework consists of five phases that include sub-phases: readiness (operation and infrastructure readiness), deployment (detection and notification, confirmation and authorization), physical crime scene investigation (search and reconstruction), presentation, and the digital crime scene investigation phase. Each phase in this framework has a clear goal and requirements to achieve the expected results. However, the integrated phases, when combined, are insufficient to investigate real cybercrime cases because the completeness of each phase is not addressed (Rahayu et al., 2008).
The Computer Forensic Field Triage Process framework (Yong-Dal et al., 2008) has six phases: planning, triage, usage or user profiles, chronology or timeline, Internet activity, and case-specific evidence. The framework provides the identification, analysis, and interpretation of cybercrime evidence within a short time frame without the need to generate a complete forensic image in the lab. The main limitation of the model is its suitability for investigating all types of cybercrime, because evidence is very difficult to distinguish and collect.
The extended model of cybercrime investigation (Ciardhuáin et al., 2003) consists of thirteen phases: awareness, authorization, planning, notification, search and identification of evidence, collection, transport, storage, examination, hypotheses, presentation, proof or defense, and dissemination. This model is more comprehensive than the other IDIP frameworks because it encompasses almost all investigation activities, but it needs more evaluation in terms of scalability to ensure that it analyzes evidence efficiently. The model is also based on single-tier processes and focuses on the abstract layer in each phase. The advantage of single-tier processes is that they produce unambiguous outputs; their main limitation is that they reduce the scalability and flexibility of the investigation when more details are required from the user (Wei et al., 2005).
The Hierarchical Framework for Digital Investigations (Beebe et al., 2005) is a multi-tier, hierarchical framework to guide digital investigations. The framework has six phases, namely preparation, incident response, data collection, data analysis, presentation, and incident closure. The framework introduces objective-based phases and sub-phases in each layer of the first tier, with the ability to add more detail in advance to guide digital investigations, especially in data analysis. The main limitation of this framework is that it is incomplete and requires a more methodical approach to identify the objectives of each layer.
2.2 General Network Forensics Approaches
Evidence Graphs for Network Forensics Analysis (Wei et al., 2010) is a network forensics analysis mechanism that provides effective evidence presentation, manipulation, and automated reasoning. The model includes an evidence graph that facilitates the presentation and manipulation of intrusion evidence. For automated evidence analysis, the model has a hierarchical reasoning framework combining local reasoning and global reasoning: local reasoning aims to infer the roles of suspicious hosts from local observations, while global reasoning aims to identify groups of strongly correlated hosts in the attack and derive their relationships. The analysis step is the most comprehensive and sophisticated step. As the authors note, the local and global reasoning processes need to be refined with more realistic experiments, and methods for hypothesizing missing evidence and validating hypotheses need to be automated.
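To make the local/global reasoning distinction concrete, here is a minimal sketch, not the authors' implementation: it builds a weighted evidence graph from hypothetical alert tuples, infers a host's likely role from its local in/out edge weights, and uses connected components as a stand-in for globally correlated host groups.

```python
import networkx as nx

# Hypothetical alert records: (source host, target host, alert weight).
alerts = [
    ("10.0.0.5", "10.0.0.9", 0.8),   # e.g. exploit attempt
    ("10.0.0.5", "10.0.0.7", 0.6),   # e.g. scan
    ("10.0.0.9", "10.0.0.7", 0.9),   # e.g. lateral movement
]

# Build a weighted, directed evidence graph from the alerts.
g = nx.DiGraph()
for src, dst, w in alerts:
    # Accumulate weight when several alerts link the same host pair.
    if g.has_edge(src, dst):
        g[src][dst]["weight"] += w
    else:
        g.add_edge(src, dst, weight=w)

# Local reasoning: infer a host's role from its own in/out evidence.
for host in g.nodes:
    out_w = sum(d["weight"] for _, _, d in g.out_edges(host, data=True))
    in_w = sum(d["weight"] for _, _, d in g.in_edges(host, data=True))
    role = "attacker" if out_w > in_w else "victim"
    print(f"{host}: out={out_w:.1f} in={in_w:.1f} -> likely {role}")

# Global reasoning: find groups of strongly correlated hosts.
for group in nx.weakly_connected_components(g):
    print("correlated group:", sorted(group))
```

A real system would of course use richer role inference than comparing edge weights; the sketch only illustrates how the graph structure supports both levels of reasoning.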
The step-by-step framework (Kohn et al., 2006) merges the previous frameworks into a reasonably complete framework that groups all the existing processes into three stages, namely preparation, investigation, and presentation, which are implemented as guidelines in network forensics. The aim of the framework is to establish clear guidelines for the steps to be followed in a forensic process. However, how the framework addresses all phases of network forensics within these main stages is difficult to understand and needs clarification.
Forensics Zachman (FORZA) (Stephenson, 2003) is a framework that focuses on the legal rules and participants in the organization rather than on technical procedures. The framework solves complex problems by integrating the answers to the questions what (the data attributes), why (the motivation), how (the procedures), who (the people involved), where (the location), and when (the time). The FORZA framework includes eight roles: case leader, system or business owner, legal advisor, security or system architect or auditor, digital forensic specialist, digital forensic investigator or system administrator or operator, digital forensic analyst, and legal prosecutor. The main drawback of this framework is that it is human dependent: it requires more tools to conduct a network forensic analysis and to provide accurate results in the investigation phase.
The two-dimensional evidence reliability amplification process model (Khatir et al., 2008) consists of sixteen sub-phases grouped into five main phases, namely initialization, evidence collection, evidence examination or analysis, presentation, and case termination. The phases of the model are described in detail by identifying the roles of the inspector and manager in each phase. The model aims to answer cybercrime questions such as what happened, when did it happen, and who perpetrated the action, without considering cybercrime intention and strategy analysis (the why and how questions).
A similarity exists between incident response and computer forensics (Freiling et al., 2007), and the two present a common process model for both incident response and computer forensics to improve the investigation phase. The model includes a set of steps grouped into three main phases: pre-analysis (detection of incidents, initial response, and formulation of a response strategy), analysis (live response, forensic duplication, data recovery, harvesting, reduction, and organization), and post-analysis (report and resolution). Incident response is conducted in the model during the actual analysis. The procedures and methods of incident response are unclear in terms of the type of evidence utilized to analyze the incident; no standard method of detecting and collecting evidence exists, which produces insignificant evidence and affects the accuracy of the incident response.
The digital forensics investigation procedure model (Yong-Dal, 2008) consists of ten phases: investigation preparation, classifying the cybercrime and deciding investigation priority, investigating the damaged (victim) digital crime scene, criminal profiling consultation and analysis, tracking suspects, investigating the injurer's digital crime scene, summoning the suspect, additional investigation, writing the criminal profile, and writing the report. The model presents a block diagram without any technical details or methods for handling these phases, which indicates that the main focus was on the number and type of the network forensics phases rather than on how they work and how their outcomes are produced.
A categorization of the investigation process was done by Rahayu et al. (2008) to group and merge similar activities or processes that provide the same output into five phases: Phase 1 (preparation), Phase 2 (collection and preservation), Phase 3 (examination and analysis), Phase 4 (presentation and reporting), and Phase 5 (disseminating the case). The researchers also proposed a mapping of the digital forensic investigation process model to eliminate redundancy in the processes involved and to standardize the terms used in achieving the investigation goal.
2.3 Proactive Process Frameworks in Network Forensics
The Multi-Component View of Digital Forensics (Grobler et al., 2010) includes three components: proactive (ProDF), active (ActDF), and reactive (ReDF) digital forensics. The ProDF component defines and manages the processes and procedures for comprehensive digital evidence. ActDF includes four sub-phases: incident response and confirmation, ActDF investigation, event reconstruction, and ActDF termination. ReDF includes the sub-phases of incident response and confirmation, physical investigation, digital investigation, incident reconstruction, presentation of findings to management or the authorities, dissemination of the results of the investigation, and incident closure.
This is a theoretical framework intended to guide the implementation of proactive digital forensics and to ensure the forensic readiness of the evidence available for the investigation process. The framework helps organizations reduce the cost of the investigation process because it provides manageable components and live analysis. However, the components proposed in the high-level view make the implementation and automation of the framework more difficult, as automated tools are hard to create; additionally, the process contains phases, such as service restoration, that lie outside the scope of the investigation (Alharbi et al., 2011).
The Functional Process Model for Proactive and Reactive Digital Forensics (Alharbi et al., 2011) has two components. The first is the proactive digital forensic component, which includes five phases: proactive collection, an event-triggering function, proactive preservation, proactive analysis, and a preliminary report. The second is a reactive digital forensic component that also has five phases: identification, preservation, collection, analysis, and final report. The proposed proactive component is similar to the active component of the multi-component process, in that they share the same reactive component process. The examination and analysis phases are combined in the proposed process under a single phase called analysis. The limitations of this framework are that it has not yet been fully implemented and may need to be adapted to implementation requirements, and that it does not address all techniques used by anti-forensics methods, which could affect the ability of the components to resolve cybercrime in an efficient manner.
2.4 Generic Process Model for Network Forensics
The generic process model for network forensic analysis (Grobler et al., 2010) divides the phases into two groups. The first group operates in real time and includes five phases: preparation, detection, incident response, collection, and preservation. The four phases in the second group act as post-investigation phases: examination, analysis, investigation, and presentation. The first five phases work proactively because they operate during the occurrence of the cybercrime, saving time and cost during the investigation process. The first phase prepares the network forensic software and legal environment, such as the IDS, firewalls, packet analyzers, and authorization privileges. The second phase detects the nature of the attack by generating a set of alerts through the security tools. The third phase extends from the detection phase; it initializes the incident response based on the type of attack and organizational policy. The fourth phase, which also extends from the detection phase, collects network traffic through suitable hardware and software to guarantee the maximum collection of useful evidence. The fifth phase backs up the original data, preserves a hash of all trace data, and prepares a copy of the data for use in the analysis phase and other phases.
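As a concrete illustration of the preservation phase, the following minimal sketch hashes a captured trace file and verifies the analyst's working copy against the original digest; the file names are hypothetical, and the model itself does not prescribe this code.

```python
import hashlib
import shutil

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical capture file produced by the collection phase.
original = "capture.pcap"
working_copy = "capture_analysis.pcap"

baseline = sha256_of(original)           # preserve the hash of the trace data
shutil.copyfile(original, working_copy)  # analysts work on a copy, never the original

# Any later verification must reproduce the baseline digest exactly.
assert sha256_of(working_copy) == baseline, "working copy no longer matches evidence"
print("evidence digest:", baseline)
```

Recording the digest at collection time is what later allows the chain of evidence to be demonstrated: any tampering with the copy changes the hash and is immediately detectable.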
The other four phases of this model operate after the investigation phase and act as a reactive process. They begin with the examination phase, which integrates the trace data and identifies the attack indicators; the indicators are then prepared for the analysis phase. The seventh phase is the analysis phase, which reconstructs the attack indicators through soft computing, statistical, or data-mining techniques to classify and correlate the attack patterns. This phase aims to clarify the attack intentions and methodology through the attack patterns and provides feedback on how to improve the security tools. The eighth phase is the investigation phase, which aims to identify the path of the attack and the suitable incident response based on the results of the analysis phase. The final phase presents and documents the results, conclusions, and observations about the cybercrime. All the activities of network forensics are included in this model; the present research adopts the phases of this model as a baseline to show how the analysis phase integrates with the other phases.
In the generic framework, each of the first five phases requires a certain amount of time to accomplish its processes. Each phase works in real time; thus, the phases require the same amount of time and processing cost to accomplish their processes. Given that the other four phases work reactively, it is assumed that they require more time and processing cost than the first five phases. The reason for this assumption is that the reactive phases work after the cybercrime happens; therefore, the required time and cost increase during the investigation process.
3.0 Discussion and Analysis of Digital Forensic Frameworks
3.1 Summary of Existing Digital Forensic Frameworks
All the discussed techniques have advantages and disadvantages, as summarized in Table 1 below.
Table 1: Summary of existing digital forensic frameworks
Approach | Type | Limitations
Event-based digital forensic investigation framework (Carrier et al., 2003) | Reactive | The integrated phases, when combined, are insufficient to investigate real cybercrime cases because the completeness of each phase is not addressed.
Computer Forensic Field Triage Process framework (Yong-Dal et al., 2008) | Reactive | Evidence is very difficult to distinguish and collect.
Hierarchical Framework for Digital Investigations (Beebe et al., 2005) | Reactive | Incomplete; requires a more methodical approach to identify the objectives of each layer.
Step-by-step framework (Kohn et al., 2006) | Reactive | How the framework addresses all phases of network forensics within the main stages is difficult to understand and needs clarification.
Forensics Zachman (FORZA) digital forensics investigation framework (Stephenson, 2003) | Reactive | Requires more tools to conduct a network forensic analysis and to provide accurate results in the investigation phase.
Two-dimensional evidence reliability amplification process model (Khatir et al., 2008) | Reactive | Does not consider cybercrime intention and strategy analysis (the why and how questions).
Common process model for incident response and computer forensics (Freiling et al., 2007) | Reactive | No standard method of detecting and collecting evidence exists, which produces insignificant evidence and affects the accuracy of the incident response.
Digital forensics investigation procedure model (Yong-Dal, 2008) | Reactive | Presents a block diagram without any technical details or methods for handling the phases.
Mapping process in digital forensics (Rahayu et al., 2008) | Hybrid | The model was not implemented.
Generic process model for network forensics (Ricci et al., 2006) | Hybrid | The output of the examination and analysis phase is given without the methods and techniques that could be used to produce it.
Multi-component view of digital forensics (Grobler et al., 2010) | Hybrid | The components proposed in the high-level view make implementation and automation of the framework difficult (Alharbi et al., 2011).
Functional process model for proactive and reactive digital forensics (Alharbi et al., 2011) | Hybrid | Limited capabilities because it does not address all anti-forensic techniques.
Cyber crime resolving approach (Mohammad et al., 2013) | Hybrid | The modules of the proposed approach were neither discussed nor implemented.
From the existing frameworks discussed in the literature review, it is clear that a digital forensic investigation is a process consisting of several activities; although the frameworks may differ in the terms used and the order followed, they are all designed to achieve a similar objective. The proposed frameworks also build on the underlying experience to improve the existing ones.
3.2 Design Considerations in Developing Network Forensic Investigation Frameworks
The challenges in current network forensic frameworks include:
• Organizations tend to develop their own procedures focusing on technology aspects such as data acquisition or data analysis, so a change in the underlying technology forces new procedures to be developed; investigations should instead be built on the basic forensic procedures of preparation, investigation, and presentation (Satpathy et al., 2010; Kohn et al., 2006).
• Digital evidence is in a disorganized form, can be very difficult to handle, and is not all obviously readable by humans.
• During the collection process, the evidence must be searched, collected, analyzed, presented, and documented without tampering with the evidence and while preserving the chain of evidence.
• During the analysis process, the analysis tools used must be legally accepted and operated by experts or qualified persons, and the evidence must not be tampered with or lost.
• The huge amount of data collected from heterogeneous devices needs automated techniques to reduce redundancy, and consequently to reduce the analysis time and the storage requirements of the evidence (Noor et al., 2015; Rahayu et al., 2008).
• A proactive approach is needed to help response systems react before the network is compromised, and to provide opportunities to overcome the attacker's advantage by predicting the attacker's next action as a proactive step (Noor et al., 2015; Grobler et al., 2010).
• The investigation process should discover complete relationships, with optimal performance, among known and unknown attacks (Maheyzah et al., 2015).
• The approach to presenting and documenting the evidence should be understandable to non-technical persons such as a jury and judge, for example through graphs and tree diagrams rather than text alone.
4.0 Proposed Network Forensic Investigation Framework
The proposed theoretical framework can be categorized as both proactive and reactive, since it predicts future attacker actions before damage occurs and automatically responds to attacks in a timely manner. The proposed approach includes two major modules, linked together through a proactive depository:
i. Online alert collection and preprocessing
ii. Online and offline alert correlation and optimization
The proposed model's processes include evidence collection, evidence identification and classification, analysis, and investigation. The final phase presents and documents the results, conclusions, and observations about the cybercrime. These phases are distributed across the two modules and linked with the proactive depository.
4.1 Online Alert Collection and Preprocessing
The first module gathers alerts from heterogeneous sources in real time and preprocesses them through normalization and aggregation based on given features such as time, source IP, and destination address. It then classifies the intrusion according to the level of evidence accuracy, so that forensic professionals have a smaller scope of evidence to investigate and analyze. The results are stored in the evidence depository. The module includes the preparation, evidence collection, and normalization and aggregation phases, and it improves the investigation process by accurately identifying similar cybercrime cases for investigation.
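As an illustration of this module, the sketch below normalizes alerts from two hypothetically formatted sensors onto a common schema and aggregates them by signature, source, destination, and a 60-second time window; the field names and the window size are assumptions made for illustration, not part of the framework.

```python
from collections import defaultdict

# Hypothetical raw alerts from two differently formatted sensors.
raw_alerts = [
    {"ts": 1700000005, "src": "10.0.0.5", "dst": "10.0.0.9", "sig": "SCAN"},
    {"time": 1700000020, "source_ip": "10.0.0.5", "dest_ip": "10.0.0.9", "name": "SCAN"},
    {"ts": 1700000090, "src": "10.0.0.5", "dst": "10.0.0.7", "sig": "EXPLOIT"},
]

def normalize(alert: dict) -> dict:
    """Map sensor-specific field names onto one common schema."""
    return {
        "time": alert.get("ts", alert.get("time")),
        "src": alert.get("src", alert.get("source_ip")),
        "dst": alert.get("dst", alert.get("dest_ip")),
        "sig": alert.get("sig", alert.get("name")),
    }

# Aggregate normalized alerts that share signature, source, destination,
# and a 60-second time window; duplicates collapse into one meta-alert.
WINDOW = 60
groups = defaultdict(int)
for alert in map(normalize, raw_alerts):
    key = (alert["sig"], alert["src"], alert["dst"], alert["time"] // WINDOW)
    groups[key] += 1

for (sig, src, dst, window), count in groups.items():
    print(f"{sig} {src}->{dst} window={window}: {count} alert(s)")
```

In this toy input, the two SCAN alerts from different sensor formats collapse into one meta-alert, which is exactly the redundancy reduction the module is meant to provide.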
4.2 Online and Offline Alert Correlation and Optimization
The second module provides an analysis mechanism that includes effective alert correlation to improve the quality of alerts, integrate them with isolated alerts, and construct all possible attack scenarios; this can be done in either online or offline mode. It also prioritizes intrusion alerts. An evidence graph is generated to facilitate the presentation and manipulation of intrusion evidence, and, based on the evidence graph, an automated reasoning mechanism can be developed with the help of soft computing and advanced analytics for automated evidence analysis. The phase aims to identify the attack group, reconstruct the attack strategy, predict incoming attacks together with their intentions, and provide feedback on how to improve the security system.
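The following sketch suggests one way the correlation step could chain aggregated alerts into an attack scenario and prioritize them; the severity scores and the pivot rule (link two alerts when the target of one becomes the source of the next) are illustrative assumptions, not the framework's specification.

```python
# Hypothetical meta-alerts from the preprocessing module, ordered by time.
meta_alerts = [
    {"time": 100, "src": "10.0.0.5", "dst": "10.0.0.9", "sig": "SCAN"},
    {"time": 160, "src": "10.0.0.5", "dst": "10.0.0.9", "sig": "EXPLOIT"},
    {"time": 220, "src": "10.0.0.9", "dst": "10.0.0.7", "sig": "LATERAL"},
]

# Illustrative severity scores used to prioritize alerts.
SEVERITY = {"SCAN": 1, "EXPLOIT": 3, "LATERAL": 4}

# Causal correlation: chain alert B after alert A when B's source was
# A's destination (the compromised host attacks onward), or when the
# same host pair escalates to a more severe signature.
scenario = []
for a, b in zip(meta_alerts, meta_alerts[1:]):
    same_pair = (a["src"], a["dst"]) == (b["src"], b["dst"])
    pivots = a["dst"] == b["src"]
    if pivots or (same_pair and SEVERITY[b["sig"]] > SEVERITY[a["sig"]]):
        scenario.append((a["sig"], b["sig"]))

if scenario:
    steps = [scenario[0][0]] + [b for _, b in scenario]
    print("reconstructed attack scenario:", " -> ".join(steps))

# Prioritization: present the highest-severity alerts first.
for alert in sorted(meta_alerts, key=lambda x: SEVERITY[x["sig"]], reverse=True):
    print(alert["sig"], alert["src"], "->", alert["dst"])
```

On this toy input the chain SCAN -> EXPLOIT -> LATERAL is recovered, which corresponds to the module's goal of reconstructing the attack strategy from individually low-value alerts.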
Table 2: Summary of processes in the proposed framework

Module: Evidence collection and preprocessing
Phase: Preparation
• Attacker goal identification and hypothesis formulation
• Network configuration
• Privilege profile and trust setting
• Vulnerability and exploit permission
Phase: Evidence collection and preservation
• Data aggregation from different data sources
• Formatting and standardizing intrusion alerts
• Improving alert quality by filtering redundant and invalid alerts
• Dimensionality reduction

Module: Online and offline alert correlation and optimization
Phase: Analysis and examination
• Alert analysis through structural, causal, and statistical correlation techniques
• Filtering low-interest and false-positive intrusion alerts
• Discovering the attack scenario
• Verifying and prioritizing intrusion alerts
• Forecasting the attacker's next action
• Forecasting forthcoming attacks

Module: Evidence presentation and dissemination
Phase: Presenting and reporting
• Preparing and presenting the information resulting from the analysis phase
• Determining the relevance of the information, its reliability, and who can testify to it
• Interpreting the statistics from the analysis phase
• Clarifying the evidence and documenting the findings
• Summarizing and providing explanations of conclusions
• Presenting the physical and digital evidence to a court or to corporate management
• Attempting to confirm each piece of evidence and each event in the chain against the others, independently
• Proving the validity of the hypothesis and defending it against criticism and challenge
• Communicating relevant findings to a variety of audiences (management, technical personnel, law enforcement)
Phase: Disseminating and documenting
• Ensuring physical and digital property is returned to its proper owner
• Determining how and what criminal evidence must be removed
• Reviewing the investigation to identify areas of improvement
• Disseminating the information from the investigation
• Closing out the investigation and preserving the knowledge gained
5.0 Conclusion
Digital forensics is a scientific process that facilitates the detection of illegal activities and inappropriate behaviors using scientifically proven tools, techniques, and investigation frameworks. Existing practices in digital forensics are not scalable or efficient enough to handle the advanced, modern attacks that exploit the emerging services resulting from advances in information and communication technology. This research proposed a proactive approach to the network forensic investigation process that addresses evidence collection and evidence analysis in a real-time, multiple-sensor environment. The aim is to discover complete relationships, with optimal performance, among known and unseen/new alerts generated by multiple network sensors in order to improve alert quality and recognize attack strategies.
For future work, a prototype will be developed to prove the effectiveness of the proposed framework. Various issues will be addressed in the implementation of the new process: the ability to collect and preserve alerts online, predict an attack strategy, optimize the proactive component through filtering false negatives and prioritizing intrusions, predict the attacker's next course of action, and provide feedback proactively.
REFERENCES
1. Healy, L. (2008). Increasing the likelihood of admissible electronic evidence: Digital log handling excellence and a forensically aware corporate culture.
2. PandaLabs (2010). Annual Report: Panda Security's Anti-Malware Laboratory 2009. Panda Security.
3. Will, J. P. (2011). Cyber X: Criminal syndicates, nation states, subnational entities, and beyond. In Cybercrime and Espionage (pp. 115-133). Syngress, Boston. ISBN 9781597496131.
4. Garfinkel, S. (2007). Anti-forensics: Techniques, detection and countermeasures. In 2nd International Conference on i-Warfare and Security, p. 77.
5. Alharbi, S. et al. (2011). The proactive and reactive digital forensics investigation process. International Journal of Security and Its Applications, 5(4).
6. Orebaugh, A. (2006). Proactive forensics. Journal of Digital Forensic Practice, 1, p. 37.
7. Palmer, G. (2001a). A Road Map for Digital Forensic Research. Report from the First Digital Forensic Research Workshop (DFRWS), Utica, New York.
8. Palmer, G. (2001b). A Road Map for Digital Forensic Research. DFRWS Technical Report, Utica, New York.
9. Mukkamala, S. S. (2003). Identifying significant features for network forensic analysis using artificial intelligent techniques. International Journal of Digital Evidence, 1-10.
10. Nikkel, B. (2005). Generalizing sources of live network evidence. Digital Investigation (The International Journal of Digital Forensics & Incident Response), 193-200.
11. Ren, W. (2004). On a network forensics model for information security. In Proceedings of the Third International Conference on Information Systems Technology and its Applications (ISTA 2004), 29-34.
12. Wei, R. et al. (2005). Modeling the network forensics behaviors. In Workshop of the 1st International Conference on Security and Privacy for Emerging Areas in Communication Networks.
13. Siti Rahayu Selamat et al. (2008). Mapping process of digital forensic investigation framework. IJCSNS International Journal of Computer Science and Network Security, 8(10), 163-169.
14. Siebert, E. (2010). The Case for Security Information and Event Management (SIEM) in Proactive Network Defense. SolarWinds.
15. Khurana, H. et al. (2009). A framework for collaborative incident response and investigation. In Proceedings of the Eighth Symposium on Identity and Trust on the Internet, Maryland, 38-51.
16. Reith, M. et al. (2002). An examination of digital forensic models. International Journal of Digital Evidence, 1(3), p. 12.
17. Mohammad Rasmi, Aman Jantan and Hani Al-Mimi (2013). A new approach for resolving cyber crime in network forensics based on generic process model. ICIT 2013: The 6th International Conference on Information Technology.
18. Baryamureeba, V. and Tushabe, F. (2006). The enhanced digital investigation process model. Asian Journal of Information Technology, 5, 790-794.
19. Beebe, N. et al. (2005). A hierarchical, objectives-based framework for the digital investigations process. International Journal of Digital Investigation, 2(2), 147-167.
20. Carrier, B. et al. (2003). Getting physical with the digital investigation process. International Journal of Digital Evidence, 2(2), p. 20.
21. Ciardhuáin, S. (2003). An extended model of cybercrime investigations. International Journal of Digital Evidence, 3(1), 1-22.
22. Freiling, F. et al. (2007). A common process model for incident response and computer forensics. IT Incident Management and IT Forensics, Germany.
23. Grobler, C. C. et al. (2010). A multi-component view of digital forensics. In Availability, Reliability, and Security, ARES '10 International Conference.
24. Khatir, M. S. et al. (2008). Two-dimensional evidence reliability amplification process model for digital forensics. In Digital Forensics and Incident Analysis, WDFIA '08, Third International Annual Workshop.
25. Kohn, M. et al. (2006). Framework for a digital forensic investigation. Information Security South Africa (ISSA): Insight to Foresight, South Africa.
26. Louwrens, C. and von Solms, S. (2010). A multi-component view of digital forensics. IEEE Xplore, 647-652.
27. Mandia, K. et al. (2003). Incident Response and Computer Forensics. McGraw-Hill/Osborne, 507 pp.
28. Pilli, E. R. et al. (2010). Network forensic frameworks: Survey and research challenges. Digital Investigation, Elsevier.
29. Ricci, S. C. I. (2006). Digital forensics investigation framework that incorporates legal issues. Digital Investigation, 29-36.
30. Rogers, M. et al. (2006). Computer forensics field triage process model. Journal of Digital Forensics, Security and Law, 1(2), 9-37.
31. Stephenson, P. (2003). A comprehensive approach to digital incident investigation. Information Security Technical Report, E.A. Technology (Ed.), 42-54.
32. Yong-Dal, S. (2008). New digital forensics investigation procedure model. In Networked Computing and Advanced Information Management, NCM '08.
33. Noora Al Khater and Richard E. Overill (2015). Forensic network traffic analysis. In Proceedings of the Second International Conference on Digital Security and Forensics, Cape Town, South Africa.
34. Satpathy, S., Pradhan, S. K. and Ray, B. B. (2010). A digital investigation tool based on data fusion in management of cyber security systems. International Journal of Information Technology and Knowledge Management.
35. Maheyzah Md Siraj, Hashim Hussein Taha Albasheer and Mazura Mat Din (2015). Towards predictive real-time multi-sensors intrusion alert correlation framework. Indian Journal of Science and Technology, 8(12).