Abstract: Since 2004, the DETER Cybersecurity Testbed Project has worked to create the necessary infrastructure – facilities, tools, and processes – to provide a national resource for experimentation in cyber security. The next generation of DETER envisions several conceptual advances in testbed design and experimental research methodology, targeting improved experimental validity, enhanced usability, and increased size, complexity, and diversity of experiments. This paper outlines the DETER project's status and R&D directions.
For more information, visit: http://www.deter-project.org
Current Developments in DETER Cybersecurity Testbed Technology (DETER-Project)
A refereed paper in Proceedings of the Cybersecurity Applications & Technology Conference for Homeland Security (CATCH 2009). Abstract: "From its inception in 2004, the DETER testbed facility has provided effective, dedicated experimental resources and expertise to a broad range of academic, industrial and government researchers. Now, building on knowledge gained, the DETER developers and community are moving beyond the classic “testbed” model and towards the creation and deployment of fundamentally transformational cybersecurity research methodologies. This paper discusses underlying rationale, together with initial design and implementation, of key technical concepts that drive these transformations." Authors: Terry Benzel, Bob Braden, Ted Faber, Jelena Mirkovic, Steve Schwab, Karen Sollins, and John Wroclawski.
A genetic algorithm based elucidation for improving intrusion detectio... (Alexander Decker)
This document summarizes a research paper that proposes using a genetic algorithm to improve intrusion detection. The paper aims to reduce features from the KDD Cup 99 dataset and generate a rule set using genetic algorithms to detect intrusions with a condensed feature set. The genetic algorithm is used to evolve rules from the reduced training data, with a fitness function evaluating rule quality. Experiments and evaluations are conducted on the KDD Cup 99 dataset to test the proposed method.
A genetic algorithm based elucidation for improving intrusion detection th... (Alexander Decker)
This document summarizes a research paper that proposes using a genetic algorithm to improve intrusion detection. The paper aims to reduce features from the KDD Cup 99 dataset and generate a rule set using genetic algorithms to detect intrusions. The genetic algorithm evolves rules over generations to maximize fitness. Experiments show this approach can improve detection rates and reduce false alarms compared to existing intrusion detection systems.
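Neither summary shows the papers' actual rule encoding or GA parameters, so the following Python sketch is only a generic illustration of the technique they describe: evolve threshold rules over a reduced feature set, scoring each rule with a fitness function that rewards detections and penalizes false alarms. The feature names, fitness weights, and GA settings below are assumptions, not values from the papers.

```python
import random

# Hypothetical reduced feature set (illustrative; not the papers' selection).
FEATURES = ["duration", "src_bytes", "dst_bytes", "count"]

def random_rule():
    # A rule is one threshold per feature; a record is flagged as an attack
    # when every feature value exceeds its threshold.
    return {f: random.uniform(0, 1) for f in FEATURES}

def matches(rule, record):
    return all(record[f] > rule[f] for f in FEATURES)

def fitness(rule, records):
    # Reward true positives, penalize false positives (weights are assumptions).
    tp = sum(1 for r in records if r["attack"] and matches(rule, r))
    fp = sum(1 for r in records if not r["attack"] and matches(rule, r))
    return tp - 2 * fp

def evolve(records, pop_size=50, generations=100, mutation_rate=0.1):
    population = [random_rule() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda rule: fitness(rule, records), reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            # Uniform crossover, then per-gene mutation.
            child = {f: random.choice((a[f], b[f])) for f in FEATURES}
            for f in FEATURES:
                if random.random() < mutation_rate:
                    child[f] = random.uniform(0, 1)
            children.append(child)
        population = survivors + children
    return max(population, key=lambda rule: fitness(rule, records))
```

A record here is a dict of normalized feature values plus an `attack` label, standing in for a preprocessed KDD Cup 99 row.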
This document presents a proposed hybrid intrusion detection system that combines k-means clustering, k-nearest neighbor classification, and decision table majority rule-based approaches. The system is evaluated on the KDD-99 dataset to detect intrusions and classify them into four categories: R2L, DoS, Probe, and U2R. The goal is to decrease the false alarm rate and increase accuracy and detection rate compared to existing intrusion detection systems. The proposed system applies k-means clustering first, then k-nearest neighbor classification, and finally decision table majority rules. Results show the proposed hybrid approach improves performance metrics compared to existing techniques.
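The summary names the three stages but not how they hand off to one another. One plausible wiring, sketched below with scikit-learn (a library choice the paper does not specify), clusters records with k-means, feeds the cluster id to a k-NN classifier as an extra feature, and approximates the decision-table-majority stage as a per-cluster majority vote used when k-NN is unconfident. The confidence threshold is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

def train_hybrid(X_train, y_train, n_clusters=4, n_neighbors=5):
    # y_train: integer class labels as a NumPy array; assumes every cluster
    # ends up non-empty, which k-means almost always guarantees in practice.
    # Stage 1: unsupervised grouping of the training records.
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X_train)
    # Stage 2: k-NN trained on the original features plus the cluster id.
    X_aug = np.column_stack([X_train, km.labels_])
    knn = KNeighborsClassifier(n_neighbors=n_neighbors).fit(X_aug, y_train)
    # Stage 3 (simplified "decision table majority"): the majority class
    # observed in each cluster, used as a fallback rule.
    majority = {c: np.bincount(y_train[km.labels_ == c]).argmax()
                for c in range(n_clusters)}
    return km, knn, majority

def predict_hybrid(km, knn, majority, X):
    clusters = km.predict(X)
    X_aug = np.column_stack([X, clusters])
    proba = knn.predict_proba(X_aug)
    preds = knn.predict(X_aug)
    # Where k-NN is not confident, defer to the per-cluster majority rule.
    unsure = proba.max(axis=1) < 0.6  # confidence threshold is an assumption
    preds[unsure] = [majority[c] for c in clusters[unsure]]
    return preds
```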
The document discusses Cisco's approach to endpoint security which goes beyond traditional prevention and point-in-time detection. It involves continuous monitoring, detection and response across the full attack continuum before, during and after an attack. This continuous approach provides visibility into compromise and persistence unlike traditional methods. It allows security teams to quickly contain and remediate infections without disrupting users. The approach weaves together file, process and communication streams over time to capture relationships and detect behavioral indicators of compromise in real-time.
The document provides information about the Center for Information Assurance and Cybersecurity (CIAC) at the University of Washington. It introduces Barbara Endicott-Popovsky as the director of CIAC and describes her background. It then summarizes some of CIAC's activities, including its multi-disciplinary approach to information assurance education and research through collaboration with various university departments and external partners like Pacific Northwest National Laboratory.
This document discusses the potential threat of a "Superworm", a theoretical worm that could incorporate successful propagation techniques from past worms to spread rapidly and cause widespread damage. It describes the features such a worm may have, including exploiting multiple vulnerabilities across many operating systems and using various proliferation methods. The document also examines a past university network security incident and two security technologies that could help detect and limit the spread of such a worm: an early worm detection system and a modified reverse proxy server.
Wireless Sensor Network Nodes: Security and Deployment in the Niger-Delta Oil... (IJNSA Journal)
Wireless sensor networks (WSNs) are moving toward becoming a complete solution spanning communication protocols, embedded systems, and low-power implementations. However, resource constraints, including limited communication range, energy, computing power, and bandwidth, together with the threat of intruders, have limited WSN applications. Since the lightweight computational nodes currently used in WSNs pose a particular challenge for many security applications, this research investigates new security techniques and appropriate implementations for WSN nodes, including trade-offs such as implementation complexity, power dissipation, security flexibility, and scalability. The goal is to develop a network with a key distribution scheme that is efficient and flexible, secure enough to prevent algorithmic-complexity and denial-of-service attacks, and able to conserve energy. A review of previous research on WSN security was carried out, and proposals are made for security schemes that gather data through an energy-efficient mechanism using secure pre-allocation of keys, a faster clustering routing algorithm, and dynamic rekeying.
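The abstract does not spell out its key pre-allocation scheme; the sketch below shows the classic random key pre-distribution idea that such WSN proposals commonly build on: each node is loaded with a random subset of a global key pool before deployment, and two neighbors can secure a link if their subsets intersect. Pool and ring sizes are illustrative only.

```python
import random

POOL_SIZE = 1000   # size of the global key pool (illustrative)
RING_SIZE = 50     # keys pre-loaded into each node (illustrative)

# The key pool would normally hold real symmetric keys; ids stand in here.
key_pool = list(range(POOL_SIZE))

def provision_node():
    # Before deployment, each node receives a random key ring from the pool.
    return set(random.sample(key_pool, RING_SIZE))

def shared_key(node_a, node_b):
    # Two neighbors can establish a secure link iff their rings intersect;
    # otherwise they must route over a path of already-secured links.
    common = node_a & node_b
    return min(common) if common else None

nodes = [provision_node() for _ in range(20)]
links = sum(1 for i in range(len(nodes)) for j in range(i + 1, len(nodes))
            if shared_key(nodes[i], nodes[j]) is not None)
print(f"directly securable links: {links} of {20 * 19 // 2}")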
The document discusses several testbeds and frameworks for evaluating intrusion detection systems (IDS), including the Air Force Evaluation Environment, LARIAT, and TIDeS testbeds. The TIDeS framework allows for customized testing scenarios, automated evaluations, and uses fuzzy logic to evaluate IDS performance based on metrics like detection depth, breadth, and false alarms. It generates realistic network profiles and traffic and can test IDSs under different environments.
This document discusses using wireless sensor networks for habitat monitoring. It presents requirements for a system to monitor seabird nesting environments and behaviors. The currently deployed network consists of 32 sensor nodes on an island off the coast of Maine that stream live data online. The application-driven design helps identify important areas for further work like data sampling, communications, network tasks, and health monitoring.
Top 10 intrusion detection system papers (IJNSA Journal)
The document summarizes 10 papers related to intrusion detection systems using genetic algorithms. The first paper discussed implements an intrusion detection system using a genetic algorithm to efficiently detect various types of network intrusions. It applies genetic algorithm parameters and evolution processes to filter traffic data and reduce complexity. It obtained reasonable detection rates when tested on the KDD99 benchmark dataset. The second paper combines naive Bayes and decision tree classifiers for adaptive intrusion detection. It aims to balance detections, reduce false positives, eliminate redundant attributes, and address data mining challenges. When tested on KDD99, it achieved high detection rates and significantly reduced false positives with limited resources.
27 5 jun17 28apr 15859 ammar final (edti ari baru) (IAESIJEECS)
The transition from analog to digital technologies has increased the ever-growing concern for the protection and authentication of digital content and data. Owners of digital content of any type are seeking and exploring new technologies for the protection of copyrighted multimedia content. Multimedia protection has become a pressing issue in recent years, and researchers are continuously searching for effective and efficient technologies to address it. This thesis study aims to increase the invisibility and durability of invisible watermarking by using the multilayer Discrete Wavelet Transform (DWT) in the frequency plane and embedding two marks into an image, for the purposes of authentication and copyright protection when digital content travels through an unsecured channel. A novel watermarking algorithm is proposed, based on five active positions and the use of two marks. In addition to the extraction process, the watermarked images are subjected to a set of attack tests. The evaluation assesses SNR, PSNR, MAE, and RMSE for the watermarked images both before and after the attacks, and measures the invisibility of the watermark before and after the attacks. Lab results show high robustness and high image quality, with high values for both SNR and PSNR.
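The abstract reports SNR, PSNR, MAE, and RMSE without giving formulas; the standard definitions, computed with NumPy below, are presumably what is meant. The 8-bit peak value of 255 is an assumption about the image format.

```python
import numpy as np

def quality_metrics(original, watermarked, peak=255.0):
    """Standard image-quality metrics between two same-shaped images."""
    x = original.astype(np.float64)
    y = watermarked.astype(np.float64)
    err = x - y
    mse = np.mean(err ** 2)
    mae = np.mean(np.abs(err))   # mean absolute error
    rmse = np.sqrt(mse)          # root mean squared error
    # PSNR relates the peak signal value to the error power;
    # SNR relates the actual signal power to the error power.
    psnr = 10 * np.log10(peak ** 2 / mse) if mse else float("inf")
    snr = 10 * np.log10(np.sum(x ** 2) / np.sum(err ** 2)) if mse else float("inf")
    return {"SNR": snr, "PSNR": psnr, "MAE": mae, "RMSE": rmse}
```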
Abstract: With the heightening reliance on information technology in recent times, it has become ever more important to find measures to secure online devices, data, and information. A Network Intrusion Detection System (NIDS) is one security option to help protect such devices, data, and information. However, an IDS needs to be kept up to date to mitigate current threats. A critical issue in developing the right IDS is the scarcity of current data sets for training and the resulting impact on system performance. This paper presents the On-demand Network Data Set Creation Application (ONDaSCA), graphical user interface software capable of generating labelled network intrusion data sets. ONDaSCA lets IDS users and researchers choose a raw data set and process it into output, supports both real-time packet capture and offline upload of an existing PCAP file, and offers two different packet-capturing methods (Tshark and Dumpcap). ONDaSCA is highly customisable, and IDS users and researchers can leverage its capabilities to suit their needs. The software's abilities are compared with other products that generate data sets for IDS models.
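ONDaSCA's internals are not shown here, but its two capture back-ends are standard Wireshark tools; as a rough illustration, the sketch below shells out to dumpcap or tshark to record a timed capture to a PCAP file. The interface name, duration, and file name are placeholders.

```python
import subprocess

def capture(tool, interface="eth0", seconds=60, out="capture.pcap"):
    """Record live traffic with either dumpcap or tshark (must be on PATH)."""
    if tool not in ("dumpcap", "tshark"):
        raise ValueError("tool must be 'dumpcap' or 'tshark'")
    cmd = [tool,
           "-i", interface,              # capture interface
           "-a", f"duration:{seconds}",  # autostop after N seconds
           "-w", out]                    # write raw packets to a pcap file
    subprocess.run(cmd, check=True)
    return out

# e.g. capture("dumpcap", interface="eth0", seconds=30, out="sample.pcap")
```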
International Journal of Computer Science and Information Security (IJCSIS), ISSN 1947-5500, Pittsburgh, PA, USA
Email: ijcsiseditor@gmail.com
http://sites.google.com/site/ijcsis/
https://google.academia.edu/JournalofComputerScience
https://www.linkedin.com/in/ijcsis-research-publications-8b916516/
http://www.researcherid.com/rid/E-1319-2016
The document discusses the need for calibration standards for network devices used to collect forensic data on networks. Currently there are no standards, and network evidence is admitted in court without calibration of the collecting devices. This could lead to unreliable evidence and legal challenges. The document proposes developing network device calibration standards based on current network testing protocols to validate the reliability of network forensic data collection and support the admissibility of network evidence in court.
An Overview of Information Systems Security Measures in Zimbabwean Small and ... (researchinventy)
This paper reports on the information systems (IS) security measures implemented by small and medium-sized enterprises (SMEs) in Zimbabwe. A survey questionnaire was distributed to 32 randomly selected participants to investigate the security measures and practices in their respective organisations. The results indicated that over 50% of respondents had installed firewalls and more than 80% carried out regular software updates, while none had intrusion detection systems. The researchers recommended that SMEs enhance their knowledge of the different IS threats so that they can implement preventive measures.
This document presents a proposed model for an intrusion detection system using data mining techniques. The proposed model combines clustering and classification methods. Specifically, it uses k-means clustering to group data and then applies naive Bayes classification. This is intended to improve performance over existing IDS systems by leveraging data mining concepts. The proposed model is described as enhancing efficiency by reducing false alarms and missed detections compared to prior work.
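The model description suggests a two-stage pipeline; a minimal sketch, again assuming scikit-learn, clusters the training data first and then fits one naive Bayes model per cluster, so each classifier sees more homogeneous traffic. This is one plausible reading of "k-means then naive Bayes", not necessarily the paper's exact design.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB

def fit(X, y, k=5):
    # Stage 1: partition the training records into k groups.
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    # Stage 2: one naive Bayes model per cluster (assumes integer labels
    # in y and that no cluster comes back empty).
    models = {c: GaussianNB().fit(X[km.labels_ == c], y[km.labels_ == c])
              for c in range(k)}
    return km, models

def predict(km, models, X):
    # Route each test record to its cluster's classifier.
    clusters = km.predict(X)
    out = np.empty(len(X), dtype=int)
    for c in set(clusters):
        idx = clusters == c
        out[idx] = models[c].predict(X[idx])
    return out
```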
Presentation from the 2013 Bio-IT World conference. It describes the design and implementation of data and compute infrastructure for the New York Genome Center.
Network infrastructures play an important part in most daily communications for business, social networking, and government sectors. Despite the advantages these functionalities bring, security threats have become a daily struggle, and one major threat is hacking. Consequently, security experts and researchers have suggested possible security solutions such as firewalls, Intrusion Detection Systems (IDS), Intrusion Detection and Prevention Systems (IDP), and honeynets. Yet none of these solutions has proven able to completely address hacking, in part because little research examines the behavior of hackers. This paper formally and practically examines in detail the behavior of hackers and their targeted environments. Moreover, it formally examines the properties of one essential pre-hacking step, scanning, and highlights its importance in developing hacking strategies (a minimal illustration follows below). It also describes the properties of hacking common to most hacking strategies, to assist security experts and researchers in minimizing the risk of being hacked.
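To make the scanning step concrete, here is a minimal TCP connect scan in Python, suitable only for hosts you are authorized to test; the host and port list are placeholders, and real scanners add timing, randomization, and service fingerprinting.

```python
import socket

def connect_scan(host, ports, timeout=1.0):
    """Report which TCP ports accept a full connect() on an authorized host."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception.
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# e.g. connect_scan("127.0.0.1", [22, 80, 443, 8080])
```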
This document provides an overview of intrusion detection systems (IDS), including their challenges, potential solutions, and future developments. It discusses how IDS aim to detect attacks against computer systems and networks. The challenges of high false alarm rates and dependency on the environment are outlined. Potential solutions explored include data mining, machine learning, and co-simulation mechanisms. Alarm correlation techniques are examined as ways to combine fragmented alert information to better interpret attack flows. Artificial intelligence is seen as important for improving IDS flexibility, adaptability, and pattern recognition.
Finding Diversity In Remote Code Injection Exploits (amiable_indian)
1. The document analyzes the diversity among remote code injection exploits by collecting exploit samples from network traces, extracting and emulating shellcodes, and clustering the shellcodes based on an edit distance metric.
2. It finds that exploits can be grouped into families based on the vulnerability targeted. The LSASS and ISystemActivator exploit families show subtle variations among related exploits, while RemoteActivation exploits exhibit more diversity.
3. Analyzing exploit phylogenies reveals code sharing among families and subtle variations within families, providing insights into the emergence of polymorphism in malware payloads.
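The clustering step rests on an edit distance between shellcode byte strings. A plain Levenshtein implementation with a naive single-link grouping, sketched below, conveys the idea; the paper's exact metric and clustering algorithm may differ, and the distance threshold is an assumption.

```python
def levenshtein(a: bytes, b: bytes) -> int:
    """Classic dynamic-programming edit distance between two byte strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def group_by_distance(samples, threshold=20):
    """Greedy single-link grouping: join a family if close to any member."""
    families = []
    for s in samples:
        for fam in families:
            if any(levenshtein(s, member) <= threshold for member in fam):
                fam.append(s)
                break
        else:
            families.append([s])
    return families
```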
Analyzing and implementing of network penetration testing (Engr Md Yusuf Miah)
The primary objective of a network penetration test analysis is to identify exploitable vulnerabilities in applications before hackers are able to discover and exploit them. Network penetration testing reveals real-world opportunities for hackers to compromise applications in ways that allow unauthorized access to sensitive data, or even to take over systems for malicious or non-business purposes.
A Proactive Approach in Network Forensic Investigation Process (Editor IJCATR)
Information Assurance and Security (IAS) is a crucial component in the corporate environment, ensuring that the secrecy of sensitive data is protected, the integrity of important data is not violated, and the availability of critical systems is guaranteed. The advance of information and communication technology into new domains such as mobility and the Internet of Things, its ever-growing user base, and sophisticated cyber-attacks force organizations to deploy automated and robust defense mechanisms that manage the resulting digital security incidents in real time. Digital forensics is a scientific process that facilitates the detection of illegal activities and inappropriate behavior using scientific tools, techniques, and investigation frameworks. This research aims to identify processes that facilitate and improve the digital forensic investigation process. Existing digital forensic frameworks are reviewed and the analysis compiled to derive a network forensic investigation framework that includes evidence collection, preservation, and analysis at the sensor level and in real time. It aims to discover complete relationships, with optimal performance, among known and unseen alerts generated by multiple network sensors, in order to improve alert quality and recognize the attack strategy.
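The framework's core step, relating alerts from multiple sensors in real time, can be illustrated with a simple time-window correlator. Grouping by destination and coarse time bucket is one plausible correlation key, not necessarily the one the framework derives.

```python
from collections import defaultdict

def correlate(alerts, window=60):
    """Group sensor alerts hitting the same destination within `window` seconds.

    Each alert is a dict like:
    {"ts": 1710000000, "sensor": "ids-1", "src": "10.0.0.5",
     "dst": "10.0.0.9", "sig": "port-scan"}
    """
    groups = defaultdict(list)
    for a in sorted(alerts, key=lambda a: a["ts"]):
        key = (a["dst"], a["ts"] // window)  # coarse time bucket per target
        groups[key].append(a)
    # Keep only buckets where more than one sensor fired: such cross-sensor
    # agreement is a candidate indicator of a multi-step attack strategy.
    return {k: v for k, v in groups.items()
            if len({a["sensor"] for a in v}) > 1}
```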
The DETER Project: Towards Structural Advances in Experimental Cybersecurity ... (DETER-Project)
Abstract: It is widely argued that today's largely reactive, "respond and patch" approach to securing cyber systems must yield to a new, more rigorous, more proactive methodology. Achieving this transformation is a difficult challenge. Building on insights into requirements for cyber science and on experience gained through 8 years of operation, the DETER project is addressing one facet of this problem: the development of transformative advances in methodology and facilities for experimental cybersecurity research and system evaluation. These advances in experiment design and research methodology are yielding progressive improvements not only in experiment scale, complexity, diversity, and repeatability, but also in the ability of researchers to leverage prior experimental efforts of others within the community. We describe in this paper the trajectory of the DETER project towards a new experimental science and a transformed facility for cyber-security research development and evaluation.
For more information, visit: http://www.deter-project.org
The Science of Cyber Security Experimentation: The DETER Project (DETER-Project)
Terry Benzel delivered the keynote address at the 11th Annual Computer Security Applications Conference (ACSAC); this document is the invited paper on which her keynote was based.
Abstract: Since 2004, the DETER Cyber-security Project has worked to create an evolving infrastructure – facilities, tools, and processes – to provide a national resource for experimentation in cyber security. Building on our insights into requirements for cyber science and on lessons learned through 8 years of operation, we have made several transformative advances towards creating the next generation of DeterLab. These advances in experiment design and research methodology are yielding progressive improvements not only in experiment scale, complexity, diversity, and repeatability, but also in the ability of researchers to leverage prior experimental efforts of other researchers in the DeterLab user community. This paper describes the advances resulting in a new experimentation science and a transformed facility for cyber-security research development and evaluation.
Further described: http://www.deter-project.org/blog/deter_-_keynote_address_acsac_key_new_web_site
For additional information, visit:
- http://www.deter-project.org
- http://info.deterlab.net
Use of network forensic mechanisms to formulate network security (IJMIT JOURNAL)
Network forensics is a fairly new area of research, applied after an intrusion in organizations ranging from small and mid-size private companies to government corporations and a country's defence secretariat. During an investigation, valuable information may be mishandled, leading to difficulties in the examination and wasted time. Additionally, the intruder may obliterate tracks such as the point of entry, the vulnerabilities used in the entry, the destruction caused, and, most importantly, the intruder's identity. The aim of this research was to map the correlation between network security and network forensic mechanisms. Three sub-research questions were studied, identifying network security issues, the network forensic investigations used in an incident, and the use of network forensic mechanisms to eliminate network security issues. Literature review was the research strategy used to study these questions; literature such as research papers published in journals, PhD theses, ISO standards, and other official research papers was evaluated and formed the basis of this research. The output of this research was a report on how network forensics has assisted in aligning network security in case of an intrusion. The research is not specific to one organization but gives a general overview of the industry. An embedded digital forensics framework, a network forensic development life cycle, and an enhanced network forensic cycle could be used to develop a secure network. Through these frameworks and cycles, the author recommends implementing the 4R strategy (Resistance, Recognition, Recovery, Redress) with the assistance of a number of tools. This research would be of interest to network administrators, network managers, network security personnel, and others interested in securing communication devices and infrastructure. It provides a framework that an organization can use to eliminate digital anomalies through network forensics, helps the above-mentioned persons prepare infrastructure readiness for threats, and enables further research in the fields of computer, database, mobile, video, and audio forensics.
Testimony of Terry V. Benzel, University of Southern California Information S... (DETER-Project)
1. The witness discusses the importance of expanding the scope of cybersecurity research to address evolving threats. Narrowly focused research is not enough to keep up with adversaries who plan attacks across systems.
2. Infrastructure is needed to enable experimental cybersecurity research and testing at scale. This allows innovations to be proven in realistic settings and better prepared for technology transfer.
3. New models are needed to successfully transfer university research into commercial products. Not all innovations work as expected outside controlled labs.
Table of Content - International Journal of Managing Information Technology (... (IJMIT JOURNAL)
The International Journal of Managing Information Technology (IJMIT) is a quarterly open access peer-reviewed journal that publishes articles that contribute new results in all areas of the strategic application of information technology (IT) in organizations. The journal focuses on innovative ideas and best practices in using IT to advance organizations – for-profit, non-profit, and governmental. The goal of this journal is to bring together researchers and practitioners from academia, government and industry to focus on understanding both how to use IT to support the strategy and goals of the organization and to employ IT in new ways to foster greater collaboration, communication, and information sharing both within the organization and with its stakeholders. The International Journal of Managing Information Technology seeks to establish new collaborations, new best practices, and new theories in these areas.
An organized and Secured Local Area Network in Naval Post Graduate School (Jude Rainer)
This document presents a case study on establishing an organized and secure local area network for the laboratories at the Naval Postgraduate School. It discusses the current unsecured state of the network and identifies problems like unorganized user accounts, unsafe external device connections, and indirect instructor-student communication. The study aims to implement solutions like active directory for user organization, a proxy server to block unnecessary websites, disabling external ports, a local mail server for communication, and a firewall for security screening. The overall goal is to develop a fully secured network that protects hardware, software and information resources.
Open Source and Cyber Security: Open Source Software's Role in Government Cyb... (Great Wide Open)
This document summarizes the cybersecurity research agenda of the U.S. Department of Homeland Security Science and Technology Directorate. It discusses how DHS is focusing on areas like critical infrastructure security, open source software, cyber-physical systems, and new technology programs. The research aims to drive innovation in cybersecurity solutions through collaboration with academia, industry and open source communities to address evolving threats and transition technologies for real-world use.
AN EFFICIENT SEMANTIC DATA ALIGNMENT BASED FCM TO INFER USER SEARCH GOALS USI... (pharmaindexing)
This document discusses a two-way chained packet marking technique for secure communication in wireless sensor networks. It aims to provide a scheme for detecting attacks by creating a bidirectional link between packets. Any packets found without this link information will be eliminated at network boundaries, improving security. Neighboring packets will be marked to form a chain of legitimate messages, preserving originality and mitigating attacks like jamming.
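The summary gives the idea (each packet carries link information tying it to its neighbors) but not the construction. One straightforward realization, sketched below, marks each packet with an HMAC over its payload and the previous packet's mark, so a receiver can drop any packet whose chain link fails to verify; the shared key and mark layout are assumptions, not the paper's scheme.

```python
import hashlib
import hmac

KEY = b"shared-network-key"  # placeholder; real deployments derive per-link keys

def mark(payload: bytes, prev_mark: bytes) -> bytes:
    # Each mark chains the packet to its predecessor, like a hash chain.
    return hmac.new(KEY, prev_mark + payload, hashlib.sha256).digest()

def send_stream(payloads):
    prev = b"\x00" * 32  # genesis mark
    for p in payloads:
        m = mark(p, prev)
        yield p, m
        prev = m

def verify_stream(packets):
    prev = b"\x00" * 32
    for payload, m in packets:
        if not hmac.compare_digest(m, mark(payload, prev)):
            return False  # broken link: eliminate at the network boundary
        prev = m
    return True
```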
Network security is one of the foremost concerns of the modern era, owing to the rapid development and widespread use of diverse technologies over the past decade; the security vulnerabilities of these technologies have become a major challenge. Over previous years, numerous studies have been conducted on intrusion detection systems, which are used to identify unauthorized access and unusual attacks on secured networks. Different approaches are used to implement intrusion detection systems, and machine learning is one of them. This review was conducted to assess the current state of machine learning techniques applied to intrusion detection anomalies in Internet of Things (IoT) based big data. In total, 55 papers published between 2010 and 2021 are summarized, centering on single, hybrid, and collaborative classifier designs. The review also covers basic background on IoT, big data, and machine learning approaches.
CIS505008VA0 - COMMUNICATION TECHNOLOGIES: Week 2 Assignment 1 Su.docx (sleeperharwell)
CIS 505 - Communication Technologies
Week 2 Assignment 1 Submission: Wireless Technology
04/15/2019
1. Select one of the working groups in the IETF or IEEE and briefly summarize what this group is working on.

I will select the IETF, the Internet Engineering Task Force. The mission of the IETF is to make the Internet work better by producing high-quality technical documents that influence the way people design and manage the Internet. Furthermore, it develops the Internet standards that make up the Transmission Control Protocol and the Internet Protocol (Schuch & Pritham, 2014). Web security involves protecting information by preventing, detecting, and responding to new attacks; policy communication mechanisms and security techniques are sprinkled through various layers of the web and HTTP. In that case, the role of this working group is to compose an overall document derived from surveying the issues outlined.

2. Justify the need for the IEEE 802 standard used in networking.

IEEE 802 refers to the Institute of Electrical and Electronics Engineers standards that define the standards and protocols for wired local area networks, metropolitan area networks, and wireless networks. Furthermore, it defines the characteristics, protocols, and services of networks that carry packets of different sizes. The IEEE 802 standard promotes appliance interoperability, development, price reduction, and stable future migration (Hinden, 2015). The rationale for the 802 network standard is to facilitate intercommunication among equipment from different manufacturers; in that sense it can be termed the fundamental coordinating standard.

IEEE 802 also defines network standards for physical components, comprising network interface cards, WAN components, and cabling, and will continue to advance with technology.

3. Evaluate the three standards organizations, IEEE, ISO, and ANSI, to determine the most important for communication technology.

IEEE is a nonprofit professional organization founded by engineers for the purpose of consolidating ideas dealing with electrotechnology. It plays a key role in publishing technical works and developing standards.

ISO, the International Standards Organization, is a non-governmental organization that plays a key role in the manufacturing sector. It helps ensure high-quality products that are both safe and reliable, because of the value of its standards, company requirements, and the sales and marketing advantage they confer.

ANSI, the American National Standards Institute, is the organization responsible for administering and coordinating the U.S. voluntary standardization system.
Research Paper TopicITS835 – Enterprise Risk Managemen.docxaudeleypearl
Research Paper Topic
ITS835 – Enterprise Risk Management
Dr. Jerry Alsay
University of the Cumberlands
Introduction
All research reports begin with an introduction. (1 – 2 Pages)
Background
Provide your reader with a broad base of understanding of the research topic. The goal is to give the reader an overview of the topic, and its context within the real world, research literature, and theory. (3 – 5 Pages)
Problem Statement
This section should clearly articulate how the study will relate to the current literature. This is done by describing findings from the research literature that define the gap. Should be very clear what the research problem is and why it should be solved. Provide a general/board problem and a specific problem (150 – 200 Words)
Literature Review
Using your annotated bibliography, construct a literature review. (3-5 pages)
Discussion
Provide a discussion about your specific topic findings. Using the literature, you found, how do you solve your problem? How does it affect your general/board problem?
References
Running Head: CLOUD COMPUTING AND DATA SECURITY1
Cloud Computing and Data Security
Naresh Rama
Professor Dr.Jerry Alsay
07/14/2019
Cloud Computing and Data Security
Introduction
In today's world, the movement of data is from a store that is severe and it is located centrally to the storage of cloud, services in the cloud offer the flexibility, scalability, and concerns that are proportionate that concerns the issue of security. Safety is an aspect that is important and it associated with the computing of cloud because information can be stored on the cloud by the users with the help of providers that works in the service of the cloud. In the security f data and computing of the cloud, there are some problems that are available. They include backups of data that is improper and inadequate that have caused organizations been among those that are vulnerable to threats that re-associated with security measures.
Data that is found in an organization and is stored in files that are encrypted are interfered by these threats. Problem found under these investigations is significant to this study and these show that the threats that emerge because of backups concerning data that is improper lead to an issue that is significant in the security of data in the computing cloud and also security concerning data.
The study tends to shows that security of data and computing of data leads to the provision of ways that helps in the protection of data that is private and also information that is classified away from such threats. That may include attacks in the cyber sector and losses that occur in case of disasters (Strategic Cyber Security, 2011). This study has limitations that state that assurance of security to the computing of cloud is not available and that there is no protection of data that is vital in an organization to a hundred percent.
Background
Hacke ...
1) The document discusses security issues in computer networks and proposes contemporary solutions. It covers topics like cryptography, secure data access, intrusion detection, and secure routing.
2) The literature review discusses previous research on wireless sensor network security including common attacks, requirements, and defenses. It also examines security issues that arise from the unique characteristics of wireless networks.
3) The document proposes that more research is still needed on topics like quantifying security costs and benefits, data integrity, survivability, and security for data-centric wireless sensor networks. A holistic security model is needed that integrates solutions at each network layer.
The United States National Institute of Standards and Technology (NIST) has p...Michael Hudak
The document defines cloud computing based on recommendations from the National Institute of Standards and Technology (NIST). It identifies five essential characteristics of cloud computing (on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service). It also outlines three service models (Software as a Service, Platform as a Service, and Infrastructure as a Service) and four deployment models (private cloud, community cloud, public cloud, and hybrid cloud). The purpose is to provide an informal definition to inform public debate on cloud computing.
«Определение понятия «облачные вычисления» (от National Institute of Standard...Victor Gridnev
The document defines cloud computing based on recommendations from the National Institute of Standards and Technology (NIST). It identifies five essential characteristics of cloud computing (on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service). It also outlines three service models (Software as a Service, Platform as a Service, and Infrastructure as a Service) and four deployment models (private cloud, community cloud, public cloud, and hybrid cloud). The purpose is to provide an informal definition to inform public debate on cloud computing.
Cloud Forensics: Drawbacks in Current Methodologies and Proposed SolutionIJERA Editor
The document summarizes research on cloud forensics and proposes a new framework. It discusses drawbacks in current cloud forensics methodologies, including an inability to collect evidence from all cloud subscribers and support future forensic objectives. The proposed framework aims to offer a forensic-friendly cloud platform that facilitates easy access to cloud evidence in a forensically-sound manner while minimizing non-essential data and maximizing reliability, relevance and authenticity of collected evidence.
Experimental analysis of intrusion detection systems using machine learning a...IJECEIAES
Since the invention of the internet for military and academic research purposes, it has evolved to meet the demands of the increasing number of users on the network, who have their scope beyond military and academics. As the scope of the network expanded maintaining its security became a matter of increasing importance. With various users and interconnections of more diversified networks, the internet needs to be maintained as securely as possible for the transmission of sensitive information to be one hundred per cent safe; several anomalies may intrude on private networks. Several research works have been released around network security and this research seeks to add to the already existing body of knowledge by expounding on these attacks, proffering efficient measures to detect network intrusions, and introducing an ensemble classifier: a combination of 3 different machine learning algorithms. An ensemble classifier is used for detecting remote to local (R2L) attacks, which showed the lowest level of accuracy when the network dataset is tested using single machine learning models but the ensemble classifier gives an overall efficiency of 99.8%.
Network forensics comes under the domain of digital forensics and deals with evidences left behind on the networkiafter a cyber-attack. It is indication of the weakness that led to the crime and the possible cause. Network focused research comes up with many challenges which involves the collection, storage, content, privacy, confiscation and the admissibility. It is important and critical for any network forensic researcher or the investigator to consider adopting efficient forensic network investigation framework or the methodologies in order to improve investigation process. The main aim of this research contribution was to do a comprehensive analysis of concepts of networks forensics through extensive investigation and by analyzing various methodologies and associated tools which should be used in the network forensic investigations. Detailed and in depth analysis of concepts of network forensic investigation on a designed/conceived network architecture was carried out which was then followed by analyzing various methodologies and tools employed. An innovative framework for the investigation was designed which can be used by any forensic expert. The acquired data was analyzed by using information, strategizing and collecting evidence and by analyzing and reporting of the methodologies on the conceptualized network. Consequently, it led to the researcher to adopt and utilize a powerful and efficient forensic network methodology that will ultimately help in improving the investigation process and providing required tools/techniques along with the requisite guidelines that will determine the approach, methods, and strategies which are to be used for networkiforensiciprocess to be followed and be executed with the use of relevant tools that will tend to help in the simplification and improvement of the forensics investigation process.
Digital Security by Design: ISCF Digital Security by Design Research Projects...KTN
KTN ran a collaborators' workshop on 26 September 2019 in London to explain more about the Digital Security by Design Challenge announced by the government.
The Digital Security by Design challenge has been recently announced by the Department for Business, Energy & Industrial Strategy (BEIS). This challenge, amounting to £70 million of government funding over 5 years, was delivered by UK Research and Innovation (UKRI) through the Industrial Strategy Challenge Fund (ISCF).
This Collaborators' Workshop provides an opportunity to hear more details of the challenge and forthcoming competitions.
A Scoping Workshop for this challenge was held on 30th May: http://ow.ly/oz6230pHlGl
Find out more about the Defence and Security Interest Group at https://ktn-uk.co.uk/interests/defence-security
Join the Defence and Security Interest Group at https://www.linkedin.com/groups/8584397 or Follow KTN_UK Defence group on Twitter https://twitter.com/KTNUK_Defence
by a dynamically reconfigurable, switched network. Resources
of the testbed are controlled and allocated to individual
experiments using an extended version of the Emulab [4]
testbed control software. Users receive exclusive, hardware-
level access to the number of machines they need, and may set
up network topologies, operating systems, and applications of
their choice.
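Experiments on Emulab-derived testbeds such as DETER are
conventionally described with ns-2 style topology scripts. As a
hedged illustration only, the Python sketch below generates such a
description; the helper function, node names, and OS image name are
invented for this example and are not DETER tooling.

    # Hypothetical generator for an Emulab/DETER-style ns topology
    # script. Illustrative only; experiments are normally written
    # directly in ns-2 style Tcl.
    def ns_topology(nodes, links, os_image="FBSD-STD"):
        lines = ["set ns [new Simulator]", "source tb_compat.tcl", ""]
        for n in nodes:
            lines.append(f"set {n} [$ns node]")
            lines.append(f"tb-set-node-os ${n} {os_image}")
        for i, (a, b, bw, delay) in enumerate(links):
            lines.append(f"set link{i} "
                         f"[$ns duplex-link ${a} ${b} {bw} {delay} DropTail]")
        lines += ["", "$ns run"]
        return "\n".join(lines)

    print(ns_topology(
        nodes=["client", "router", "server"],
        links=[("client", "router", "100Mb", "10ms"),
               ("router", "server", "10Mb", "20ms")]))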
DETER is an open testbed, developed and supported as a
national resource by DHS, NSF, and additional sponsors.
DETER is administered using a project model. Any researcher
is eligible to become a project member. Eligibility of project
leaders is verified by confirming that the applicant is
employed by a known institution or otherwise has roots in the
community, for example through the production of prior
research output such as whitepapers, papers, presentations or
product catalogs. Project leaders approve membership of
regular users under their project, thus e.g., a faculty member
would approve his/her students. There is no cost to use the
DETER testbed and tools. There are also no special technical
requirements. The testbed can be accessed from any machine
that runs a web browser and has an SSH client. Experimental
nodes are accessed through a single portal node via SSH.
Under normal circumstances, no traffic is allowed to leave or
enter an experiment except via this SSH tunnel.
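For concreteness, the portal-mediated access pattern described above
can be scripted. The sketch below uses the third-party paramiko
library to hop through a gateway host to an inner node; the host
name, node name, and username are hypothetical stand-ins, not
DETER's actual naming scheme.

    # Sketch: reaching an experimental node through a portal host
    # via an SSH tunnel. Hostnames and usernames are hypothetical.
    import paramiko

    PORTAL = "portal.example.net"   # assumed portal node
    NODE = "node0.myexp.myproj"     # hypothetical experimental node

    portal = paramiko.SSHClient()
    portal.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    portal.connect(PORTAL, username="alice")  # assumes key-based auth

    # Open a channel from the portal to the experimental node ...
    channel = portal.get_transport().open_channel(
        "direct-tcpip", dest_addr=(NODE, 22), src_addr=("", 0))

    # ... and run a command on the node through that tunnel.
    node = paramiko.SSHClient()
    node.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    node.connect(NODE, username="alice", sock=channel)
    _, stdout, _ = node.exec_command("uname -a")
    print(stdout.read().decode())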
B. Who Uses DETER?
The DETER testbed and tools have been used by 1,387 users
(as of August 2010) from 14 countries and 4 continents. The
majority of DETER users are located in North America and
Europe, but lately we have seen increased use in regions
such as the Middle East, the Far East, India, and South
America. Around 70% of our users come from academia, 20%
from industry, and the remainder from government schools
and organizations.
C. What Can DETER Do Today?
DETER is used today to support a variety of cyber-security
research. The largest use occurs in the following research
fields: comprehensive security systems, privacy, spoofing,
worms, product evaluation, overlays as security solutions,
attack traceback, multicast security, wireless security,
intrusions, botnets, denial-of-service, malware, infrastructure
security and new security architectures. A significant number
of users use DETER as a platform to test new testbed
technologies, or to learn how to deploy facilities similar to
DETER at their own institutions. Finally, in the last three
years we have seen a dramatic increase in the number of
projects and users that use DETER in classroom environments
to supplement regular cyber-security education with practical
exercises. These exercises increase student interest in the
material and improve comprehension of topics taught in class. The
project strongly supports such use. DETER is currently setting
up a Moodle [5] server, slated to be made public in Fall 2010,
that will host educational content and facilitate its sharing
between teachers, and is adding account management and
scheduling enhancements designed to simplify classroom and
educational user administration.
The project has developed a number of new capabilities for
security research, in addition to the range of useful abilities
inherited from the Emulab software. Among those deployed
today are:
• An integrated experiment management and control
environment called SEER [6][7], with a rich set of
traffic generators and monitoring tools.
• The ability to run a small set of risky experiments in a
tightly controlled environment that maximizes research
utility and minimizes risk [8].
• The ability to run large-scale experiments through
federation [9] with other testbeds that run Emulab
software, and more recently with facilities that utilize
other classes of control software.
D. What Have We Learned?
While DETER’s contributions in the five years of its existence
have been significant, the experience of running the testbed
and developing tools for cyber-security experimentation has
helped us recognize significant new challenges in this field.
We believe that these challenges are major obstacles to easy,
versatile, realistic, scientific and repeatable experimentation
needed to advance the state of cyber defense research. These
challenges range from those that are purely technical to those
that are primarily human-oriented and community-focused.
A first challenge lies in the diversity of users and uses of
network testbeds, and how best to support them all within a fixed
budget and a shared infrastructure. Many testbed users have
insufficient systems knowledge to use the facility effectively,
while others may benefit from a focus on increased scientific
rigor. Each of these classes of users benefits from significant
help in learning how to use the testbed most effectively for
their research.
On the other hand, there are a few sophisticated users that
require advanced capabilities and need little guidance to
produce highly visible results in terms of publications and new
products. Here, the challenge is to most effectively get out of
the way, ensuring that mechanisms established for ease of use
and guidance of mainstream users are not impediments to use
by this sophisticated user class.
Finally, there is a large diversity of testbed use patterns
between the research, industry/government and education
communities. Researchers tend to investigate new phenomena
in a collaborative manner, and build unique tools for each new
project they undertake. Industry/government users tend to
focus on system evaluation and expect a standardized
evaluation environment and test cases. Educational users need
a special set of protections to minimize cheating and manage
both desired and undesired sharing of work across groups.
They further may lack sufficient systems background to
intuitively “jump in” to using a testbed, which leads to a steep
learning curve and few useful results.
A second challenge lies in the fast pace with which the cyber-
security field is moving. As new technologies, attacks and
defenses appear with great frequency, it is difficult for testbed
development efforts to offer detailed support for
experimentation in all emerging areas. Creation of new tools
and mechanisms requires significant domain expertise and
time investment. By the time support is available it may
already be obsolete. We believe that the only sustainable
approach is to adopt an overall focus on proactive
development of broad capabilities, rather than reactive
response to specific, narrow risks and attacks. We combine
this proactive overall focus with an effort to foster and
catalyze domain-specific user communities, offering
motivation and support for these communities to share,
exchange, and reuse the latest information and tools.
A third challenge lies in the need for determining and then
creating appropriate realism in the experimental environment.
Two separate factors are important. First, for any particular
experiment, it is necessary to understand which aspects of the
experiment must accurately reflect the technology’s actual
intended operating environment (the “real world”), and to
what degree. Second, for each such aspect, it is necessary to
develop an appropriate model or configuration for the
experiment.
For example, consider an Internet distributed denial of service
(DDOS) experiment. There is very little detailed data available
about Internet topology and traffic, and what is available must
be heavily transformed to create experiment-specific models
appropriate for use in a smaller-scale, low-diversity testbed.
This adds significant time and challenge to experimentation
that may not directly lead to publishable research, but is
necessary to facilitate it. Users tend to view this as overhead,
and consequently often design naïve experiments that
minimize this overhead but lead to flawed methodologies or
results.
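To illustrate why this transformation step is both necessary and
error-prone, consider the minimal sketch below (using the
third-party networkx package), in which a large scale-free graph
stands in for scarce measured Internet topology data and is naively
downscaled to testbed size. Real experiments require far more
careful transformations that preserve the properties under study;
all parameters here are invented.

    # Sketch: naive downscaling of an Internet-like topology model
    # to testbed scale.
    import networkx as nx

    # Stand-in for a measured AS-level topology.
    internet_model = nx.barabasi_albert_graph(10_000, m=2, seed=42)

    # Keep only the highest-degree nodes (the "core") and take the
    # induced subgraph -- simple, but it distorts path lengths,
    # degree mix, and traffic concentration, i.e. exactly the
    # properties a DDOS experiment may depend on.
    by_degree = sorted(internet_model.degree,
                       key=lambda nd: nd[1], reverse=True)
    kept = [n for n, _ in by_degree[:60]]
    testbed_topology = internet_model.subgraph(kept).copy()

    print(testbed_topology.number_of_nodes(), "nodes,",
          testbed_topology.number_of_edges(), "links")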
A fourth challenge lies in the unpredictability and complexity
of working with real hardware and software, especially at
large scale. It is difficult for users to verify that their
experiment’s behavior matches what is intended, and that the
results contain no misleading artifacts. It is even more difficult
to diagnose the source of an experimental artifact or error when
one occurs.
III. NEW DIRECTIONS FOR CYBER-SECURITY RESEARCH
AND DETER
The DETER project’s vision of the future of cyber security
experimentation is as large a step forward from the current
state of the art as the current DETER testbed is from the small
isolated testbeds traditionally seen before DETER. The future
DETER testbed will:
• support larger and more complex experiments,
• advance the scientific quality and accuracy of
experimental results,
• provide adaptive and domain-specific tools to help
users create new experiments and to deal with the great
complexity of large-scale cyber security experiments,
• build a knowledge base of experimental designs and
results,
• incorporate an extensible software architecture,
• provide a user-friendly interface for both novice and
experienced users,
• and, as a result of these advancements, support a
significantly larger and more diverse research
community.
We now provide more details about three major research and
development initiatives the DETER project is undertaking to
reach this future: advancing the science of experimentation,
advancing testbed technology, and supporting new application
domains. While DETER focuses on cyber-security
experimentation, these initiatives will advance the state of
network testbeds in general.
Figure 1: Experiment lifecycle
A. Creating an Advanced Scientific Instrument
As a scientific instrument, a network testbed must provide
repeatability, validity, and usability. It should advance the
scientific enterprise by helping experimenters to distinguish
valid results from artifacts, and to build on each other’s work.
Furthermore, it should provide a significant expansion of
experiment scope beyond that available today, for example:
support for structured experiment design and worst-case
experiments, multi-party experiments, multi-dimensional
experiments, and experiments that interact safely with the
outside world. We briefly outline each of these areas:
a. Worst-case experiments focus on explicitly identifying
and exercising worst-case or corner-case behaviors,
pushing to the breaking point one or more dimensions of
the technology or scenario under investigation. These
experiments closely resemble attack scenarios from real
life. Yet they are fundamentally different from
experiments commonly carried out today in network
testbeds, which avoid extremes to ensure determinism
and to focus on behavior when the technology is
operating properly. They are particularly challenging in
cases where the worst-case evaluation may misleadingly
trigger limitations of the testbed infrastructure, masking
the behavior of the actual scenario under test.
b. Multi-party experiments involve experimental scenarios
with two or more participating entities (human or
algorithmic), each with only partial visibility into the
modeled universe. These experiments closely resemble
security interactions in the real world. For example, a
multi-party experiment might combine experimenters
playing different roles (e.g., attacker/defender or local
facility administrator/global network security operator),
or experimenters with different levels of expertise.
Consider a concrete scenario. In an experiment
involving a malware defense system, the system and its
operators may have detailed knowledge of their own
network and host environment, but only limited and
inaccurate information about the overall network
topology and the actions of the distant attacker. Actions
taken to observe, analyze, and undermine the attack
must be carried out through this lens. Current network
testbeds do not support such information hiding and
fine-grain visibility control.
c. Multi-dimensional experiments investigate multiple
dimensions (technologies, phenomena, etc.)
simultaneously. Such experiments are difficult to specify
and construct, yet often crucial to obtaining useful
results. It is of limited use to investigate individual
technologies (OS, application, user-interface, node
security, network routing, network transport, network
authentication, network firewall, network IDS, etc.) in
isolation, due to coupling effects and system-level
behavioral constraints. Much of the complexity of security
experiments arises from the reality that security problems
and properties are emergent in the collection of technical
elements (i.e., inherently complex); it is rarely the case
that simpler scenarios can simply be studied independently
and then composed or “added together”.
d. Experiments that can interact with the larger world
outside the testbed are necessary to study current attack
trends. For example, many useful experiments require
some degree of interaction with the Internet, to capture
Internet properties such as fidelity, scale, and non-
determinism, or to interact with malware “in the wild”.
Yet such experiments are potentially risky. Even if built
for a benign purpose, these experiments may lead to
unexpected and undesired interactions with the outside
world, such as accidental use of the testbed to spread
malware to outside systems. What is required is, first,
methodologies to analyze and reason about such risks,
and second, new testbed technologies that preserve the
properties essential to the experiment but rigorously
minimize the risk.
We believe that the fundamental change needed to address the
above issues is a shift from focusing only on the runtime
configuration of an experiment at a highly concrete level, as is
the current practice in network testbeds, to capturing and
describing an experiment’s entire lifecycle at a more abstract
level, as shown in Figure 1. This includes such elements as the
experiment definition (topology, node configuration, traffic
generators, event generators, monitors), the workflow (set of
events that should occur in order or with specific timings to
define the experiment), the invariants (conditions necessary
for experiment success) and the analysis (types of statistics
collected and how they are processed and visualized). This
level of experiment description will advance the value of
DETER as a scientific instrument by:
• Supporting the use of models for classes of experiments. A
model is an abstract representation of an entire class of
experiments such as “worm propagation experiments”. It
defines actors (e.g., vulnerable and infected hosts), their
actions (e.g., once infected, start scanning randomly),
experiment invariants and relevant monitoring and analysis.
• Permitting the refinement of models into complete recipes.
This refinement occurs through concrete realization of a
model, including an interaction with users to bind specific
values to parameters (e.g., type of scanning = random). The
final outcome, a recipe, is a complete specification of an
experiment that can be directly run on the testbed (a minimal
sketch of this refinement appears after this list).
• Supporting replay and automatic operation of experiments,
including “clusters” of experiments that explore different
points in a design space. This support might include such
capabilities as
− Automatically invoking instrumentation and configuring
visualization of results.
− Facilitating the standardization of experimental
conditions, and facilitating the execution of multiple
experiments while varying key parameters or conditions.
− Facilitating the sharing of experimental setups and
results as recipes or models.
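As a minimal sketch of the model-to-recipe refinement referenced
above: the class below represents an abstract model whose
dimensions are bound to concrete values to produce a runnable
recipe. All names, dimensions, and parameter values are invented
for illustration; this is not DETER's experiment-description
machinery.

    # Sketch: refining an abstract experiment model into a recipe
    # by binding concrete values to its dimensions.
    from dataclasses import dataclass, field

    @dataclass
    class Model:
        name: str
        dimensions: dict = field(default_factory=dict)  # dim -> choices
        invariants: list = field(default_factory=list)

        def refine(self, **bindings):
            unbound = set(self.dimensions) - set(bindings)
            if unbound:
                raise ValueError(f"unbound dimensions: {unbound}")
            for dim, value in bindings.items():
                if value not in self.dimensions[dim]:
                    raise ValueError(f"{value!r} not allowed for {dim}")
            return {"model": self.name, "bindings": bindings,
                    "invariants": list(self.invariants)}

    worm = Model(
        name="worm-propagation",
        dimensions={"scanning": ["random", "sequential", "hitlist"],
                    "vulnerable_hosts": [100, 1000, 10000]},
        invariants=["no traffic leaves the testbed"])
    print(worm.refine(scanning="random", vulnerable_hosts=1000))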
This shift towards a richer model of experiment definition
represents a change in the basic research paradigm. It will
enable researchers to focus on their research and to create
sophisticated experiments that properly evaluate performance
of proposed technologies and algorithms. Better evaluation
practices and increased sharing should help to transform
research practice in the cyber security community from single-
point, naïve efforts into collaborative, evolving, and
sophisticated endeavors.
Another consequence of this shift will be to enable users to
build shared repositories of tools, data, experiments and
models for specific, popular research problems. This is very
important from the DETER project viewpoint, as it will foster
a more sustainable model of open-source and community
collaboration instead of the current centralized development
model, in which DETER staff provides tools and help to all
users.
Working towards this overall paradigm, a number of detailed
steps may be considered. The following paragraphs discuss a
subset of these details, which represent near-term aspects of
our work towards achieving the overall goal of creating an
advanced scientific instrument.
An experiment is defined along a number of dimensions.
Some dimensions are well known – e.g., topology or traffic
characteristics – while other dimensions may be specific to a
specific type of experiment (e.g. type of scanning for a worm
spread experiment). Dimension descriptions may range from
very concrete (as in a recipe) to very abstract (as in a model).
A model may generate detailed values for its dimensions, or it
may be queried to provide the details when needed. A model
may represent only a subset of the possible dimensions.
Composition of models for different dimensions will provide a
natural way to construct, reason about, and manipulate large,
complex experiments.
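A hedged sketch of such composition: below, partial models covering
different dimensions (topology, traffic, attack) are merged into
one experiment description, with overlapping dimensions rejected.
The dimension names and values are invented.

    # Sketch: composing per-dimension partial models into one
    # experiment description.
    def compose(*partial_models):
        experiment = {}
        for model in partial_models:
            overlap = experiment.keys() & model.keys()
            if overlap:
                raise ValueError(f"conflicting dimensions: {overlap}")
            experiment.update(model)
        return experiment

    topology_model = {"topology": {"kind": "dumbbell", "nodes": 40}}
    traffic_model = {"traffic": {"generator": "http", "rate": "5 req/s"}}
    attack_model = {"attack": {"type": "syn-flood", "sources": 20}}
    print(compose(topology_model, traffic_model, attack_model))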
A major hindrance to effective testbed use is the steep
learning curve needed to set up and orchestrate experiments.
Ideally, it would be possible to create a powerful but user-
friendly experimenter support system that meets the needs of
different levels of users – from novice to sophisticated – and
all user activities – from classroom exercises to product testing
to scientific research. The interface framework for designing
and executing experiments should be unified, flexible, and
easily extensible. We use the term “workbench” for such an
experimenter interface. This term pertains to a general user
interface and the backend that helps users design and
manipulate experiments.
Risky experiment management is enabled by applying a wide
variety of techniques, from a priori constraints placed on an
experiment's implementation, to verification of key
assumptions, to external enforcement of invariants by the
testbed infrastructure. All of these – constraint, verification,
and enforcement – rely upon formal representations of the
assumptions and behaviors that describe the semantics of the
experiment, and reasoning about these assumptions and
behaviors to create constraints that preserve function but
manage risk. These constraints are divided between those
defined by the experimenter – who knows the precise goals
and objectives of the experiment – and those defined by the
DETER testbed operators – who know the overall assurance
goals of the testbed, and the potential vulnerabilities and
mitigations available through different enforcement
mechanisms [8].
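The division of labor between experimenter-defined and
testbed-defined constraints might be expressed as in the sketch
below; the predicates are toy stand-ins for the formal
representations discussed above, not the mechanisms of [8].

    # Sketch: layering experimenter-defined constraints (preserve
    # the experiment's function) with testbed-defined constraints
    # (preserve the facility's assurance goals).
    experimenter_constraints = [
        ("malware may scan only inside the experiment",
         lambda state: state["scan_targets"] == "internal"),
    ]
    testbed_constraints = [
        ("no packets cross the testbed boundary",
         lambda state: not state["external_traffic"]),
    ]

    def check(state):
        for descr, holds in (experimenter_constraints
                             + testbed_constraints):
            if not holds(state):
                raise RuntimeError(f"constraint violated: {descr}")

    check({"scan_targets": "internal", "external_traffic": False})
    print("all constraints hold")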
Figure 2: Virtualization and embedding process in future DETER testbed
B. Advanced Testbed Technology
To explain this new capability, we contrast it with DETER’s
current Emulab-based mechanisms. Today, an experiment
description contains little more than the network wiring
diagram and some initial configuration information for each
node. An Emulab mechanism operating within DETER maps
this description to a set of physical nodes running the selected
OS and interconnects these nodes with VLANs that match the
network diagram.
In contrast, the advanced testbed technology we are
developing takes as input the experiment description created
by a researcher, using the advanced scientific instrument
mechanisms described in the previous section. Experiment
descriptions are composed of elements that may range from
concrete – physical nodes loaded with specific OS and
application software – to abstract models that can be
virtualized in a number of different ways, depending on the
required fidelity and experiment objectives. To guide this
process, descriptions incorporate semantic information such as
goals and invariants associated with the experiment. A novel
translation step, the virtualization engine, will then examine
each element and determine the appropriate physical or virtual
technology to realize that element on the testbed. Then, a
generalized embedder will allocate and configure the testbed
hardware and software resources needed to instantiate this set
of elements. Lastly, the federator will allocate the embedded
containers to physical nodes – optionally using remote as well
as local resources. The virtualization and embedding process
is shown in Figure 2.
We now give some examples of elements and the resource
containers in which they may be embedded to illustrate this
novel testbed technology. An element may be a single
computer running an OS, and as in Emulab, embedded as a
physical node running that OS, or alternatively as a virtual
guest OS on a virtualization layer that provides the necessary
properties to satisfy all invariants, including invariants for
performance (validity management) and containment of
malicious behavior (risky experiment management). On the
other hand, an element could be substantially more abstract –
for example, a host node within a 200,000 node botnet – that
would be embedded as thousands of ‘bot emulation threads on
hundreds of physical machines. An element could also be a
cluster of nodes together emulating the routing, loss, and delay
properties of a core network. Finally, an element could be a
specific switch, router, or firewall device – but rather than
using the real device or emulating the device on a general-
purpose PC, DETER would implement the device by loading
and configuring an FPGA-based platform – achieving both
high performance and a high degree of fidelity for complex
parallel algorithms – if that was the requirement for a given
experiment.
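A sketch of how a virtualization engine might dispatch such
elements to container plug-ins follows; the registry, element
kinds, and sizing rule are invented for illustration.

    # Sketch: a plug-in registry mapping experiment elements to the
    # container technology that will realize them.
    CONTAINER_PLUGINS = {}  # element kind -> embedding function

    def plugin(kind):
        def register(fn):
            CONTAINER_PLUGINS[kind] = fn
            return fn
        return register

    @plugin("physical-node")
    def embed_physical(element):
        return {"container": "bare-metal", "os": element["os"]}

    @plugin("bot-swarm")
    def embed_bots(element):
        # e.g., 200,000 bots as emulation threads spread over
        # hundreds of physical machines (invented sizing rule)
        return {"container": "emulation-threads",
                "machines": max(1, element["bots"] // 1000)}

    def virtualize(elements):
        return [CONTAINER_PLUGINS[e["kind"]](e) for e in elements]

    print(virtualize([{"kind": "physical-node", "os": "Linux"},
                      {"kind": "bot-swarm", "bots": 200_000}]))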
In summary, the result of deploying an experiment on the
future DETER testbed will be an engineered environment,
precisely tailored to support this experiment. It will consist of
an interconnected set of containers, each of which may be a
physical node, a virtual machine, a simulator thread realizing
some experiment-specific model, an FPGA-based platform
with customized firmware, or some other environment. The
virtualization and embedding process will not be restricted to a
fixed set of container classes nor to a fixed set of federated
testbed types. New container classes – virtual machine
technologies, simulators, emulators, or hybrids – will be added
as plug-ins to the virtualization engine to allow evolution of
the models we can support. Similarly, the current DETER
federation architecture [10] will be augmented with additional
plug-ins to support the use of testbed resources – both
hardware and software – on remote testbeds using both
Emulab-derived and radically different testbed control
software – such as a custom SCADA testbed.
This envisioned capability subsumes all of the current
functionality of the present DETER testbed and DETER
federation architecture, while radically generalizing in three
dimensions. First, it enables the target containers to be not
only physical nodes, but also any virtualization technology in
the most general sense – any configuration of hardware and/or
software that we engineer to re-create the behavior of an
element of a cyber-experiment at an appropriate level of
abstraction for that experiment – that is, with the semantics
needed for experimental validity in the case being addressed.
Second, it enables the creation of multi-party experiments,
above and beyond the use of federation to exploit resources on
remote testbed facilities. This is supported through an ability
to combine the recipes for different experiments into large
composed experiments, while respecting the semantics of each
sub-experiment including hiding or controlling information
flows between elements of the federated experiment. Third, it
provides for the mapping between the scientific recipe
describing the experiment and the engineered containers
realizing the experiment to be informed and controlled by the
semantics of the experiment – so that different mappings are
possible depending on exactly what the goals and objectives of
the researcher are for a given experiment.
C. New Application Domains
The goal of any network testbed, including DETER, is to
support research in key and emerging areas. While it is
difficult to predict which security areas will dominate the field
in the future, we briefly describe here the support we plan for
three specific domains that currently satisfy this criterion: large-
scale semi-self-organizing systems, critical infrastructure, and
wireless.
1) Large-Scale Semi-Self-Organizing Systems
Botnets are a current example of large-scale, semi-self-
organizing systems (hereafter, SSOs) that represent powerful
and versatile platforms for attackers. These systems are
ultimately characterized by the overall aggregate behavior of
thousands or millions of elements, rather than the individual
behavior of a single element.
Cyber security researchers will study this class of threat for
years to come. SSOs pose a fundamentally new domain of
malware whose systematic investigation will require a suite of
very different tools and techniques in DETER. SSOs differ
from viruses, worms, and earlier malware in that the system
operates semi-autonomously, yet the attacker remains actively
engaged at some level: managing the system, uploading new
software modules, and monitoring its effectiveness, its anti-
reverse-engineering protections, and its rate of growth and
persistence on the Internet. In short, SSOs can be thought of
as exhibiting a form of guided intelligence. The current
generation of botnet experimentation tools, such as SLINGbot
[11] and Rubot [12], emulates only the botnet communication
protocols. Emulating guided intelligence behavior at large
scale, or combining simulation with testbed emulation, is
needed to obtain realistic experiments in this research field.
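As a toy illustration of emulating many bots per physical node, the
asyncio sketch below runs thousands of lightweight "bot" tasks that
poll a controller queue. The command and behavior are invented, and
everything stays inside one process with no real network traffic.

    # Sketch: thousands of lightweight bot-emulation tasks on one
    # node, polling an in-process controller queue.
    import asyncio, random

    async def bot(bot_id, commands, results):
        while True:
            cmd = await commands.get()          # poll the controller
            await asyncio.sleep(random.uniform(0, 0.01))  # jitter
            results[bot_id] = cmd               # 'execute' the command
            commands.task_done()

    async def main(n_bots=5000):
        commands, results = asyncio.Queue(), {}
        bots = [asyncio.create_task(bot(i, commands, results))
                for i in range(n_bots)]
        for _ in range(n_bots):                 # one command per bot
            commands.put_nowait("scan-internal")
        await commands.join()
        for b in bots:
            b.cancel()
        print(len(results), "distinct bots ran a command")

    asyncio.run(main())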
2) Critical Infrastructure Support
The intersection of cyber security with critical infrastructure
lies in cyber-physical systems. Such systems, including power
grids, water, oil and gas distribution systems, control systems
for refineries and reactors, and transportation systems are all
vulnerable to attacks in both the cyber and physical realms,
separately or, more dangerously, in combination. Protecting
such systems requires the ability to model the reaction of such
cyber-physical systems to both kinds of attacks, and, more
interestingly, modeling the coupled behavior of the cyber and
physical systems to a single attack.
Our objective is to extend DETER to provide useful modeling
of such systems. Cyber-physical systems can be modeled in
DETER by simulating the physical system on some DETER
nodes operating as cluster computers, while emulating or
modeling the cyber control system using standard DETER
capabilities. Ties between the cyber and physical elements,
corresponding to the sensors and effectors that are present in
such infrastructure, must be modeled or emulated as well.
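A toy sketch of such coupling: a discrete-time water-tank model
stands in for the simulated physical system, and a proportional
controller stands in for the emulated cyber side; the sensor and
effector ties are the variables passed between them each step. All
dynamics and setpoints are invented.

    # Sketch: coupling a simulated physical process with an emulated
    # cyber controller via sensor/effector ties.
    def plant_step(level, inflow, outflow=1.0, dt=1.0):
        # physical model: tank level integrates inflow minus outflow
        return max(0.0, level + (inflow - outflow) * dt)

    def controller(sensed_level, setpoint=10.0, gain=0.5):
        # cyber side: proportional control of the inflow valve
        return max(0.0, gain * (setpoint - sensed_level))

    level = 0.0
    for step in range(30):
        sensed = level                  # sensor tie: physical -> cyber
        inflow = controller(sensed)     # effector tie: cyber -> physical
        level = plant_step(level, inflow)
    print(f"final tank level: {level:.2f}")  # converges near 8.0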
Critical infrastructure systems are naturally federated, with
portions of the physical infrastructure owned and operated by
multiple utilities, operating under oversight by multiple
jurisdictions. In the longer term we envision the ability to
model individual domains within such infrastructure on
separate testbeds, and to use testbed support for federation to
allow for participation in the modeling by actual operators,
with support for information hiding. For example, utilities
could participate in emulations of the larger system, modeling
their own response, without compromising their own sensitive
information on which those responses are based.
3) Wireless Support
Wireless security is an important and complex research field
that is not presently well supported by DETER. An ultimate
goal would be to enable long-term research covering current
and future wireless technologies as well as local area, cellular-
based, and wide area RF communications, and to support
network applications on many different platforms such as PCs,
handhelds/PDAs, and sensor networks. By adding wireless
support within the existing DETER security experimentation
framework we can leverage experimental tools such as traffic
generators that are common to all types of security research,
while integrating emulators and wireless-protocol specific
tools that span the physical link, network infrastructure, and
wireless application domains.
A crucial restriction with respect to effective support of
wireless experimentation is DETER’s current reliance on the
“nodes and links” resource allocation model inherited from
Emulab. Wireless communication, in contrast, requires a
different underlying model more tuned to broadcast media and
communication spaces. Development and implementation of
appropriate underlying semantics for wireless support at the
infrastructure level is a major challenge.
IV. CONCLUSIONS
The cyber threat changes rapidly, with an evolving landscape
of cyber attacks and vulnerabilities that require technologies
for prevention, detection, and response. One key aspect of a
cyber security strategy is the scientific test and analysis of new
and emerging technologies for cyber space. The DETER cyber
security testbed provides a unique resource for such testing,
and the work described here aims to provide new science-
based methods, tools, and technologies to advance the field of
cyber experimentation and test. In addition to the technology-
based contributions we are also engaged in efforts to increase
the community of testbed users, educate future researchers and
the next generation of computer operations staff, and provide a
platform for training of cyber security personnel and
postmortem analysis of their actions.
REFERENCES
[1] DETER Testbed Web page, http://www.deterlab.net
[2] DETER Project Web page, http://www.isi.edu/deter
[3] Terry Benzel, Robert Braden, Dongho Kim, Clifford Neuman, Anthony
Joseph, Keith Sklower, Ron Ostrenga and Stephen Schwab, “Experience
with DETER: A Testbed for Security Research,” in Proceedings of
Tridentcom, March 2006.
[4] Emulab Testbed Web page, http://www.emulab.net
[5] Moodle Course Management Tool, http://www.moodle.org
[6] SEER Experimentation Workbench, http://seer.isi.deterlab.net
[7] Stephen Schwab, Brett Wilson, Calvin Ko, and Alefiya Hussain, “SEER:
A Security Experimentation EnviRonment for DETER,” in Proceedings
of the DETER Community Workshop on Cyber Security Experimentation
and Test, August 2007.
[8] J. Wroclawski, J. Mirkovic, T. Faber and S. Schwab, “A Two-Constraint
Approach to Risky Cybersecurity Experiment Management,” Invited
paper at the Sarnoff Symposium, April 2008.
[9] T. Faber and J. Wroclawski, “A Federated Experiment Environment for
Emulab-based Testbeds,” in Proceedings of Tridentcom, 2009.
[10] DETER Federation Architecture, http://fedd.isi.deterlab.net
[11] Alden W. Jackson, David Lapsley, Christine Jones, Mudge Zatko, Chaos
Golubitsky, and W. Timothy Strayer, “SLINGbot: A System for Live
Investigation of Next Generation Botnets,” in Proceedings of the 2009
Cybersecurity Applications & Technology Conference for Homeland
Security, pp. 313-318, 2009.
[12] Christopher Patrick Lee, “Framework for Botnet Emulation and
Analysis,” Ph.D. Thesis, Georgia Institute of Technology, March 2009.