Ms. Terry Benzel's keynote presentation slides from the Annual Computer Security Applications Conference (ACSAC) on December 9, 2011. Ms. Benzel's presentation crystallizes many of the key concepts that she (principal investigator) and her team have been developing in The DETER Project (www.deter-project.org). It describes research focused on new transformational methods of increasing knowledge, incorporating higher-level semantic information about experiments, new approaches to scalable modeling and emulation, and techniques for increasing the efficiency and efficacy of experimentation. Further described at: http://www.deter-project.org/blog/deter_-_keynote_address_acsac_key_new_web_site
The document discusses intrusion alert correlation. It defines key terms such as correlation, event, alert, and alert correlation, and explains that the goals of correlation are to address weaknesses of individual intrusion detection systems, such as alert flooding, lack of context, and false positives/negatives. The main steps of the correlation process include alert collection, normalization, aggregation, verification, and the production of high-level alert structures. Specific correlation techniques are also discussed.
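The collection, normalization, and aggregation steps named above can be sketched in code. This is a minimal illustration only, not the document's actual algorithm; the field names, the 60-second window, and the grouping key are assumptions chosen for the example:

```python
from dataclasses import dataclass

@dataclass
class Alert:
    sensor: str     # which IDS produced the alert
    signature: str  # normalized rule/signature name
    src: str        # source IP
    dst: str        # destination IP
    ts: float       # epoch seconds

def normalize(raw: dict) -> Alert:
    """Map a sensor-specific record onto a common alert schema (hypothetical fields)."""
    return Alert(
        sensor=raw.get("sensor", "unknown"),
        signature=raw["sig"].strip().lower(),
        src=raw["src_ip"],
        dst=raw["dst_ip"],
        ts=float(raw["time"]),
    )

def aggregate(alerts: list[Alert], window: float = 60.0) -> list[list[Alert]]:
    """Group alerts sharing (signature, src, dst) that arrive within `window` seconds."""
    groups: list[list[Alert]] = []
    latest: dict[tuple, list[Alert]] = {}
    for a in sorted(alerts, key=lambda a: a.ts):
        key = (a.signature, a.src, a.dst)
        bucket = latest.get(key)
        if bucket is not None and a.ts - bucket[-1].ts <= window:
            bucket.append(a)        # duplicate of an ongoing group
        else:
            bucket = [a]            # start a new high-level group
            latest[key] = bucket
            groups.append(bucket)
    return groups
```

Verification and the construction of attack scenarios would follow as later stages; this sketch covers only the reduction from raw sensor records to deduplicated groups.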
The document discusses probabilistic seismic hazard analysis (PSHA) for evaluating seismic risks at US nuclear facilities. PSHA follows a structured process outlined in NUREG and SSHAC guidelines. It involves developing a seismic source characterization model and ground motion characterization model through expert elicitation to account for epistemic and aleatory uncertainty. The PSHA results provide seismic loads that are compared to structural capacity through fragility curves to determine risk and inform risk-informed regulatory decisions.
Finding Diversity In Remote Code Injection Exploits (amiable_indian)
1. The document analyzes the diversity among remote code injection exploits by collecting exploit samples from network traces, extracting and emulating shellcodes, and clustering the shellcodes based on an edit distance metric.
2. It finds that exploits can be grouped into families based on the vulnerability targeted. The LSASS and ISystemActivator exploit families show subtle variations among related exploits, while RemoteActivation exploits exhibit more diversity.
3. Analyzing exploit phylogenies reveals code sharing among families and subtle variations within families, providing insights into the emergence of polymorphism in malware payloads.
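The clustering approach described in these points can be illustrated with a small sketch: a Levenshtein edit distance over byte strings, with single-linkage grouping under a distance threshold. The threshold and sample payloads below are illustrative assumptions, not values from the paper:

```python
def edit_distance(a: bytes, b: bytes) -> int:
    """Classic Levenshtein distance via dynamic programming (two-row variant)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def cluster(samples: list[bytes], threshold: int) -> list[set[int]]:
    """Single-linkage clustering: a sample joins (and may bridge) any family
    containing a member within `threshold` edits."""
    clusters: list[set[int]] = []
    for i, s in enumerate(samples):
        linked = [c for c in clusters
                  if any(edit_distance(s, samples[j]) <= threshold for j in c)]
        merged = {i}
        for c in linked:          # merge every family this sample links to
            merged |= c
            clusters.remove(c)
        clusters.append(merged)
    return clusters
```

With real shellcode, the distance would typically be computed over decoded or emulated payloads rather than raw bytes, so that trivial encoder differences do not mask family relationships.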
IJERA (International Journal of Engineering Research and Applications) is an international, online, ... peer-reviewed journal. For more details, or to submit your article, please visit www.ijera.com
Current Developments in DETER Cybersecurity Testbed Technology (DETER-Project)
A refereed paper in the Proceedings of the Cybersecurity Applications & Technology Conference for Homeland Security (CATCH 2009). Abstract: "From its inception in 2004, the DETER testbed facility has provided effective, dedicated experimental resources and expertise to a broad range of academic, industrial and government researchers. Now, building on knowledge gained, the DETER developers and community are moving beyond the classic “testbed” model and towards the creation and deployment of fundamentally transformational cybersecurity research methodologies. This paper discusses underlying rationale, together with initial design and implementation, of key technical concepts that drive these transformations." Authors: Terry Benzel, Bob Braden, Ted Faber, Jelena Mirkovic, Steve Schwab, Karen Sollins, and John Wroclawski.
This document is Daniel Araújo Melo's 2014 master's dissertation which examines the ARCA (Alerts Root Cause Analysis) framework for analyzing intrusion detection system alerts. The dissertation describes modern malware propagation techniques, proposes methods for detecting malware through IDS alert analysis, and reducing false positives. It presents the ARCA framework which combines alert aggregation using relative uncertainty with the Apriori frequent itemset mining algorithm. Tests on real data showed an 88% reduction in alerts requiring analysis without prior network infrastructure knowledge.
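The Apriori step that ARCA combines with relative-uncertainty aggregation can be sketched as follows. This is a generic frequent-itemset miner applied to alert attribute sets, written for illustration; it is not the dissertation's implementation, and the attribute names in the usage are assumed:

```python
def apriori(transactions: list[frozenset], min_support: int) -> dict[frozenset, int]:
    """Return every itemset contained in at least `min_support` transactions,
    mapped to its support count."""
    def count(candidates):
        freq = {}
        for c in candidates:
            n = sum(1 for t in transactions if c <= t)
            if n >= min_support:
                freq[c] = n
        return freq

    # Level 1: frequent single items.
    items = {frozenset([i]) for t in transactions for i in t}
    frequent = count(items)
    result = dict(frequent)
    k = 2
    while frequent:
        # Candidate generation: unions of frequent (k-1)-itemsets of exactly size k.
        candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
        frequent = count(candidates)
        result.update(frequent)
        k += 1
    return result
```

Applied to alerts, each transaction would be the set of attributes of one alert (e.g. signature, source, destination port); large frequent itemsets then point at a common root cause that explains many alerts at once.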
This document provides an overview of intrusion detection systems (IDS), including their challenges, potential solutions, and future developments. It discusses how IDS aim to detect attacks against computer systems and networks. The challenges of high false alarm rates and dependency on the environment are outlined. Potential solutions explored include data mining, machine learning, and co-simulation mechanisms. Alarm correlation techniques are examined as ways to combine fragmented alert information to better interpret attack flows. Artificial intelligence is seen as important for improving IDS flexibility, adaptability, and pattern recognition.
Ehip4: Caring Through Sharing, Privacy and Security Technical Aspects, Riccardo ... (imec.archive)
This document discusses security and privacy concerns for electronic health information platforms (E-HIP). It analyzes threats at the business and technical levels, and proposes using centralized access control rules and decentralized identity providers to enforce security. For privacy, it suggests pseudonymizing identifiers when data is communicated across contexts, with some applications requiring reversible pseudonymization. The implementation involves users authenticating and an anonymizer permitting access according to security service rules.
Replay of Malicious Traffic in Network Testbeds (DETER-Project)
In this paper we present tools and methods to integrate attack measurements from the Internet with controlled experimentation on a network testbed. We show that this approach provides greater fidelity than synthetic models. We compare the statistical properties of real-world attacks with synthetically generated constant bit rate attacks on the testbed. Our results indicate that trace replay provides fine time-scale details that may be absent in constant bit rate attacks. Additionally, we demonstrate the effectiveness of our approach to study new and emerging attacks. We replay an Internet attack captured by the LANDER system on the DETERLab testbed within two hours.
Data and tools from the paper are available at: http://montage.deterlab.net/magi/hst2013tools
Also read the LANDER Blog entry at: http://ant.isi.edu/blog/?p=411
The Science of Cyber Security Experimentation: The DETER Project (DETER-Project)
Terry Benzel delivered the keynote address at the Annual Computer Security Applications Conference (ACSAC 2011). This document is the invited paper that accompanied her keynote.
Abstract: Since 2004, the DETER Cyber-security Project has worked to create an evolving infrastructure – facilities, tools, and processes – to provide a national resource for experimentation in cyber security. Building on our insights into requirements for cyber science and on lessons learned through 8 years of operation, we have made several transformative advances towards creating the next generation of DeterLab. These advances in experiment design and research methodology are yielding progressive improvements not only in experiment scale, complexity, diversity, and repeatability, but also in the ability of researchers to leverage prior experimental efforts of other researchers in the DeterLab user community. This paper describes the advances resulting in a new experimentation science and a transformed facility for cyber-security research development and evaluation.
Further described: http://www.deter-project.org/blog/deter_-_keynote_address_acsac_key_new_web_site
For additional information, visit:
- http://www.deter-project.org
- http://info.deterlab.net
First Steps Toward Scientific Cyber-Security Experimentation in Wide-Area Cyb... (DETER-Project)
Abstract: Steps towards an environment for repeatable and scalable experiments on wide-area cyber-physical systems. The cyber-physical systems that underlie the world's critical infrastructure are increasingly vulnerable to attack and failure. Our work has focused on secure and resilient communication technology for the electric power grid, a subset of the general cyber-physical problem. We have demonstrated tools and methodology for experimentation with GridStat, a middleware system designed to provide enhanced communication service for the grid, within the DeterLab cyber-security testbed. Experiment design tools for DeterLab and for GridStat will ease the creation and execution of relatively large experiments, and they should make this environment accessible to users inexperienced with cluster testbeds. This abstract presents brief overviews of DeterLab and of GridStat and describes their integration. It also describes a large scale GridStat/DeterLab experiment.
For more information, visit: http://www.deter-project.org
The document defines a structure as an assembly of elements that maintains its shape and unity in order to resist loads. It explains that a structure must be functional, safe, and economical, and that its behavior may be linear or nonlinear depending on factors such as deformations, materials, and applied loads.
The DETER Project: Advancing the Science of Cyber Security Experimentation an... (DETER-Project)
Abstract: Since 2004, the DETER Cybersecurity Testbed Project has worked to create the necessary infrastructure – facilities, tools, and processes – to provide a national resource for experimentation in cyber security. The next generation of DETER envisions several conceptual advances in testbed design and experimental research methodology, targeting improved experimental validity, enhanced usability, and increased size, complexity, and diversity of experiments. This paper outlines the DETER project's status and R&D directions.
For more information, visit: http://www.deter-project.org
In Quest of Benchmarking Security Risks to Cyber-Physical Systems (DETER-Project)
DeterLab provides the capability to conduct risk evaluations of cyber-physical systems, where the controllable variables range from IP-level dynamics to the introduction of malicious activity such as DDoS attacks. I recently co-authored an article, published in an IEEE magazine, that discusses how the cyber and physical aspects of such systems can be integrated to provide a CPS risk assessment environment.
In our article, titled "In the Quest of Benchmarking Security Risks to Cyber-Physical Systems" we present a generic yet practical framework for assessing security risks to cyber-physical systems. Our framework can be used to benchmark security risks when information is less than perfect, and interdependencies of physical and computational components may result in correlated failures. We focus on the risks that arise from interdependent reliability failures (faults) and security failures (attacks).
We advocate that a sound assessment of these risks requires explicit modeling of the effects of both technology-based defenses and institutions necessary for supporting them. Our game-theoretic approach to estimate security risks allows designing defenses that consider fault-tolerant control along with institutional structures.
Alefiya Hussain, University of Southern California
For information on DeterLab, visit: http://www.deter-project.org/deterlab-cyber-security-science-facility
The DETER Project: Towards Structural Advances in Experimental Cybersecurity ... (DETER-Project)
Abstract: It is widely argued that today's largely reactive, "respond and patch" approach to securing cyber systems must yield to a new, more rigorous, more proactive methodology. Achieving this transformation is a difficult challenge. Building on insights into requirements for cyber science and on experience gained through 8 years of operation, the DETER project is addressing one facet of this problem: the development of transformative advances in methodology and facilities for experimental cybersecurity research and system evaluation. These advances in experiment design and research methodology are yielding progressive improvements not only in experiment scale, complexity, diversity, and repeatability, but also in the ability of researchers to leverage prior experimental efforts of others within the community. We describe in this paper the trajectory of the DETER project towards a new experimental science and a transformed facility for cyber-security research development and evaluation.
For more information, visit: http://www.deter-project.org
This document presents a general introduction to bridges. It explains that a bridge carries a route across an obstacle and consists of a superstructure and a substructure. It then classifies bridges and describes the basic studies needed for their design, including topographic, hydrological, geological, seismic, and environmental-impact studies. Finally, it covers aspects of bridge geometry such as the cross section, roadway widths, shoulders, sidewalks, and barriers.
This document promotes investment in a multi-level marketing business opportunity selling water ionizer machines. It outlines three investment package levels and projections showing how investing and recruiting others can lead to millions in income within a few months. Critics note multi-level marketing schemes often overpromise income potential and lack transparency. The document aims to convince readers this is their chance to achieve financial freedom and a luxury lifestyle through this business.
Testimony of Terry V. Benzel, University of Southern California Information S... (DETER-Project)
1. The witness discusses the importance of expanding the scope of cybersecurity research to address evolving threats. Narrowly focused research is not enough to keep up with adversaries who plan attacks across systems.
2. Infrastructure is needed to enable experimental cybersecurity research and testing at scale. This allows innovations to be proven in realistic settings and better prepared for technology transfer.
3. New models are needed to successfully transfer university research into commercial products. Not all innovations work as expected outside controlled labs.
Taking Care of Yourself -- Even When It's Tough (LinkedIn)
As a women’s empowerment expert and social entrepreneur, Claudia Chan's life passion is to help women unlock their limitless potential in career and life, because she believes when women do better, the world does better. Here's some of Chan's best advice about what self-care is and how you can weave it into your life -- every day.
Connect: Professional Women's Network is an online community with more than 400,000 members that discusses issues relevant to women and their success. The free LinkedIn group, powered by Citi, also features video interviews with influential businesswomen, live Q&As with experts, and slideshows with career advice. To learn more and join the conversation in the largest women's group on LinkedIn, visit http://www.linkedin.com/womenconnect.
In honor of International Women's Day on March 8, we've compiled commentary from some of the most inspiring women driving change for gender equality.
Rakebul Hasan's document discusses explanation in the semantic web. It provides an overview of explanation in early expert systems and outlines several approaches to generating explanations on the semantic web, including representing justifications, provenance, and trust, and consuming explanations for both machines and humans. The document also discusses semantic web features such as collaboration, autonomy, and ontologies from an explanation perspective, along with several frameworks for generating explanations on the semantic web, such as Inference Web, Accountability in RDF, and Proof Explanation in Semantic Web. It concludes by noting areas for future work in generating and representing explanations.
Artificial intelligence began in the 1960s with early attempts at game playing, theorem proving, and problem solving. An expert system is a type of AI that attempts to provide answers to problems where human experts would normally be consulted. Expert systems use knowledge bases, inference engines, and other components to mimic human expertise in a specific domain. Virtual reality allows users to interact with simulated environments through technologies like head-mounted displays, CAVEs, and haptic interfaces.
The document provides an overview of expert systems and artificial intelligence. It defines key concepts such as artificial intelligence, expert systems, knowledge bases, inference engines, and knowledge representation. It also discusses applications of expert systems, the development process, benefits, limitations, and other areas of applied artificial intelligence like natural language processing, robotics, computer vision, and machine learning. Neural networks are also introduced as computing systems that mimic the human brain.
The PDX Splunk community came together for a fantastic in-person Splunk PNW User Group at Steeplejack Brewing Company in PDX! We had a great Detection Engineering walkthrough and demo from our sponsor Anvilogic, and Arcus Data gave a wonderful demo of both Edge Hub and AI Assist. See you again soon!
This document proposes a framework to enable flexible access control and cloud-based information sharing during emergency situations. The framework uses complex event processing to detect emergencies and then activates temporary access control policies and obligations to allow authorized users controlled access to resources needed for emergency response. It also explores using encryption and dynamic virtualization techniques to securely share information across multiple organizations' private clouds during emergencies.
The document discusses applying geospatial representation and forecasting models to improve chemical, biological, radiological, nuclear and explosive (CBRNE) defense. It proposes integrating CBRNE prediction, detection, and countermeasures with geospatial analysis. This would allow incorporation of mobile, wireless, and portable technologies. The goal is a smooth transition between combat, post-combat and civilian CBRNE situations. Challenges include differences between field and domestic environments and issues with sensors. The document outlines several proposed technologies, including the Nomad Eyes architecture for distributed sensor deployment using inverse modeling. It also discusses the ADaM software for real-time data processing and sensor devices like the portable OPA for chemical detection.
Sep 2009: Introduction to Medical Expert Decision Support Systems for Mayo Clinic (doc_vogt)
This document discusses expert systems and their potential application to medical decision support. It provides background on expert systems, describing their components like knowledge bases, inference engines, and explanation facilities. It also discusses different approaches to building expert systems, such as production rules, pattern recognition, fuzzy logic, and imagery analysis. The document then discusses some examples of medical expert systems from the past and potential benefits of developing new expert decision support systems.
Supporting Emergence: Interaction Design for Visual Analytics Approach to ESDA (Jesse Lingeman)
The document summarizes a presentation given at the NSF Workshop on From OpenSHAPA to Open Data Sharing. The presentation discussed visual analytics and its role in exploratory spatial data analysis (ESDA). It described how visual analytics combines interactive visualizations and analytical tools to enable rapid querying and interrogation of information to support sense-making. Challenges of visual analytics for analyzing large and diverse datasets were also presented, including issues of representation, missing/contradictory data, and uncertainty. Examples from security and library domains demonstrated multi-disciplinary approaches to visual analytics.
Keynote at the European Semantic Web Conference (ESWC 2006). The talk tries to figure out what the main scientific challenges are in Semantic Web research.
This talk was also recorded on video, and is available on-line at http://videolectures.net/eswc06_harmelen_wswnj/
Replay of Malicious Traffic in Network TestbedsDETER-Project
In this paper we present tools and methods to integrate attack measurements from the Internet with controlled experimentation on a network testbed. We show that this approach provides greater fidelity than synthetic models. We compare the statistical properties of real-world attacks with synthetically generated constant bit rate attacks on the testbed. Our results indicate that trace replay provides fine time-scale details that may be absent in constant bit rate attacks. Additionally, we demonstrate the effectiveness of our approach to study new and emerging attacks. We replay an Internet attack captured by the LANDER system on the DETERLab testbed within two hours.
Data and tools from the paper are available at: http://montage.deterlab.net/magi/hst2013tools
Also read the LANDER Blog entry at: http://ant.isi.edu/blog/?p=411
The Science of Cyber Security Experimentation: The DETER ProjectDETER-Project
Terry Benzel provided the keynote address at the 11th Annual Computer Security Applications Conference (ACSAC). This document is the invited paper that she addressed in her keynote.
Abstract: Since 2004, the DETER Cyber-security Project has worked to create an evolving infrastructure – facilities, tools, and processes – to provide a national resource for experimentation in cyber security. Building on our insights into requirements for cyber science and on lessons learned through 8 years of operation, we have made several transformative advances towards creating the next generation of DeterLab. These advances in experiment design and research methodology are yielding progressive improvements not only in experiment scale, complexity, diversity, and repeatability, but also in the ability of researchers to leverage prior experimental efforts of other researchers in the DeterLab user community. This paper describes the advances resulting in a new experimentation science and a transformed facility for cyber-security research development and evaluation.
Further described: http://www.deter-project.org/blog/deter_-_keynote_address_acsac_key_new_web_site
For additional information, visit:
- http://www.deter-project.org
- http://info.deterlab.net
First Steps Toward Scientific Cyber-Security Experimentation in Wide-Area Cyb...DETER-Project
Abstract: Steps towards an environment for repeatable and scalable experiments on wide-area cyber-physical systems. The cyber-physical systems that underlie the world's critical infrastructure are increasingly vulnerable to attack and failure. Our work has focused on secure and resilient communication technology for the electric power grid, a subset of the general cyber-physical problem. We have demonstrated tools and methodology for experimentation with GridStat, a middleware system designed to provide enhanced communication service for the grid, within the DeterLab cyber-security testbed. Experiment design tools for DeterLab and for GridStat will ease the creation and execution of relatively large experiments, and they should make this environment accessible to users inexperienced with cluster testbeds. This abstract presents brief overviews of DeterLab and of GridStat and describes their integration. It also describes a large scale GridStat/DeterLab experiment.
For more information, visit: http://www.deter-project.org
El documento define una estructura como un ensamblaje de elementos que mantiene su forma y unidad para resistir cargas. Explica que una estructura debe ser funcional, segura y económica, y que su comportamiento puede ser lineal o no lineal dependiendo de factores como las deformaciones, materiales y cargas aplicadas.
The DETER Project: Advancing the Science of Cyber Security Experimentation an...DETER-Project
Abstract: Since 2004, the DETER Cybersecurity Testbed Project has worked to create the necessary infrastructure – facilities, tools, and processes – to provide a national resource for experimentation in cyber security. The next generation of DETER envisions several conceptual advances in testbed design and experimental research methodology, targeting improved experimental validity, enhanced usability, and increased size, complexity, and diversity of experiments. This paper outlines the DETER project's status and R&D directions.
For more information, visit: http://www.deter-project.org
In Quest of Benchmarking Security Risks to Cyber-Physical SystemsDETER-Project
DeterLab provides the capability to conduct risk evaluations of cyber-physical systems where the controllable variables range from IP level dynamics to introduction of malicious entities such as DDoS attacks. I recently co-authored an article published in the IEEE Magazine that discusses how the cyber aspects and the physical aspects of such systems can be integrated together to provide a CPS risk assessment environment.
In our article, titled "In the Quest of Benchmarking Security Risks to Cyber-Physical Systems" we present a generic yet practical framework for assessing security risks to cyber-physical systems. Our framework can be used to benchmark security risks when information is less than perfect, and interdependencies of physical and computational components may result in correlated failures. We focus on the risks that arise from interdependent reliability failures (faults) and security failures (attacks).
We advocate that a sound assessment of these risks requires explicit modeling of the effects of both technology-based defenses and institutions necessary for supporting them. Our game-theoretic approach to estimate security risks allows designing defenses that consider fault-tolerant control along with institutional structures.
Alefiya Hussain, University of Southern California
For information on DeterLab, visit: http://www.deter-project.org/deterlab-cyber-security-science-facility
The DETER Project: Towards Structural Advances in Experimental Cybersecurity ...DETER-Project
Abstract: It is widely argued that today's largely reactive, "respond and patch" approach to securing cyber systems must yield to a new, more rigorous, more proactive methodology. Achieving this transformation is a difficult challenge. Building on insights into requirements for cyber science and on experience gained through 8 years of operation, the DETER project is addressing one facet of this problem: the development of transformative advances in methodology and facilities for experimental cybersecurity research and system evaluation. These advances in experiment design and research methodology are yielding progressive improvements not only in experiment scale, complexity, diversity, and repeatability, but also in the ability of researchers to leverage prior experimental efforts of others within the community. We describe in this paper the trajectory of the DETER project towards a new experimental science and a transformed facility for cyber-security research development and evaluation.
For more information, visit: http://www.deter-project.org
Testimony of Terry V. Benzel, University of Southern California Information S...DETER-Project
1. The witness discusses the importance of expanding the scope of cybersecurity research to address evolving threats. Narrowly focused research is not enough to keep up with adversaries who plan attacks across systems.
2. Infrastructure is needed to enable experimental cybersecurity research and testing at scale. This allows innovations to be proven in realistic settings and better prepared for technology transfer.
3. New models are needed to successfully transfer university research into commercial products. Not all innovations work as expected outside controlled labs.
The Science of Cyber Security Experimentation: The DETER Project
1. Terry Benzel, USC Information Sciences Institute
December 9, 2011
Annual Computer Security Applications Conference
2. Large, Complex, Interconnected
• Slow to evolve
• Legacy subsystems
• System of systems
• Connected cyber-physical systems
3. • Weapons evolve rapidly and proliferate widely
• Asymmetric warfare: attacks from anywhere, with unknown weapons
• Defenses must be known, effective, affordable
5. • Solution – build less vulnerable systems to begin with!
• Create fundamental understanding and reason about systems through experimental means
• Key aspect – enable science-based experimentation
• Hard problem
6. 1. Have an idea for a “new” tool that would “help” security
2. Program/assemble the tool (the majority of the work)
3. Put it on your local net
4. Attack your system
5. Show the tool repels the attack
6. Write up “the results” and open-source the tool
7. (optional) Start up a company which might succeed
7. • Perform experimental research of scale and complexity sufficient to the real world
• Extract understanding through experimental research
• Collect, leverage, and share experimental artifacts and learnings
8. • Class of experimental cyber science applied to sets of problems – networked cyber systems and often cyber-physical networked systems
• Goal – enable experimental cyber science aimed at the study of behavior and phenomena, providing fundamental understanding
9. • A research program:
  - To advance capabilities for experimental cybersecurity research
• A testbed facility:
  - To serve as a publicly available national resource…
  - …supporting a broad base of users and experiments
  - …and act as a technology transfer and evangelization vehicle for our and others’ research in experimental methodology
• A community building activity:
  - To foster and support collaborative science…
  - …effective and efficient leverage and sharing of knowledge
11. • Advance our understanding of experimental cybersecurity science and methodologies
  - Enable new levels of rigor and repeatability
  - Transform low-level results to high-level understanding
  - Broaden the domains of applicability
• Advance the technology of experimental infrastructure
  - Develop technologies with new levels of function, applicability, and scale
• Share knowledge, results, and operational capability
  - Facility, data, and tools
  - Community and knowledge
14. • The problem:
  - Today’s testbed technologies understand the syntax of experiments, but have no awareness of higher-level knowledge or semantics.
• The challenge:
  - Incorporate higher-level, semantic information about experiments and scenarios into our systems and tools, and
  - Use this knowledge to improve research quality and understanding.
15. • Uses higher-level knowledge about the scenario
  - Required invariants (things that must be true for the experiment to be valid)
  - Expected behavior
• Takes corrective or notification action if an invariant is violated
  - Monitor invariants
  - Trigger actions
16. • Captures invariants in explicit form for experiment reuse, repeatability and validation, etc.
• Must be true for the experiment to be valid
• High-level testing of invariants –
  - Understanding against data sets
  - Against constraints/invariants
• Also questions of modeling and scale –
  - Researcher intuition expressed as checkable invariants
• Specification for sharing
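The invariant idea on slides 15 and 16 can be sketched in code. This is an illustrative sketch only: the `Invariant` class, the `check_invariants` helper, and the sample predicates are assumptions made for this example, not DETER tooling or APIs.

```python
# Sketch: experiment invariants captured in explicit, checkable form,
# evaluated against experiment state, with an action fired on violation.
# All names and structure here are invented for illustration.

class Invariant:
    def __init__(self, name, predicate, on_violation):
        self.name = name                  # label, usable for sharing/reuse
        self.predicate = predicate        # state -> bool; must hold for validity
        self.on_violation = on_violation  # corrective/notification action

def check_invariants(invariants, state):
    """Evaluate each invariant against the current experiment state.
    Fires the violation action and returns the names of violated invariants."""
    violated = []
    for inv in invariants:
        if not inv.predicate(state):
            inv.on_violation(inv.name, state)
            violated.append(inv.name)
    return violated

# Example: a DDoS experiment is only valid while the control network is
# unaffected and background traffic stays within the modeled envelope.
alerts = []
invariants = [
    Invariant("control-net-reachable",
              lambda s: s["control_net_loss"] == 0.0,
              lambda n, s: alerts.append(n)),
    Invariant("background-traffic-sane",
              lambda s: s["bg_traffic_mbps"] < 100,
              lambda n, s: alerts.append(n)),
]

ok_state = {"control_net_loss": 0.0, "bg_traffic_mbps": 40}
bad_state = {"control_net_loss": 0.02, "bg_traffic_mbps": 40}
```

Because each invariant is explicit data rather than researcher intuition, the same list can be rerun against recorded data sets to validate a repeated experiment, which is the reuse point the slide makes.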
17. [Diagram: Semantic Analysis and Visualization Framework – experiment data is input as normalized events; models drive semantic visualization over data; define behavior, test it on data, gain understanding.]
19. Scenarios are captured by
• Environment – the conditions of the scenario
  - Virtual topology (varies with phenomenon); could be dynamic, abstract; expresses needs and constraints
  - Traffic, cross-traffic, cross-events, human actions, etc.
• Workflow – occurrences and events of interest
• Invariants – truths that must hold for correctness
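One way the three parts of a scenario named above (environment, workflow, invariants) could be captured as explicit, shareable data is sketched below. The field names and the `validate_scenario` check are invented for this sketch; they are not DETER's actual scenario format.

```python
# Illustrative scenario description: the environment expresses needs and
# constraints abstractly (not concrete hosts), the workflow lists events
# of interest in order, and invariants name truths that must hold.

scenario = {
    "environment": {
        # abstract topology; a concrete embedding is chosen later
        "topology": {"nodes": ["attacker", "victim", "router"],
                     "links": [("attacker", "router"), ("router", "victim")]},
        "traffic": ["web-cross-traffic", "dns-background"],
        "human_actions": ["operator-reboots-victim"],
    },
    "workflow": [  # occurrences and events of interest, in order
        "start-background-traffic",
        "launch-attack",
        "observe-defense",
        "collect-data",
    ],
    "invariants": [  # truths that must hold for correctness
        "control-plane-isolated-from-attack-traffic",
        "victim-load-below-hardware-limit",
    ],
}

def validate_scenario(s):
    """Minimal structural check: all three parts present and non-empty."""
    return all(bool(s.get(part)) for part in
               ("environment", "workflow", "invariants"))
```

Making the scenario a plain data structure is what enables the sharing and repeatability goals elsewhere in the deck: another researcher can rerun the same description against a different embedding.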
21. • The problem:
  - Traditional testbeds can model and emulate small systems at a fixed level of fidelity.
• The challenge:
  - Many real problems require modeling of large, complex systems at an appropriate (“good enough”) level of fidelity.
  - That level may be different for different parts of the modeled system.
  - Think of this as “smearing the computation power around to just where it’s needed”.
23. [Diagram: Command & Control (VMs), Victim (Physical Host), and Network – the same scenario shown both on physical hosts and in VMs.]
24. [Diagram: Interconnected Abstract Elements → Map Elements to Containers → Assign Containers to Resources; Federation Description → Federation Embedder → Federated System(s).]
• Abstract the “node” concept to multiple classes of containers
• Support a wide range of scalability-fidelity tradeoffs
  - Apply computational resources to key dimensions for the specific problem space
25. [Diagram: scalability-fidelity spectrum – production server software (Apache 2.2) on a full computer (8 GB memory, 4 CPUs), servers in VMs, and threaded routers in full emulation.]
30. • The problem:
  - Today’s testbed technologies provide limited support for complex user tasks, thus hampering system-of-systems-level experimentation and reasoning.
• The challenge:
  - Develop methodologies to leverage knowledge, understanding, and semantics through development environments, composition, and sharing.
32. • Most testbed tools focus on creating and running an experiment. Much less attention is paid to other important steps in the process.
• Develop a model for workflow over the full lifecycle of an experiment, and capture that model in methodologies and tools.
33. • Key observation: isomorphism to the software engineering lifecycle
• Implementation approach: leverage Eclipse
  - Repurpose tested SWE methodologies
  - Build on 20M+ lines of code
38. • Testbeds must model the impact of human activity in repeatable experiments
  - Provide more realistic behavior for testing security tools
  - But real humans are expensive and non-repeatable
• Model goal-directed team activity
  - Measure impact of an attack on team goals
  - Model impact of organization structure
• Model certain human characteristics
  - Propensity to make mistakes
  - Aspects of physiology (soon: emotion, bounded rationality)
  - Flexibility to changing conditions
• Configurable tool for experimenters
40. A general-purpose, flexible platform for modeling, emulation, and controlled study of large, complex networked systems
  - Elements located at USC/ISI (Los Angeles), UC Berkeley, and USC/ISI (Arlington, VA)
  - Funded by NSF and DHS, started in 2003
  - Based on Emulab software, with a focus on security experimentation
  - Shared resource – multiple simultaneous experiments subject to resource constraints
  - Open to academic, industrial, and government researchers essentially worldwide – very lightweight approval process
41. • ~440 PC-based nodes
  - Berkeley, CA – ~200 nodes
  - Los Angeles, CA – 220 nodes
  - Arlington, VA – 20 nodes
• Interconnect (2010)
  - 1 Gb/s – LA–UCB
  - 1–10 Gb/s – LA–Arlington
• Local and remote access
46. • Content sharing support
  - Experiments, data, models, recipes
  - Class materials, recent research results, ideas
• Shared spaces
  - Outreach: conferences, tutorials, presentations
  - Share and connect: website, exchange server
  - Common experiment description: templates
  - Build community knowledge: domain-specific communities
• Education support
  - NSF CCLI grant: develop hands-on exercises for classes
  - Moodle server for classes on DETER
47. Academia: Carnegie Mellon University; Columbia University; Cornell University; Dalhousie University; DePaul University; George Mason University; Georgia State University; Hokuriku Research Center; ICSI; IIT Delhi; IRTT; ISI; Johns Hopkins University; Lehigh University; MIT; New Jersey Institute of Technology; Norfolk State University; Pennsylvania State University; Purdue University; Rutgers University; Sao Paulo State University; Southern Illinois University; Texas A&M University; TU Berlin; TU Darmstadt; UC Berkeley; UC Davis; UC Irvine; UC Santa Cruz; UCLA; UCSD; UIUC; UNC Chapel Hill; UNC Charlotte; Universidad Michoacana de San Nicolas; Universita di Pisa; University of Advancing Technology; University of Illinois, Urbana-Champaign; University of Maryland; University of Massachusetts; University of Oregon; University of Southern California; University of Washington; University of Wisconsin - Madison; USC; UT Arlington; UT Austin; UT Dallas; Washington State University; Washington University in St. Louis; Western Michigan University; Xiangnan University; Youngstown State University
Government: Air Force Research Laboratory; DARPA; Lawrence Berkeley National Lab; Naval Postgraduate School; Sandia National Laboratories
Industry: Agnik, LLC; Aerospace Corporation; Backbone Security; BAE Systems, Inc.; BBN; Bell Labs; Cs3 Inc.; Distributed Infinity Inc.; EADS Innovation Works; FreeBSD Foundation; iCAST; Institute for Information Industry; Intel Research Berkeley; IntruGuard Devices, Inc.; Purple Streak; Secure64 Software Corp; Skaion Corporation; SPARTA; SRI International; Telcordia Technologies
49. • Hands-on exercises
• Students gain from direct observation of attacks and interaction
• Pre-packaged for both student and teacher
  - Buffer overflows, command injection, man-in-the-middle, worm modeling, botnets, and DoS
• Facility support for class administration
51. • Transformative research and facility for cyber security R&D
• Experimental science:
  - Fostering fundamental understanding of real-world complexity
• Contribution to the transformation of the field
• Toward proactive robustness and away from reactive security
52. • Growing DETER community increasingly engaged in the experimental science of cyber security
• Collaboration a key part of the DETER mission
  - DETERLab and new scientific experimentation
Join us: http://deter-project.org/