This document provides an overview of AccessData's Cyber Intelligence Response Technology (CIRT) platform. CIRT offers an integrated suite of digital forensics and incident response capabilities including network forensics, host-based forensics, data auditing, and malware analysis. Key features include an agent that can independently collect and store data from endpoints, a Cerberus module that analyzes files for malicious behaviors without signatures or prior knowledge, and modules for analyzing removable media, volatile memory, and network packet captures. The platform allows multiple teams such as incident response, computer forensics, and compliance to collaborate on investigations.
The New Mobile Landscape - OWASP Ireland - Tyler Shields
The document discusses threats to mobile devices and potential solutions. It outlines the mobile threat landscape including types of mobile malware, vulnerabilities, and statistics on infected platforms. It then examines players in the mobile ecosystem like MDM vendors, mobile anti-virus, application markets, and developers. Potential fixes are explored at the enterprise, consumer, vendor, and developer levels through capabilities mapping, malware detection, vulnerability analysis, and secure coding practices. The road ahead is framed as continued collaboration among these players and communities.
Survey of Rootkit Technologies and Their Impact on Digital Forensics - Tyler Shields
This document discusses the history and evolution of rootkit technologies and their impact on digital forensics. It begins with defining rootkits as code used by attackers to surreptitiously execute and control systems while remaining undetected. The document then covers: (1) the origins and evolution of rootkits from modifying system binaries in the 1980s to more advanced techniques today, (2) the five classes of rootkits - application, library, kernel, firmware, and virtualized, and (3) how rootkits aim to hide themselves and impede forensic investigation, posing challenges for incident response.
Instrumentation is a technique that adds extra code to a program or environment for monitoring or changing program behavior. It can be used for various purposes like performance analysis, automated debugging, error detection, and security testing. Instrumentation can be applied at different levels like source code, bytecode, or binary code. Common instrumentation techniques include static analysis, dynamic analysis, load-time instrumentation, and dynamic binary instrumentation. Instrumentation is useful for tasks like control flow analysis, vulnerability detection, malware analysis, and fuzzing.
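As a concrete illustration of dynamic instrumentation, the sketch below uses Python's standard `sys.settrace` hook to record every function call made while a piece of code runs; the `target` and `helper` functions are made up for the example, and real binary-instrumentation frameworks apply the same idea at the machine-code level.

```python
import sys

call_log = []  # records (function name, first line number) for each call event

def tracer(frame, event, arg):
    # The interpreter invokes this hook for frame events; we keep only "call".
    if event == "call":
        call_log.append((frame.f_code.co_name, frame.f_lineno))
    return tracer

def target(x):
    return helper(x) + 1

def helper(x):
    return x * 2

sys.settrace(tracer)       # instrumentation on
result = target(5)
sys.settrace(None)         # instrumentation off

print(result)                              # 11
print([name for name, _ in call_log])      # ['target', 'helper']
```

The monitored code is unchanged; the environment injects the extra behavior, which is what distinguishes instrumentation from ordinary logging.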
1) Software-defined networking (SDN) uses a decoupled architecture where the control plane is separated from the data plane to provide programmability of network behavior. This transition enables automation, elastic scaling, and improved security.
2) Software-defined security (SDS) extends this model to security controls by making them virtualized, programmable entities that can be centrally managed and automated.
Preventing The Next Data Breach Through Log Management - Novell
The document discusses how log management can be used for prevention, detection, and investigation of security incidents and data breaches. It explains that log management provides transparency by collecting logs from across an organization's IT infrastructure in a central location. This allows security teams to discover misconfigurations, unauthorized access attempts, and other anomalies that could indicate potential threats or actual security breaches. The document advocates for taking a preventative approach to security by using log data to monitor user activity and identity risks. It also promotes investing in security intelligence capabilities like security monitoring, analytics, and automated remediation.
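To make the detection idea concrete, here is a minimal sketch of scanning centrally collected logs for repeated failed logins; the log lines use a simplified, hypothetical syslog-like format, and the threshold of three failures is arbitrary.

```python
import re
from collections import Counter

# Hypothetical log lines collected from across the infrastructure.
logs = [
    "2024-05-01T10:02:11 sshd host1 FAILED login for root from 203.0.113.9",
    "2024-05-01T10:02:13 sshd host1 FAILED login for root from 203.0.113.9",
    "2024-05-01T10:02:15 sshd host1 FAILED login for admin from 203.0.113.9",
    "2024-05-01T10:05:00 sshd host2 ACCEPTED login for alice from 198.51.100.4",
]

FAILED = re.compile(r"FAILED login for (\S+) from (\S+)")

def failed_logins_by_source(lines):
    """Count failed login attempts per source IP across all collected logs."""
    counts = Counter()
    for line in lines:
        m = FAILED.search(line)
        if m:
            counts[m.group(2)] += 1
    return counts

counts = failed_logins_by_source(logs)
# Flag any source exceeding a simple threshold as a potential brute-force attempt.
suspicious = [ip for ip, n in counts.items() if n >= 3]
print(suspicious)  # ['203.0.113.9']
```

The value of centralization shows up here: host1 and host2 each see only part of the activity, but the combined stream reveals the pattern.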
This document provides an overview of intrusion detection systems (IDS), including their challenges, potential solutions, and future developments. It discusses how IDS aim to detect attacks against computer systems and networks. The challenges of high false alarm rates and dependency on the environment are outlined. Potential solutions explored include data mining, machine learning, and co-simulation mechanisms. Alarm correlation techniques are examined as ways to combine fragmented alert information to better interpret attack flows. Artificial intelligence is seen as important for improving IDS flexibility, adaptability, and pattern recognition.
Talk from the event "Cybersecurity: the new era in incident response and data auditing"
Jim Butterworth - Senior Cybersecurity Director, Guidance Software Inc.
Brasília, August 4, 2010
The document discusses disrupting cyber attacks using Splunk software. It provides an overview of Splunk's security capabilities such as monitoring known and unknown threats, security investigations, and fraud detection. It then demonstrates how to investigate a hypothetical security incident at a company called Buttercup Games. The investigation uses Splunk to trace an attack from initial website exploitation and phishing email through endpoint infection back to the root cause of a user opening a weaponized PDF file. The investigation illustrates how Splunk can disrupt the cyber kill chain by connecting threat indicators from multiple data sources to rapidly uncover attack details and attributes.
Lessons Learned Fighting Modern Cyberthreats in Critical ICS Networks - Angeloluca Barba
A presentation given in April 2019 in London during ICS Cyber Security Conference. I discuss an anonymized investigation conducted by our team to identify a real malware infection on a production network, the tools and techniques used to contain this threat and how to use threat intelligence and visibility to stay ahead of cyber adversaries.
Asset visibility and network baselining
Continuous network monitoring
Threat intelligence ingestion
Thorough incident response plans
Mobile phone forensics presents huge challenges for digital investigators due to the rapid evolution of mobile technology. While traditional computer forensics procedures are well established, mobile forensics is still developing appropriate processes due to mobile devices' increasing capabilities, data storage, and usage. Mobile devices now store vast amounts of personal and sensitive data and are commonly used for online activities, making them valuable sources of evidence but also targets for cybercrime like hacking and malware. Investigators face challenges in obtaining forensically sound evidence from mobile systems.
ServicePilot NBA for z/OS Datasheet [EN] - ServicePilot
ServicePilot NBA for z/OS is a solution that provides end-to-end visibility of application performance and dependencies on IBM mainframe systems. It monitors in real-time across LPARs with a single login, and generates reports to quickly identify and resolve issues. The tool pinpoints specific users and activities to determine causes of problems. It also detects security threats and unauthorized access through deep packet inspection and user-defined rules.
GTB DLP & IRM Solution Product and Deployment Overview - gtbsalesindia
The document provides an overview of the GTB DLP Suite, which includes four integrated modules: GTB Inspector, GTB eDiscovery, GTB IRM, and GTB Endpoint Protector. The GTB Inspector acts as a "reverse content-aware firewall" that monitors all network communications in real-time and blocks any detected violations of security policies. The GTB eDiscovery module discovers sensitive data on endpoints and applies IRM policies. GTB IRM enforces identity rights management policies on protected files. And GTB Endpoint Protector provides controls for removable media devices and audits file transfers to prevent data loss. The suite aims to prevent data loss, ensure compliance, and secure business processes through comprehensive network monitoring, file discovery, and endpoint controls.
Intrusion Detection Techniques for Mobile Wireless Networks - guest1b5f71
This document proposes techniques for intrusion detection in mobile wireless networks. It discusses vulnerabilities in these networks and existing IDS approaches. It then presents a distributed and cooperative architecture where each node has an IDS agent to monitor for local anomalies. An information-theoretic approach is used for anomaly detection modeling traffic patterns, routing activities, and topological changes. Experiments show that on-demand routing protocols like DSR and AODV work better than table-driven protocols for detection due to path and pattern redundancy. The proposed techniques aim to provide effective intrusion detection in mobile ad hoc networks.
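A minimal sketch of the information-theoretic idea: summarize routing activity as a sequence of neighbour identifiers and compare its Shannon entropy against a baseline. The traces and the 50% threshold below are invented for illustration, not taken from the paper.

```python
import math
from collections import Counter

def entropy(events):
    """Shannon entropy (bits) of the empirical distribution of events."""
    counts = Counter(events)
    total = len(events)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Baseline: routing updates spread evenly over four neighbours (high entropy).
baseline = ["n1", "n2", "n3", "n4"] * 5
# Observed: nearly all updates reference one neighbour (low entropy),
# which could indicate a sinkhole-style routing attack.
observed = ["n1"] * 18 + ["n2", "n3"]

h_base = entropy(baseline)
h_obs = entropy(observed)

# Flag an anomaly when entropy drops well below the baseline.
anomalous = h_obs < 0.5 * h_base
print(round(h_base, 2), round(h_obs, 2), anomalous)
```

The same entropy measure can be applied to traffic features or topology-change events; the strength of the approach is that it needs no attack signatures, only a model of normal behaviour.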
This document summarizes an enterprise data loss prevention solution from GTB. It provides sample customers ranging from large companies like Apple with 60,000 users to smaller credit unions. The solution includes a content-aware reverse firewall, endpoint DLP to discover, protect, audit and control devices, and an eDiscovery component to scan files and report on vulnerable files. Future plans include mobile device and network traffic protection. The solution aims to answer who is sending data, what data, and who is receiving it to help control unauthorized data loss.
This document provides an overview of intrusion detection systems (IDS) and Snort, an open source network-based IDS. It discusses the basic requirements, types (network-based, host-based, distributed), and approaches of IDS. It then focuses on Snort, describing its modes of operation, packet sniffing capabilities, and network intrusion detection. Key terms related to IDS are also defined. The document aims to introduce readers to IDS and Snort for monitoring network traffic and detecting intrusions and threats.
This document discusses Indicators of Compromise (IOCs) related to APT1, a Chinese cyber espionage group. It provides links to download the IOCs and explains how they can be used with Mandiant tools like Redline and MIR to detect malware. The document also defines IOCs and describes how the included IOCs were developed and may differ from other Mandiant IOCs. It notes that the IOCs focus on detecting known malware families and may not find new variants.
Dynamic IDP Signature Processing by Fast Elimination Using DFA - IJNSA Journal
Intrusion Detection and Prevention Systems (IDPS) generally aim at detecting and preventing attacks against information systems and networks. The basic task of an IDPS is to monitor network and system traffic for malicious packets or patterns and thereby prevent incidents that would leave systems in an insecure state. Monitoring is done by checking each packet against signatures formulated for identified vulnerabilities. Since signatures are the heart of an IDPS, this paper discusses two methodologies adopted in our research to improve current intrusion detection and prevention (IDP) systems. The first methodology, RUDRAA, is for formulating, verifying, and validating the potential signatures to be used with an IDPS. The second, DSP-FED, aims at processing signatures in less time with our proposed fast-elimination method using DFA. The research objectives of this project are: (1) to formulate and process potential IPS signatures for use with an intrusion prevention system, and (2) to propose a DFA-based approach to signature processing that, upon a pattern match, processes the signature faster, and otherwise eliminates it efficiently.
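As a rough sketch of what DFA-based signature processing looks like (this is the textbook KMP-style construction, not the paper's RUDRAA or DSP-FED methods), the Python below compiles one hypothetical signature into a DFA and scans each packet in a single pass, so a non-matching packet is eliminated in O(n) with no backtracking.

```python
def build_dfa(pattern, alphabet):
    """KMP-style DFA: dfa[state][char] -> next state; accepting state == len(pattern).
    Assumes a non-empty pattern whose characters all appear in `alphabet`."""
    m = len(pattern)
    dfa = [{c: 0 for c in alphabet} for _ in range(m)]
    dfa[0][pattern[0]] = 1
    x = 0  # restart state tracking the longest proper border
    for j in range(1, m):
        for c in alphabet:
            dfa[j][c] = dfa[x][c]       # copy mismatch transitions
        dfa[j][pattern[j]] = j + 1      # match transition
        x = dfa[x][pattern[j]]
    return dfa

def matches(dfa, m, packet):
    """Scan a packet in one O(len(packet)) pass; chars outside the alphabet reset to 0."""
    state = 0
    for c in packet:
        state = dfa[state].get(c, 0)
        if state == m:
            return True
    return False

alphabet = set("abcdefghijklmnopqrstuvwxyz/. ")
sig = "/etc/passwd"   # invented signature for illustration
dfa = build_dfa(sig, alphabet)

print(matches(dfa, len(sig), "get /etc/passwd now"))  # True
print(matches(dfa, len(sig), "benign payload"))       # False
```

Every input character costs exactly one table lookup, which is the property that makes DFA approaches attractive for line-rate elimination of non-matching traffic.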
The document discusses security best practices, focusing on the Microsoft Security Development Lifecycle (SDL). The SDL is a 6-month iterative process that includes threat modeling, secure coding guidelines, code reviews, testing, and response. It aims to integrate security into all phases of development. Key SDL principles discussed are attack surface reduction, basic privacy, threat modeling, defense in depth, least privilege, and secure defaults.
An IDS (intrusion detection system) is a device or software application that monitors network or system activities for malicious activity or policy violations and produces reports to a management station. IDS come in a variety of "flavors" and approach the goal of detecting suspicious traffic in different ways. There are network-based (NIDS) and host-based (HIDS) intrusion detection systems. Some systems may attempt to stop an intrusion attempt, but this is neither required nor expected of a monitoring system.
Rationalization and Defense in Depth - Two Steps Closer to the Cloud - Bob Rhubart
Security represents one of the biggest concerns about cloud computing. In this session we’ll get past the FUD with a real-world look at some key issues. We’ll discuss the infrastructure necessary to support rationalization and security services, explore architecture for defense-in-depth, and deal frankly with the good, the bad, and the ugly in Cloud security. (As presented by Dave Chappelle at OTN Architect Day in Chicago, October 24, 2011.)
The document discusses techniques for testing software security, as traditional testing methods are not well-suited for finding security bugs. It outlines several approaches for identifying unintended side effects, including monitoring for unexpected interactions with the environment, injecting faults to test error handling, and attacking dependencies and implementations. Specifically, the document recommends testing applications' use of resources like files, memory, and network availability under stressful conditions to identify potential vulnerabilities.
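The fault-injection idea can be sketched as follows: a hypothetical `load_config` function takes its file dependency as a parameter, and the test swaps in a replacement that raises an injected error to exercise the error-handling path.

```python
import io

def load_config(opener):
    """App code under test: must degrade gracefully when the file is unreadable."""
    try:
        with opener("config.ini") as f:
            return f.read()
    except OSError:
        return ""  # fall back to defaults instead of crashing

def normal_opener(path):
    # Stand-in for the real file; avoids touching the filesystem.
    return io.StringIO("debug=true")

def failing_opener(path):
    # Injected fault: simulate a disk or permission error at the dependency boundary.
    raise OSError("injected: disk failure")

ok = load_config(normal_opener)
degraded = load_config(failing_opener)
print(ok, repr(degraded))  # debug=true ''
```

The same pattern extends to memory exhaustion or network unavailability: the fault is injected at the boundary between the application and its environment, where security bugs tend to hide.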
Symantec announced new website security solutions including support for new SSL encryption algorithms like Elliptic Curve Cryptography (ECC) and Digital Signature Algorithm (DSA). ECC provides stronger encryption with shorter keys, improved server and desktop performance, and meets future security needs. Symantec is the first certificate authority to offer ECC commercially. The announcements also included new services like the Certificate Intelligence Center and Secure App Service to help customers manage certificates and code signing keys.
The document discusses various techniques for confining untrusted code, including running it at different levels of isolation such as in a separate hardware system, virtual machine, process, or thread. It describes approaches like system call interposition and software fault isolation that monitor applications and isolate their ability to access resources. The document also covers topics like rootkits, which can provide unauthorized access, and intrusion detection systems, which monitor networks for malicious activity.
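Real system call interposition happens at the OS boundary (e.g. via ptrace or seccomp); as a language-level analogue, this sketch interposes on Python's built-in `open` to audit every file-access attempt and enforce a made-up sandbox policy.

```python
import builtins

ALLOWED_PREFIXES = ("/tmp/",)  # hypothetical sandbox policy
_real_open = builtins.open
audit = []

def guarded_open(path, mode="r", *args, **kwargs):
    """Interpose on open(): log every attempt, then allow or deny per policy."""
    audit.append((path, mode))
    if not str(path).startswith(ALLOWED_PREFIXES):
        raise PermissionError(f"sandbox policy blocks access to {path}")
    return _real_open(path, mode, *args, **kwargs)

builtins.open = guarded_open   # install the interposition layer
blocked = False
try:
    open("/etc/shadow")
except PermissionError:
    blocked = True
finally:
    builtins.open = _real_open  # always restore the real call

print(blocked, audit)  # True [('/etc/shadow', 'r')]
```

The essential property mirrors the OS-level technique: the confined code is unmodified, and the monitor sits between it and the resource it wants.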
Secure Decisions is a company that helps analyze security data and decision processes to enhance cybersecurity. They develop visual analytics tools to help cyber analysts make sense of network and software vulnerability data. Some of their products include tools that visualize wireless network security data, software vulnerabilities from multiple scanners, and network flows and alerts. They also provide consulting services and have developed technologies like a cyber decision support system and a testbed for evaluating network monitoring algorithms. Their goal is to transition research into tools that can help operators in both government and commercial organizations.
Intrusion Detection Systems (IDSs) have become widely recognized as powerful tools for identifying, deterring, and deflecting malicious attacks over the network, and are designed and installed to deter or mitigate the damage caused by breaking into sensitive IT systems. Attacks can come from outside attackers on the Internet, authorized insiders who misuse the privileges they have been given, and unauthorized insiders who attempt to gain unauthorized privileges. IDSs cannot be used in isolation; they must be part of a larger framework of IT security measures. Essential to almost every intrusion detection system is the ability to search through packets and identify content that matches known attacks. Space- and time-efficient string matching algorithms are therefore important for identifying these packets at line rate. In this paper we examine string matching algorithms and their use for intrusion detection. Keywords: System Design, Network Algorithms
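For illustration, a compact Aho-Corasick automaton (one classic space- and time-efficient multi-pattern algorithm, not necessarily the one examined in the paper) checks a packet against all signatures in a single pass; the signature strings below are invented.

```python
from collections import deque

def build_automaton(patterns):
    """Aho-Corasick: trie + failure links, so one scan matches every pattern."""
    goto, fail, out = [{}], [0], [set()]
    for pat in patterns:                      # 1) build the trie
        state = 0
        for ch in pat:
            if ch not in goto[state]:
                goto.append({}); fail.append(0); out.append(set())
                goto[state][ch] = len(goto) - 1
            state = goto[state][ch]
        out[state].add(pat)
    queue = deque(goto[0].values())           # 2) BFS to set failure links
    while queue:
        s = queue.popleft()
        for ch, nxt in goto[s].items():
            queue.append(nxt)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[nxt] = goto[f][ch] if ch in goto[f] and goto[f][ch] != nxt else 0
            out[nxt] |= out[fail[nxt]]        # inherit matches from the suffix state
    return goto, fail, out

def scan(automaton, text):
    """One pass over the packet; cost is O(len(text)) plus reported matches."""
    goto, fail, out = automaton
    state, hits = 0, set()
    for ch in text:
        while state and ch not in goto[state]:
            state = fail[state]
        state = goto[state].get(ch, 0)
        hits |= out[state]
    return hits

ac = build_automaton(["cmd.exe", "/etc/passwd", "select "])
print(scan(ac, "GET /etc/passwd HTTP/1.1".lower()))  # {'/etc/passwd'}
```

Because the scan cost does not grow with the number of signatures, structures like this are what allow an IDS to keep up at line rate as rule sets reach into the thousands.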
This product brochure summarizes ManageEngine NetFlow Analyzer, a network traffic analysis and security tool. It provides broad network visibility and supports various flow technologies. The tool helps monitor network performance, security threats, and application usage to ensure business-critical services run optimally.
IPS Product Comparison of Cisco 4255 & TippingPoint 5000E - allengalvan
This document compares and contrasts two intrusion prevention systems (IPS): the Cisco 4255 IPS and the TippingPoint 5000E IPS. It describes how each system works, focusing on the Cisco system's three main components for risk rating, event correlation, and threat identification. Both systems perform stateful packet inspection and inline protection at gigabit speeds. The Cisco system integrates with TrendMicro for malware protection and uses additional methods for risk analysis and policy enforcement.
The document discusses various topics related to information security including security audits, application security testing, secure software development lifecycles, identity management, network security assessments, security design, vulnerability analysis, remediation recommendations, penetration testing, compliance testing, and security trainings. It also discusses motives for security incidents, system incident management, security monitoring tools, data leakage prevention, exfiltration threats, deep session inspection, social network risk mitigation, public key infrastructure systems, and port-based authentication. The presentation is in Polish and concludes by thanking the audience.
You have spent a ton of money on your security infrastructure. But how do you string all those pieces together to reduce time to respond, detect and prevent threats, and, most importantly, have your security team serve your business and mission? Learn how to organize your security resources to get the best benefit, and see a live demonstration of operationalizing those resources so your security teams can do more for your organization.
This product brochure summarizes ManageEngine NetFlow Analyzer, a network traffic analysis and security tool. It provides unparalleled network visibility [1] and supports various flow technologies. [2] The tool helps monitor network performance, security threats, and application usage to ensure business critical services run optimally. [3]
IPS Product Comparison of Cisco 4255 & TippingPoint 5000Eallengalvan
This document compares and contrasts two intrusion prevention systems (IPS): the Cisco 4255 IPS and the TippingPoint 5000E IPS. It describes how each system works, focusing on the Cisco system's three main components for risk rating, event correlation, and threat identification. Both systems perform stateful packet inspection and inline protection at gigabit speeds. The Cisco system integrates with TrendMicro for malware protection and uses additional methods for risk analysis and policy enforcement.
The document discusses various topics related to information security including security audits, application security testing, secure software development lifecycles, identity management, network security assessments, security design, vulnerability analysis, remediation recommendations, penetration testing, compliance testing, and security trainings. It also discusses motives for security incidents, system incident management, security monitoring tools, data leakage prevention, exfiltration threats, deep session inspection, social network risk mitigation, public key infrastructure systems, and port-based authentication. The presentation is in Polish and concludes by thanking the audience.
You have spent a ton of money on your security infrastructure. But how do you string all those things together so you can achieve your goals of reducing time to response, detecting, preventing threats. And most importantly, having your security team serve your business and mission. Learn how to organize your security resources to get the best benefit. See a live demonstration of operationalizing those resources so your security teams can do more for your organization.
Continuous Monitoring and Real Time Risk ScoringQ1 Labs
This document discusses continuous monitoring and real-time risk scoring. It describes how continuous monitoring can detect vulnerabilities and potential threats by monitoring network changes and comparing configurations to network activity. A two-phased compliance timeline is mentioned. Continuous monitoring provides a single console view of risk exposure needed to meet federal requirements. Metrics and risk-relevant data are discussed. The benefits of security intelligence across infrastructure for anomaly detection and actionable, informative outputs are presented.
The document discusses System Center Endpoint Protection 2012 which is integrated with System Center Configuration Manager 2012 to provide security and antimalware management for desktops, portable computers, and servers from a single infrastructure; it highlights features like improved protection against known and unknown threats, easy migration from previous versions, and role-based management.
Symantec Endpoint Detection and Response (EDR) allows security teams to rapidly detect threats, investigate incidents, and remediate impacted endpoints through features like machine learning, global threat intelligence, automated playbooks, and memory analysis. EDR integrates with security tools to enhance visibility and automation. It can be deployed both on-premises and in the cloud to extend security for organizations of any size.
The document discusses trends in IT security innovations and solutions. It covers topics like mobility raising security issues, common security problems in enterprises, and the need for monitoring systems, encryption, and network visibility solutions to address vulnerabilities. The presentation promotes specific products from SpectorSoft, PGP, and Lumension that can help with monitoring, encryption, and network access control.
Trend micro real time threat management press presentationAndrew Wong
Trend Micro is launching new real-time threat management solutions to address the insufficiency of traditional security against today's advanced threats. The solutions include the Trend Micro Threat Management System for network-wide visibility and control, the Threat Intelligence Manager for actionable threat intelligence, and vulnerability management services for timely patching. These solutions aim to detect, analyze, and remediate advanced threats in real-time through network monitoring, threat intelligence, and continuous vulnerability assessments.
IT Compliance and Governance with DLP Controls and Vulnerability Scanning Sof...Skoda Minotti
This document discusses how data loss prevention (DLP) controls and vulnerability scanning software can help with IT compliance and governance. It describes how DLP tools can aid in policy development, identify data to be protected, and provide audit reports. Vulnerability scanners can identify network device weaknesses and validate machine configurations. The document also provides an overview of a DLP solution from CTH Technologies that uses agents to monitor, analyze, and mitigate risk across desktops, customer and employee data, and applications.
Crafting Super-Powered Risk Assessments by Digital Defense Inc & VeracodeDigital Defense Inc
http://www.ddifrontline.com
Digital Defense Inc (DDI) and Veracode present the "Crafting Super-Powered Risk Assessments" webinar and slides. The presentation covers security assessments, application security, and how to manage risk.
This document summarizes an presentation about operationalizing security intelligence. It discusses three key aspects:
1. Using risk-based analytics to prioritize alerts based on correlating events over time and assigning risk scores to hosts. This helps determine which alerts require immediate investigation.
2. Adding context to alerts by integrating data from different technologies, matching context, and acquiring additional context through APIs. This provides more insight into prioritizing alerts.
3. Connecting security data with people by enabling human-mediated automation, collaboration, free-form investigation through interactive views and workflows. This allows leveraging all security data and human intuition in investigations.
The presentation promotes operationalizing security intelligence through these approaches and evaluating Spl
This document discusses computer forensics and incident response. It defines computer forensics as the application of investigation and analysis techniques to determine potential legal evidence from computers. It discusses the importance of the Daubert/Frye standards for admitting scientific evidence in court. It then describes how the EnCase digital investigations platform can help organizations integrate computer forensics into their incident response processes to more effectively investigate security incidents, comply with regulations, and proactively reduce risks.
Static Detection of Application BackdoorsTyler Shields
The document discusses detecting application backdoors through static analysis of executable code. It defines application backdoors as versions of legitimate software modified to bypass security under certain conditions. The summary discusses three main types of application backdoors that can be detected through static analysis:
1) Special credentials - Detecting hardcoded or computed credentials not from the authentication store.
2) Unintended network activity - Finding network activity not intended in the software design.
3) Deliberate information leakage - Identifying code that leaks sensitive information.
Static analysis rules can inspect for these patterns and other malicious indicators like embedded shell commands, time bombs, and rootkit-like behavior. Well-known backdoor mechanisms can be ob
This document discusses various aspects of cyber warfare and security. It introduces cyber deterrence and its challenges. It then describes components of a reference model for cyber security including surveillance, penetration testing, honey nets, forensics, attribution, monitoring, reconnaissance, scanning, vulnerability analysis and exploitation. For each component, it provides details on the concept and relevant tools. The document aims to provide an overview of the cyber warfare landscape and approaches.
Tech Throwdown: Secure Containerization vs WhitelistingInvincea, Inc.
To address the inadequacy of traditional anti-virus solutions, white-listing and secure containerization approaches have both gained traction in the enterprise. Both approaches have the overarching goal of preventing a successful breach at the endpoint, but each works differently and also focus on different parts of the cyber kill chain.
Invincea, a secure containerization solution, inoculates high-risk and Internet-facing applications against attack by running them in secure virtual containers, which have restricted access to the underlying host OS. This effectively removes the most common means of delivering the infection (see figure below). Any successful exploits of targeted applications (such as IE, Java, Flash, etc.), including by 0-day exploits, are kept safely in quarantine where additional forensic details may be uncovered.
Whitelisting attempts to prevent infections by allowing only certain known executables to run. This means whitelisting solutions will not see initial exploits; rather, whitelisting focuses on the next step beyond the exploit where many attacks then attempt to launch 2<sup>nd</sup> stage (malicious) executables with additional goals such as privilege escalation, lateral movement, or data exfiltration. In other words, whitelisting solutions do not have visibility into exploits of existing programs and for memory-resident malware. In addition, whitelisting solutions that prevent unknown software from running will flag legitimate software (such as patches) that are not updated with the whitelist.
This document discusses whether antivirus (AV) software is dead or just missing in action. It begins by comparing traditional, signature-based AV to next-generation security products that use techniques like machine learning and threat intelligence. The document then debunks common myths about AV and security technologies. It analyzes results from tests of next-generation security products on services like VirusTotal. The document concludes that while no single product can stop all threats, security defenses continue to evolve beyond traditional AV through layered approaches.
Is av dead or just missing in action - avar2016rajeshnikam
This document discusses whether antivirus (AV) software is dead or just missing in action. It begins by comparing traditional, signature-based AV to next-generation security products that use techniques like threat intelligence and machine learning. The document then debunks common security myths and discusses VirusTotal's role in evaluating next-gen AVs. Results from independent tests of various next-gen security products are presented. The document concludes that while no single product can solve all security issues, the approach to security needs to constantly evolve through layered defenses and beyond just next-gen hype.
The document discusses how Splunk can provide analytics-driven security for higher education through ingesting and analyzing machine data. It outlines how advanced threats have evolved to be more coordinated and evasive. A new approach is needed that fuses technology, human intuition, and processes like collaboration to detect attackers through contextual behavioral analysis of all available data. Examples are provided of security questions that can be answered through Splunk analytics.
CSF18 - Incident Response in the Cloud - Yuri DiogenesNCCOMMS
This document discusses how Azure Security Center (ASC) can help security operations centers (SOCs) with incident response in the cloud. ASC provides initial triage of security alerts and incidents, performs investigations across cloud and on-premises data sources, and gives SOC teams contextual awareness of incidents through linked alerts and machines. The document demonstrates ASC's capabilities through examples of detecting malware, exploiting processes, and responding to attacks.
Symantec Endpoint Protection 12 provides a single agent and console for antivirus, antispyware, firewall, and other protections across Windows and Mac devices. It uses a new Insight technology powered by data from over 175 million endpoints to detect emerging and mutated threats that evade traditional signature-based scanning. Insight analyzes factors like file age, frequency, location, and community reputation ratings to proactively protect against new threats. Testing shows Symantec provides the most effective security with fewer false positives than competitors like Sophos, Kaspersky, Trend Micro, Microsoft, and McAfee.
Similar to CYBER INTELLIGENCE & RESPONSE TECHNOLOGY (20)
1. Digital Investigations of Any Kind
ONE COMPANY
Cyber Intelligence Response Technology (CIRT)
www.accessdata.com
2. Who We Are
• AccessData has been in this industry for over 25 years
• Offices in Utah, Houston, San Francisco, London, Virginia, Maryland, Frankfurt, Dubai, Australia and China
• Market leader with best-of-breed technologies in forensics and eDiscovery
• 130,000+ clients globally
• Trains over 6,000 customers each year
• Sustained annual growth of 60%-80% year after year
• Recognized by Gartner as an Innovator in the space
4. A Shift from Disparate Solutions
Traditional approach: point solutions do not provide a true "360-degree" look at what is happening.
Paradigm shift: integrated analysis in a single platform with built-in remediation:
• Network forensics
• Host-based forensics
• Volatile data
• Removable media audit
• Data audit
• Malicious code analysis / threat scoring
Diagram callouts for malicious code indicators: high entropy, dynamic loading, imports of process-manipulation functions, imports of security functions.
5. CIRT Platform – Built on Validated Technology
Network forensics, host-based forensics, data audit, and volatile data.
6. CIRT – The Value of Integrated Analysis

CLASSIFIED DATA SPILLAGE: Agency proactively audits using terms such as "eyes only" and "top secret". All instances are flagged for removal in accordance with federal agency policies.

VIRTUAL WORKFORCE: Laptop checks in at intervals to be scanned for anomalies, which are all recorded, including network and USB activity. Remote monitoring helps to identify any instance of IP theft.

INTRUSION ALERT: Unauthorized port 443 traffic. Visualize communications, drill down into the suspect host, and perform behavioral forensic analysis. Honeypot avoidance, crypto, dynamic loading, high entropy and other criteria indicate malware. The batch remediation function is leveraged.

ADVANCED MALWARE AND ZERO-DAY DETECTION: Proactive monitoring identifies malicious code behaviors across multiple computers. Perform differential analysis of volatile data and malware analysis/threat scoring. Analysis reveals malicious processes. Scan the enterprise at large for the defined processes and/or similar behavior, issue batch remediation, and monitor for recurrence.

CREDIT CARD INFORMATION REPORTED: The help desk is called, alerting them that an employee discovered credit card information in an unsecured location. The company reactively conducts a PCI audit to locate exposed cardholder info. Instances are wiped and findings are reported.

Integrated Platform
7. Multi-Team Collaboration for Improved Emergency Response
• Incident Response Team
• Computer Forensics Team
• Information Security Team
• Network Security Team
• Compliance Team
8. Key Capabilities of the Agent Core
• Acts independently on/off the network
• Has its own scheduler and local policy cache
• Can be installed as persistent, or as self-dissolving after x number of days
• A run-time version of the agent allows full capability without the need to actually install the agent (this mode does not allow persistent/scheduled functions)
• Has a protected storage area to securely store payload until it can communicate back to the site server
9. The agent is made up of the following modules:
• Core: Responsible for managing communication, policy/job execution, and defensive measures, delivering payload, and updating itself.
• NetFS: Provides the filtering, searching, collection, and preservation capabilities (the same technology in the agent is what supports the network share capability).
• Cerberus: The ability to identify malware (with no prior knowledge) based on search/filter criteria, on a running system or on network shares across the enterprise. For example, a job could be defined to Stage 1 Cerberus-score every executable on a given set of systems; any files with a high threat score are automatically sent to Stage 2 Cerberus analysis. There are options to choose whether the files are preserved or just their metadata.
• Volatile: Users can set up jobs to scan the enterprise, capture volatile data, and interact with the data in review. The volatile data includes pre-built facets and the ability to view details for the entire volatile data payload: processes, network sockets, DLLs, handles, drivers, services, network devices, registry, and users.
• RAM: Users can set up jobs to scan the network and analyze RAM, either alongside volatile data or on its own, and interact with the data in review, with pre-built facets and detail views for all of the RAM analysis: processes, network sockets, DLLs, handles, drivers, services, network devices, processors, and registry.
• RMM (Removable Media Module): Enables targeted monitoring of files coming from and going to removable media (USB/FireWire/CD/DVD), with job options to record metadata only, or metadata and payload for documents based on user-defined extensions. Results can be viewed, filtered, and searched in the new review interface, with pre-made filter facets to quickly identify documents/files coming from or going to removable media.
• SilentRunner: Advanced host-based packet capture with robust filtering capabilities.
• Remediation: Allows for the killing of processes and wiping of files.
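The staged Cerberus workflow described above (a cheap Stage 1 score over every executable, with only high scorers promoted to the costlier Stage 2 analysis) can be sketched roughly as follows. The indicator names, weights, and threshold are illustrative assumptions for the sketch, not AccessData's actual values:

```python
# Hypothetical sketch of staged malware triage: score everything cheaply,
# escalate only high scorers to expensive disassembly analysis.

STAGE2_THRESHOLD = 70  # illustrative threat-score cutoff, not the product's

# Weights for cheap Stage 1 indicators (names are made up for the sketch).
WEIGHTS = {
    "high_entropy": 40,     # likely packed/obfuscated
    "unsigned": 15,         # digital signature missing or invalid
    "autorun_strings": 25,  # persistence indicators
    "irc_strings": 20,      # possible command-and-control channel
}

def stage1_score(indicators):
    """Sum the weights of the observed indicators, capped at 100."""
    return min(100, sum(WEIGHTS.get(i, 0) for i in indicators))

def triage(files):
    """files maps path -> set of Stage 1 indicators.
    Returns the paths whose score warrants Stage 2 disassembly."""
    return [p for p, ind in files.items()
            if stage1_score(ind) >= STAGE2_THRESHOLD]
```

In this shape, Stage 1 stays fast enough to run across an entire enterprise, while Stage 2 effort is spent only where the score justifies it.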
10. CIRT – SilentRunner Agent Module
Key capabilities – define operating parameters for the agent collector:
o on/off
o filter based on given IP addresses
o filter based on given ports, protocols, or applications
o filter based on given IP addresses <to-from> given ports/protocols
o define how much data can be collected
o define whether it stops collecting once it hits max collection, or keeps an open rolling buffer
These settings are applied as policy/operating parameters:
o specify a beginning and end for application of the policy
o adhere to a schedule
The pcap payload is securely stored on the agent. The agent will store and forward for ingestion into the centralized SilentRunner system for integrated and correlated analysis.
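The collection-limit choice above (stop once the cap is hit, versus an open rolling buffer) can be illustrated with a small sketch; the class and field names are hypothetical, not SilentRunner's actual API:

```python
from collections import deque

class CaptureBuffer:
    """Hypothetical per-agent pcap store with a byte budget."""

    def __init__(self, max_bytes, rolling):
        self.max_bytes = max_bytes
        self.rolling = rolling   # True: evict oldest; False: stop at cap
        self.used = 0
        self.packets = deque()

    def add(self, pkt):
        """Store one captured packet; returns False if it was dropped."""
        if self.rolling:
            # Open rolling buffer: evict oldest packets to make room.
            while self.packets and self.used + len(pkt) > self.max_bytes:
                self.used -= len(self.packets.popleft())
        elif self.used + len(pkt) > self.max_bytes:
            return False  # fixed budget: stop collecting at max collection
        self.packets.append(pkt)
        self.used += len(pkt)
        return True
```

A rolling buffer always holds the most recent traffic, which suits incident reconstruction; a fixed budget bounds disk use deterministically on the endpoint.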
11. Intro to Cerberus
• CIRT is the first step towards automated reverse engineering, so you can triage a binary before sending it for further analysis
• We tally all of the attributes we think are "interesting" into a score that you can sort by
• For each binary, you can then drill down into that score to see the attributes we found that were similar to malicious binaries we've seen in the past
12. What is Cerberus?
Cerberus reduces the level of expertise required to do malware analysis. Ideal for first responders.

STATIC ANALYSIS / DATA FLOW ANALYSIS YIELDS SIMILAR RESULTS TO DYNAMIC ANALYSIS

STAGE ONE: Generic File/Metadata Analysis
• Identifies potentially malicious code and generates a threat score.

STAGE TWO: Disassembly Analysis
• Runs elements of the code, without running the actual executable, to find out what the binary is capable of.

WORKS AGAINST…
• Binaries that live on disk or a network share
• System memory – unpacked binaries

Mythology trivia: Cerberus guards the gates of the underworld to prevent those who have crossed into Hades from escaping. In other words, he prevents bad things from breaking free.
13. Cerberus Analysis Approach
Cerberus uses a different approach than other products on the market because it does not rely on:
• Dynamic analysis: often not reliable, because the binary could recognize that it is being analyzed and perform a different action in order to intentionally fool the analyst.
• Traditional heuristics, such as monitoring modifications to the registry and the insertion of hooks into certain library or system interfaces: these are not based on the fundamental characteristics of malware, and suffer high false positive/false negative rates.
• Signature-based/byte-string analysis: cannot detect new malware or new variants, and requires prior knowledge in the form of an action or byte string.
NOTE: We are not relying on whitelists or signatures. We are able to assess behavior and identify intent without the above methodologies.
14. What Does Cerberus Do?

STAGE ONE ANALYSIS
Executable binary analysis:
• Product name
• Product version
• Company name, etc.
• Functions included in the import table: network, process, security, registry, dynamic loading, etc.
• Does the binary have high entropy (obfuscated)?
• Does the binary have signatures of: Internet Relay Chat ("IRC"), shellcode, cryptography ("crypto")?
• Does the binary contain strings associated with autoruns?
• Digital signature verification

STAGE TWO ANALYSIS
Basic disassembly analysis:
• Integrated disassembly engine
• If using network functionality, potentially what host it is communicating with and over what protocol(s)
• If using network functionality, can it bypass proxy servers?
• For functions that require usernames and/or passwords, does the executable contain static strings indicating insider or advanced knowledge?

Advanced disassembly analysis:
• Automated unpacking
• Automated code and data flow analysis
• More advanced functionality interpretation
• IP addresses and domain names used
• Debugger and sandbox avoidance
• Command-and-control functionality
• Hooking techniques
• Arbitrary code execution
• Host forensic artifacts: registry settings, temp files, configuration files
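The "high entropy (obfuscated)?" check in Stage One is a standard packed-binary heuristic: compressed or encrypted sections approach the 8 bits/byte maximum of Shannon entropy, while plain code and text score much lower. A minimal sketch, where the 7.2 cutoff is an illustrative assumption rather than Cerberus's actual threshold:

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy of a byte string, in bits per byte (0.0-8.0)."""
    if not data:
        return 0.0
    total = len(data)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(data).values())

# Illustrative cutoff: packed/encrypted sections usually score near 8.
PACKED_THRESHOLD = 7.2

def looks_packed(data):
    """Flag a buffer (e.g. a PE section) as likely packed/obfuscated."""
    return shannon_entropy(data) >= PACKED_THRESHOLD
```

In practice a tool like this would run per-section rather than over the whole file, since headers and resources dilute the entropy of a packed code section.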
25. So what?!
• This info will give you insight you've never had before, in seconds!
• Your reverse engineering team will love you, because you'll finally know what causes you concern other than "it looked weird"
• If you're a reverse engineer, this will save you a ton of time!
26. CIRT – Removable Media Module

Key Capabilities
• Supports data copied to or from removable media
o Data copied from a computer with the agent
o Data copied from removable media to a machine with the agent
• Configurable parameters for what gets captured on the agent, such as:
o Files with a given set of extensions
o Ability to turn it on/off
o Ability to turn it on/off between a date range
o Capture metadata only
o Capture the entire file
o Capture metadata for all files, but preserve files based on given filter criteria
o Ability to trigger capture based on a filename
o Ability to trigger capture based on file metadata (extension/filename)
• Ability to have triggers
o Does not track anything unless the file meets the filter criteria
• Ability to BLOCK any copy/paste operation to removable media
• Ability to track files opened from USB/removable media on the host computer
• Ability to view and analyze captured files as part of interactive review

Administrative Capabilities
• The operator can define parameters, apply policy/operating rules to the agent(s), and check status
• Ability to view activity in the form of reports: by user, by source, by date range
• The captured metadata is accessible to a 3rd-party application (such as ArcSight) that can query the tables containing this information: node name; name and extension of files copied to removable media; date/time a given item was copied to/from removable media
• Preserved data is temporarily stored on the host machine in protected storage until it is picked up for processing/reporting
• Ability to specify the maximum amount of storage that can be used
o Ability to specify what happens when the secure storage runs out of space: open buffer, or keep what it has and stop tracking
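The trigger behavior above ("does not track anything unless the file meets the filter criteria", with metadata-only versus full-payload capture) might look roughly like this; the extension sets and function name are hypothetical stand-ins for the operator-defined policy:

```python
# Hypothetical RMM-style capture filter: extension sets are operator policy.
WATCHED_EXTENSIONS = {".docx", ".xlsx", ".pdf"}   # record metadata for these
PRESERVE_PAYLOAD_FOR = {".docx"}                  # also keep the full file

def should_capture(filename):
    """Return (record_metadata, preserve_payload) for a file hitting
    removable media. Nothing is tracked unless the filter matches."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext not in WATCHED_EXTENSIONS:
        return (False, False)           # trigger not met: track nothing
    return (True, ext in PRESERVE_PAYLOAD_FOR)
```

Splitting the decision this way keeps endpoint storage small: most matches cost only a metadata row, and full payload is preserved only for the subset the policy singles out.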
30. CIRT – Architecture
Components in the diagram:
• Nodes with Proxy Agent
• Public Site Server
• SilentRunner (DB/Processing)
• Network shares (non-agent data sources)
• Private Site Servers
• Application/Web Logging DB (MS SQL)
• Web Console
• Agents (workstations/laptops/servers)
31. Thank You!
Jason Mical
Director of Network Forensics
AccessData Group
Editor's Notes
• Founded 1987. Privately funded/owned. Headquartered in Utah, US; London office (training); Frankfurt; Dubai. Market leader / best-of-breed forensic technologies. Best known for Forensic Toolkit® (FTK™). 130,000+ clients globally. Train more than 6,000 individuals annually. Sustained annual growth of 60%-80% YoY. Gartner – Innovator in the space.
• Agent-based policy and job engine. Example jobs: "Once a week, tell me what files have been added or removed"; "Every day, tell me about processes that are running, against a baseline."
• Application Server: manages workflow and eDiscovery operations within the application (orchestration services, business services, work distribution services). Web Server: provides web services for users to drive workflow/eDiscovery operations within the application; also hosts the website for data modeling (first-pass review). Collection Worker(s): the service that does the actual search and forensic-level collection from data sources (structured/unstructured/semi-structured); designed to scale up and out. Proxy Worker: manages collection from proxyable assets. Processing Worker: the service that performs post-collection processing of data – expands archives (PSTs/NSFs), indexes, runs de-duplication analysis, file identification, secondary culling/filtering, and production (scales up and will soon scale out). Processing Database: database that facilitates secondary culling/filtering, data modeling, searching, de-duplication and production (scales up). Orchestration and Logging Database: database that tracks all eDiscovery matters, workflows and operations. Agent: service that runs on target nodes providing secure forensic-level access and preservation of ESI. SilentRunner: network forensic capture and analysis engine.