This document summarizes the speaker's background and their company Risk I/O. It discusses how security is currently a "lemons market" that lacks incentives and has negative externalities. Risk I/O takes a data-driven approach to vulnerability management as a service. Several example use cases are provided, such as using data to predict vulnerabilities and breaches, perform CVE trend analysis, and calculate targets of opportunity based on vulnerability posture and threat activity. The document ends by inviting the audience to follow their blog and Twitter, and notes they are hiring.
For years businesses have been mining and culling data warehouses to measure every layer of their business right down to the clickstream information of their web sites. These business intelligence tools have helped organizations identify points of poor product performance, highlighting areas of current and potential future demand, key performance indicators, etc. In the information security field we still tend to look at our information in silos. Dedicated engineers solely focused on web application security, network security, compliance and so on, all while bemoaning a lack of support and information.
What if Information Security teams operated with the same insight as the product, marketing and business intelligence groups within their organization? Imagine if you had a data warehouse covering all of your applications, infrastructure, logs, vulnerability assessments, incidents, financial information, and meta data. What could you do with this readily available information?
By gathering and using both internal and public data, information security teams can utilize decision support systems allowing them to prioritize remediation efforts and react faster to issues. When looking through disparate data sources with a security lens, a security team can mine information that may expose threats through multiple vectors or paths.
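One way to picture this kind of decision support is a sketch that joins internal vulnerability data against a public exploit feed. Everything here is illustrative: the asset names, CVEs, and the hard-coded exploit set stand in for data you might pull from a source such as Exploit-DB or the Metasploit module list.

```python
# Hypothetical sketch: prioritize internal vulnerabilities by joining them
# against public exploit data. Field names and values are illustrative.

internal_vulns = [
    {"asset": "web-01", "cve": "CVE-2012-1823", "cvss": 7.5},
    {"asset": "db-02",  "cve": "CVE-2011-3192", "cvss": 5.0},
    {"asset": "app-03", "cve": "CVE-2012-0158", "cvss": 9.3},
]

# In practice this set would come from an exploit feed; hard-coded here.
publicly_exploited = {"CVE-2012-1823", "CVE-2012-0158"}

def priority(vuln):
    # Vulnerabilities with a known public exploit jump the queue;
    # sort by CVSS score within each tier.
    return (vuln["cve"] in publicly_exploited, vuln["cvss"])

for v in sorted(internal_vulns, key=priority, reverse=True):
    flag = "EXPLOIT PUBLIC" if v["cve"] in publicly_exploited else ""
    print(f'{v["asset"]:8} {v["cve"]:15} {v["cvss"]:4} {flag}')
```

The point is the join, not the scoring: the same internal data most teams already have becomes far more actionable once a public data source adds exploitability context.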
In this talk, Ed will cover some of the many sources of security data publicly available and how to apply them to add context to your security data and tools to help make more intelligent decisions. Ed also points out a number of ways to repurpose information and tools your company is already using in order to glean a clearer view into your information security program and the threats that may affect it.
A few years ago Alex Hutton coined the term Security Mendoza Line. It was in reference to Mario Mendoza, the baseball player whose batting average is often used as the baseline for how well a player must hit in order to stay in the major leagues and not be demoted. Keeping up with the attacks automated within Metasploit can often serve as that baseline within information security.
More recently, Josh Corman defined HD Moore's Law as "Casual Attacker power grows at the rate of Metasploit". In other words, that baseline is moving and we are not keeping up. In a hyped industry where much of the talk remains around Advanced Persistent Threats, it's the baseline that we continue to miss, as borne out in reports like Verizon's Data Breach Investigations Report. The most common breaches are most likely to be targets of opportunity, where the defenders have let the basics slip through the cracks.
In this talk, we will cover why paying attention to HD Moore's Law is important and how to stay on top of this changing threat measurement. We'll offer real-world examples of how an organization can identify where it stands against the Security Mendoza Line and how it can alert on and defend against falling below the baseline. Content will cover not only threats identified through Metasploit modules but also those surfaced through the myriad of exploit sources available across the internet.
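A minimal sketch of a Security Mendoza Line check might look like the following. The Metasploit CVE set and the open-vulnerability list are invented stand-ins; in practice you would pull the former from the Metasploit framework's module metadata or another exploit index, and the latter from your scanner.

```python
# Illustrative "Security Mendoza Line" check: what fraction of our open
# vulnerabilities can a casual attacker exploit with an off-the-shelf
# Metasploit module? All data below is hard-coded for the example.

metasploit_cves = {"CVE-2008-4250", "CVE-2010-2568", "CVE-2011-3544"}

open_cves = ["CVE-2008-4250", "CVE-2011-3544", "CVE-2013-0001", "CVE-2013-0002"]

below_line = [cve for cve in open_cves if cve in metasploit_cves]
ratio = len(below_line) / len(open_cves)

print(f"{len(below_line)}/{len(open_cves)} open vulns are casually exploitable")
if ratio > 0.25:  # arbitrary alerting threshold for the example
    print("ALERT: below the Security Mendoza Line -- remediate these first:")
    for cve in below_line:
        print(" ", cve)
```

Because the baseline moves at the rate of Metasploit, a check like this only works as a recurring job against a refreshed module list, not a one-time audit.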
That's So Meta: Gleaning Business Context In The Vulnerability Warehouse
Ed Bellis, HoneyApps
For years businesses have been mining and culling data warehouses to measure every layer of their business right down to the clickstream information of their web sites. These business intelligence tools have helped organizations identify points of poor product performance, highlighting areas of current and potential future demand, key performance indicators, etc. Imagine if you had a data warehouse covering all of your applications, infrastructure, logs, vulnerability assessments, incidents, financial information, and metadata. What could you do with this readily available information? In this talk, Ed will cover some of the many sources of security data publicly available and how to apply them to add context to your security data and tools to help make more intelligent decisions. Ed also points out a number of ways to repurpose information and tools your company is already using in order to glean a clearer view into your security program and the threats that may affect it.
2. Nice To Meet You
About Me
CoFounder HoneyApps
Former CISO Orbitz
Contributing Author
Beautiful Security
CSO Magazine/Online Writer
InfoSec Island Blogger
About Risk I/O
Data-Driven Vulnerability Management as a Service
16 Hot Startups - eWeek
3 Startups to Watch - Information Week
9. Example Use Case 2
HD Moore’s Law - Josh Corman
aka Security Mendoza Line
“Compute power grows at the rate of doubling about every 2 years”
“Casual attacker power grows at the rate of Metasploit”
10. Example Use Case 3
Predicting Vulnerability (or even breach)
Trending
Key Attributes
Outcomes
11. Example Use Case 4
CVSS & The Base Rate Fallacy (credit: Jeff Lowder)
16. Q&A
follow us
the blog: http://blog.risk.io/
twitter: @ebellis, @risk_io
And one more thing.... We’re Hiring! https://www.risk.io/jobs
Editor's Notes
From Shaman to Scientist - A Use Case in Data Driven Security
Talk about WEIS. Security is an opaque attribute within the software market. It is not easily apparent to the buyer how much security they are getting when they purchase software. This is similar to quality within the automotive industry. There are no good ways to determine what you are getting. This is a problem for the buyer, and we need to figure out how to make security more transparent to the software purchaser.
Developers are rarely incentivized by software security. Speed to market, functionality, and other code quality factors are often prioritized over secure code. Revenues and customer acquisition are rarely driven by security. This creates a lack of incentives around software security.
Security is a negative externality. This creates very big issues in the broader security of systems and the internet. A commonly used example of a negative externality in security is botnets. As an average user on the internet, I have very little incentive to secure my machine from becoming part of a botnet. Other than some bandwidth or system resource consumption, it doesn’t do me much harm. But those suffering a DDoS attack via a botnet bear the consequences of the average user not protecting their machine. In other words, those with the power to protect are not incentivized to do so.
Proving the negative is hard. Why not just sell on emotion? Talk about secrecy of controls - no sharing of data except among the bad guys - follow best practices lest you be hacked. Use the "data proves negligence" example. Lawyers suck at risk management.
We need to take a more data-driven approach to security, relying on metrics and, yes, in some cases, real live outcomes and evidence. There are a lot of complaints in our field about a lack of information, and while I don’t disagree, oftentimes we are not even using the information that we have! I’m going to walk through a few use cases. These are all baby steps toward where we eventually need to be, but we have to start somewhere: less secrecy and religion, more openness and information sharing. In order to take the first steps, we have to get our own house in order.
Simple example of using what we have. Sprinkle in some metadata!
Metasploit has become table stakes.
A lot of different attributes could go into determining the “why”. Is a particular team less responsive to patching and updates? Is it the technology stack that is more prone to vulnerability or misconfiguration? Are there other environmental reasons? By determining root cause, you may more accurately predict the next issue as well as risk-rank new projects or applications prior to deployment. By combining vulnerability, misconfiguration, defect, and issue data with operational data such as logs and events, threat feeds, and breach data (we need more of this), we could also extend our predictive analytics to security breaches, not just issues.
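The risk-ranking idea in the note above can be sketched very simply: score a new project by the historical vulnerability rate of similar past projects. The attributes, weights, and numbers below are invented for illustration; a real model would use many more attributes and proper statistics.

```python
# Toy sketch: risk-rank new projects before deployment using historical
# attributes (team, technology stack). All data is invented.

history = {
    # (team, stack) -> observed vulnerabilities per release in past projects
    ("team-a", "php"):  4.0,
    ("team-a", "java"): 1.5,
    ("team-b", "php"):  2.0,
}

def expected_vulns(team, stack):
    # Fall back to the overall mean when we have no history for this pair.
    if (team, stack) in history:
        return history[(team, stack)]
    return sum(history.values()) / len(history)

new_projects = [("team-a", "php"), ("team-b", "java")]
ranked = sorted(new_projects, key=lambda p: expected_vulns(*p), reverse=True)
# Highest expected-vulnerability projects get security attention first.
```

Even this crude lookup captures the core move: stop treating every new project as equally risky and let root-cause attributes drive where assessment effort goes.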
Making more meaningful priority decisions (credit: Jeff Lowder). CVSS ignores information about the base rates of vulnerability exploitation. I have an older version of Apache that, if I were to upgrade to the current version, would eliminate 14 vulnerabilities. I also have an older version of Tomcat that, if I were to upgrade to the current version, would eliminate 9 vulnerabilities. If CVSS included a base rate on these reference classes, this would help me prioritize my remediation resources more appropriately.
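The Apache/Tomcat example above reduces to one line of arithmetic. The base rates here (the historical probability that a vulnerability in each product gets exploited) are invented numbers purely to show how they can flip a priority decision.

```python
# Worked version of the base-rate example. Base rates are invented.

apache_vulns, apache_base_rate = 14, 0.10   # upgrade closes 14 vulns
tomcat_vulns, tomcat_base_rate = 9, 0.20    # upgrade closes 9 vulns

# Expected exploitations avoided = vulns closed * base rate of exploitation.
apache_expected = apache_vulns * apache_base_rate
tomcat_expected = tomcat_vulns * tomcat_base_rate

# Counting vulnerabilities alone says "patch Apache first" (14 > 9);
# factoring in base rates flips the priority (1.8 > 1.4).
print("Upgrade Tomcat first" if tomcat_expected > apache_expected
      else "Upgrade Apache first")
```

This is exactly the information a raw CVSS score throws away: severity without a base rate tells you how bad exploitation would be, not how likely it is.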
A study by Thomas Zimmermann of Microsoft and Stephan Neuhaus mines the CVE database looking at all sorts of trends. It’s a good paper. There’s a table near the end that clearly shows the increase in vulnerabilities at the application layer alongside a decrease in many of the more traditional network vulnerabilities over time. Yet we continue to prioritize our spending and resources on the attacks of five-plus years ago.
Insert Alex Hutton formula.
4:14:40 PM Alex Hutton: once the data hits catwalk we decision and the threat is funneled off....
4:15:46 PM Ed Bellis: haha its a new risk language
4:16:05 PM Ed Bellis: you guys pull a lot of public sources of data?
4:16:16 PM Alex Hutton: not yet
4:16:21 PM Alex Hutton: some but not a ton
4:16:24 PM Ed Bellis: also any insight on whether VERIS taking off?
4:16:41 PM Alex Hutton: there's a broader vision (you can fit) that would make a lot of sense
4:17:02 PM Alex Hutton: but that would take cooperation from various sources which is going to be hard, and why VERIS is a bit stalled
4:17:17 PM Alex Hutton: there are lots of folks using VERIS, but not sharing data
4:17:51 PM Ed Bellis: i'd love to tap into that
4:19:34 PM Alex Hutton: well, the better vision is for you to feed data along with threat (and incident data) into decision making on a daily or even hourly basis
4:19:50 PM Alex Hutton: In a sense, you hold not only the orgs perimeter, but a broad sample of perimeters
4:20:04 PM Alex Hutton: and their
4:20:12 PM Alex Hutton: vuln posture
4:20:38 PM Alex Hutton: So
4:21:07 PM Ed Bellis: right we want to move into a position where our data becomes the primary asset to help make better security decisions
4:21:16 PM Ed Bellis: *not a verb*
4:21:49 PM Alex Hutton: My(vuln posture * other threat activity) / (other vuln posture * other threat activity) = a much better metric than just vuln data
4:22:24 PM Ed Bellis: meaning am i safer than my neighbor?
4:22:25 PM Alex Hutton: in fact, a very interesting likelihood data point to do some degree of probabilistic analysis on that I don't have the time to really explore yet, but is very, very interesting
4:23:10 PM Alex Hutton: that may be a byproduct, more about establishing a rate at which I can expect my luck to run out.
4:23:23 PM Ed Bellis: i see
4:23:35 PM Alex Hutton: so you're the CSO of Orbitz, you know you have a vuln. to specific threat activity. So far, you're lucky
4:24:02 PM Alex Hutton: but what about establishing rate of threat activity vs. how pervasive that same vuln is around other orgs?
4:24:31 PM Ed Bellis: where do we get the rate of threat activity?
4:24:42 PM Ed Bellis: breach db's?
4:24:51 PM Alex Hutton: High Threat Activity + Low pervasiveness in the aggregate population = Luck running out.
4:25:06 PM Alex Hutton: nope. that's part of your exit strategy
4:25:19 PM Ed Bellis: bought by verizon ?
4:25:33 PM Alex Hutton: Some MSSP who is running or aggregating WAF data, Firewall/IDS/IPS data
4:25:58 PM Alex Hutton: you as "Yet another managed service" well, that's a scenario I believe in, yes.
4:26:20 PM Alex Hutton: But you as "Next generation in managed services by data aggregation" is going to be very, very compelling, I think
4:26:46 PM Alex Hutton: what you have to understand (and I think you do) is how to leverage your data into "decision" (the verb)
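A literal reading of the ratio Hutton sketches in that conversation can be written down in a few lines. All of the numbers below are invented; the interesting part is the shape of the metric, which compares your exposure to a vulnerability against its prevalence and threat activity across the aggregate population.

```python
# Sketch of the relative-exposure ratio from the chat log:
# (my vuln posture * threat activity) / (population posture * threat activity).
# Every number is invented for illustration.

my_posture, my_threat = 12, 5        # my open instances of a vuln, attacks seen
pop_posture, pop_threat = 300, 40    # same vuln across the broader population

relative_exposure = (my_posture * my_threat) / (pop_posture * pop_threat)

# High threat activity against a vuln that is rare in the aggregate
# population is the "luck running out" signal from the conversation.
print(f"relative exposure: {relative_exposure:.4f}")
```

As the chat notes, the hard part is not the arithmetic but sourcing the population-wide terms, which is exactly why aggregated data (VERIS, MSSP feeds, shared perimeter data) matters.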
Talk about infosec vs. fraud.
This is a great TED talk about the medical industry, discussing how we have created a system where we believe there are doctors who make mistakes and ones who don’t. It’s a fallacy driven by ideology and lawyers. Our industry is very much like this. We NEED to talk about our mistakes. Talk about founding a startup and the founders who share their failure stories and WHY. Talk about fraud management and how they do the same. Security NEEDS to. We need to share more about our failures and what led to those outcomes. This can raise the bar of the entire industry, and it’s completely silly to think there are orgs out there not making security mistakes. Security is NOT binary; there are many shades of gray. The most common question: “Are we secure?”