Michael Roytman's CyberTech EU presentation, delivered in October 2023, includes vulnerability data from 660 Cisco Vulnerability Management customers. For a deeper dive, see the Prioritization to Prediction reports: https://www.cyentia.com/prioritization-to-prediction-v9/.
All of the data is aggregated from Cisco VM (Kenna) customers or drawn from telemetry from Cisco, AlienVault, ReversingLabs, and other sources.
3. Remediation capacity
Companies are closing about 15% of their vulnerabilities every month (the typical range is 5%-20%).
[Chart: average monthly closed vulnerabilities (10 to 10M) vs. average monthly observed vulnerabilities (1K to 100M), log-log scale]
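One way to make the capacity figure concrete: under a constant monthly close rate, an organization's open backlog converges to inflow divided by close rate. A minimal sketch (the model and all numbers are illustrative assumptions, not the presentation's data):

```python
# Steady-state backlog under a fixed monthly close rate.
# Illustrative numbers only; not from the presentation's data.

def steady_state_backlog(monthly_new: float, close_rate: float) -> float:
    """Backlog B at which closures (close_rate * B) equal arrivals."""
    return monthly_new / close_rate

def simulate_backlog(monthly_new: float, close_rate: float, months: int) -> float:
    """Iterate month by month: close a fraction of the backlog, add new vulns."""
    backlog = 0.0
    for _ in range(months):
        backlog = backlog * (1 - close_rate) + monthly_new
    return backlog

equilibrium = steady_state_backlog(10_000, 0.15)  # ~66,667 open vulns
simulated = simulate_backlog(10_000, 0.15, 120)   # converges to the same value
print(round(equilibrium), round(simulated))
```

The takeaway of the sketch: at a 15% close rate, the backlog stabilizes at roughly 6.7x the monthly inflow; raising the close rate shrinks that multiple proportionally.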
4. The 1% that matters
• 1.2% of CVEs have both published and observed exploits
• 0.6% of CVEs have only exploits executed in the wild
• 21.2% of CVEs have only a publicly released exploit
• 77% of CVEs have no published or observed exploit
Source: Kenna/Cyentia
5. Positive predictive value of remediating a vulnerability with property X
[Chart: breach probability (%), ranging 0-35, for vulnerabilities scored CVSS 10, listed in Exploit DB (EDB), in Metasploit (MSP), or both (EDB+MSP)]
6. Variable importance (SHAP)
EPSS variable importance: top 30 contributing variables in descending order; scores represent a mean absolute contribution (roughly 0.00-0.20):
Tag: code execution
Exploit: Exploit DB
CVE: Count of References
Vendor: Microsoft
Exploit: Metasploit
Tag: Remote
CVSS: 3.1/PR:N
Exploit: GitHub
CVE: Age of CVE
Tag: SQLi
CVSS: 3.1/Scored
CVSS: 3.1/AV:N
Tag: XSS
Vendor: Adobe
CVSS: 3.1/AV:L
Tag: Denial of Service
Vendor: Apache
CVSS: 3.1/UI:N
Tag: Command Injection
Vendor: HP
Vendor: Apple
Tag: Local
Scanner: jaeles
Tag: Crafted Web
CVSS: 3.1/PR:L
CVSS: 3.1/C:H
Vendor: ISC
Tag: Memory Corruption
Tag: Web
Vendor: Cat
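The ranking above is a mean-absolute-SHAP bar chart. As a minimal sketch of how such a ranking is computed (the SHAP matrix and feature names below are synthetic stand-ins, not EPSS data):

```python
import numpy as np

# Rank features by mean absolute SHAP contribution.
# The matrix and feature names are synthetic, for illustration only.
rng = np.random.default_rng(0)
features = ["tag_code_execution", "exploit_db", "ref_count", "vendor_microsoft"]
shap_values = rng.normal(size=(1_000, len(features)))  # rows: samples, cols: features
shap_values[:, 0] *= 3.0  # inflate one feature so the ranking is visible

importance = np.abs(shap_values).mean(axis=0)  # mean |contribution| per feature
for idx in np.argsort(importance)[::-1]:
    print(f"{features[idx]:20s} {importance[idx]:.3f}")
```

Taking the mean of absolute values (rather than the raw mean) is what keeps positive and negative contributions from cancelling out, which is why it is the standard summary for SHAP bar charts.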
7. What is your VM program’s coverage?
Coverage: of the known exploits/exploitations out there, how many does your strategy remediate?
[Chart: remediation coverage and efficiency metrics across 110 Kenna customers; coverage 0-100%, efficiency 0-60%; most customers cluster around 75-80% coverage]
8. How efficient is your VM program?
Efficiency: you fixed 10 vulnerabilities. What percentage of those are ones that actually pose a risk to your organization?
[Chart: remediation coverage and efficiency metrics across 110 Kenna customers; coverage 0-100%, efficiency 0-60%; on average, 40% of remediations are efficient choices]
9. Remediation rate
• 45% of vulnerabilities are remediated in the first month
• Almost two thirds of vulnerabilities are remediated in the first three months
• Just under 20% of vulnerabilities are still open after a year
[Chart: percentage of vulnerabilities remediated (0-100%) vs. time from discovery (3 mos to 1 year)]
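A quick sanity check on these numbers: if remediation were a constant-hazard process calibrated to the 45% first-month figure, almost nothing would remain open at one year, so the reported ~20% still open at twelve months implies remediation slows sharply after the first few months. A small sketch (the constant-hazard model is an assumption introduced for contrast, not the deck's model):

```python
# Constant monthly remediation hazard, calibrated to 45% in month one.
MONTHLY_RATE = 0.45

def still_open(months: int, rate: float = MONTHLY_RATE) -> float:
    """Fraction of vulnerabilities still open after `months`, constant hazard."""
    return (1 - rate) ** months

print(f"{still_open(3):.1%}")   # ~16.6% still open at 3 months
print(f"{still_open(12):.2%}")  # a small fraction of 1% at one year
# The deck reports just under 20% still open after a year, far above
# this constant-hazard prediction: the real curve has a heavy tail.
```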
10. Remediation by category of asset
[Chart: survival curves showing probability of vulnerability remediation (0-100%) over time, from 3 months to 1 year, for four asset categories: Mac OS X, Microsoft platforms, Linux/Unix, and appliances/devices. Half-lives range from 36 to 369 days; one-year remediation rates range from 50% to 86%]
11. Remediation on Microsoft platforms
[Chart: survival curves showing probability of vulnerability remediation (0-100%) over time, from 3 months to 1 year, for Windows 2000, Windows XP, Windows Vista, Windows 7, Windows 8.1, Windows 10, and 2003/2008/2012/2016 Server, grouped as supported, unsupported, and newly unsupported]
12. “High-risk” capacity
• 16% of orgs are maintaining
• 33% of orgs are falling behind
• 51% of orgs are reducing their high-risk vulnerabilities
[Chart: proportion of firms by average monthly change in high-risk vulnerabilities, from a 20% increase to a 20% decrease]
14. The goal of Infosec is to prevent breaches
ESG study:
• 38% of orgs had trouble filtering noisy alerts
• 37% had trouble accommodating security telemetry volumes
• 34% struggled to build a useful data stream/pipeline
15. Most incidents don’t matter
• Computer data breach: 76% of incidents had no loss; 97.5% < $440K
• Ransomware: 90% of incidents had no loss; 98.3% < $300K
• Business email compromise: 42% had no loss
[Chart: loss by incident type, each dot representing 0.5% of incidents, on a dollar scale from $1 to $1,000,000+. Computer data breach (n=2,781): 76% of incidents had no loss; dots represent the remaining 24%, with marked values from $148 to $1,594,648. Ransomware (n=2,475): 90% of incidents had no loss; dots represent the remaining 10%, with marked values from $69 to $1,155,775]
16. Distribution of breach losses on a log scale
[Chart, top panel: distribution of breach losses on a linear scale (truncated at $50M); number of events (0-1,250) vs. total losses up to $40M. Events with less than $1M loss dominate this naïve view; there are 188 events with losses over $10M that are impossible to see in this view]
All this whitespace has a purpose. Plotting losses on a linear scale like this causes minor events to drown out the rare major events that are a key concern to risk managers and enterprise directors. Don’t lose the forest for the trees!
[Chart, bottom panel: density of total losses on a log scale, $100 to $10B. The losses of over $10M are now much more visible]
By viewing breach losses on a log scale, a clear pattern emerges that makes statistical modelling much easier.
17. Distribution of cyber event losses on a log scale
[Chart: density of total losses on a log scale, $100 to $10B. Median loss: $196K. Events with losses over $20M: 8% of all losses are in this region]
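Heavy-tailed loss data like this is often modelled as roughly lognormal, which is why the log-scale view exposes structure. A sketch with synthetic data (the lognormal parameters are invented for illustration, not fitted to the figures above):

```python
import numpy as np

# Synthetic lognormal losses; parameters are invented,
# not fitted to the presentation's data.
rng = np.random.default_rng(1)
losses = rng.lognormal(mean=12.0, sigma=2.5, size=100_000)  # dollars

median = np.median(losses)
big = losses > 20e6
event_share = big.mean()                         # share of events over $20M
dollar_share = losses[big].sum() / losses.sum()  # share of total dollars

print(f"median loss: ${median:,.0f}")
print(f"events over $20M: {event_share:.2%} of events, {dollar_share:.0%} of dollars")
```

The pattern the sketch reproduces is the one the slide emphasizes: a small share of events carries a disproportionate share of total dollars, and the median sits far below the mean.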
Ed
If I fixed 2 vulnerabilities this year and both of them were really important, then I'm 100% efficient. But if I fixed only 2 vulnerabilities in an enterprise that has 3 million vulnerabilities, I haven’t actually made a dent in my overall risk.
Coverage: “Of the known exploits and exploitations that are out there, how many does your strategy remediate?”
It's really easy to have 100% coverage. You just remediate every vulnerability. Really expensive, very inefficient.
There's a trade-off between efficiency and coverage. You want to remain as efficient as possible while increasing your coverage.
Most of our customers (Global 2000) are around the 70-80% mark, and some of the smaller customers are further out on the chart.
Note: 70-80% of risk reduction is really hard to achieve.
This is the traditional VM problem: A bunch of noise coming in, and you have to figure out an efficient strategy that gets you to the risk tolerance that you want.
Objective of RBVM: cover as much as possible at the most efficient cost.
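Coverage and efficiency as defined here map directly onto recall and precision over sets of CVEs. A minimal sketch (the CVE sets below are made up for illustration):

```python
# Coverage (recall) and efficiency (precision) over sets of CVEs.
# The sets below are made up for illustration.

def coverage(remediated: set, high_risk: set) -> float:
    """Of the exploited/exploitable CVEs, what fraction did we remediate?"""
    return len(remediated & high_risk) / len(high_risk)

def efficiency(remediated: set, high_risk: set) -> float:
    """Of the CVEs we remediated, what fraction actually posed risk?"""
    return len(remediated & high_risk) / len(remediated)

high_risk = {"CVE-1", "CVE-2", "CVE-3", "CVE-4"}
remediated = {"CVE-1", "CVE-2", "CVE-9", "CVE-10", "CVE-11"}

print(coverage(remediated, high_risk))    # 0.5
print(efficiency(remediated, high_risk))  # 0.4
```

Remediating everything drives coverage to 100% while efficiency collapses, which is the trade-off described above.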
Ed
About the visual: Benchmarking study feat. 110 Kenna customers; measured efficiency and coverage of their VM programs
Efficiency: “If you remediate some subset of vulnerabilities, what percentage of those are ones that actually pose a risk to your organization (had an exploit or a successful exploitation)?”
40% of the vulnerabilities that they fix are efficient choices, or vulnerabilities that do actually pose a risk to their organization.
Raise your efficiency from 40% to 60%, and you've saved 20% of the time spent on assessment, remediation, working with IT teams, etc.
Note: 40% as an average is pretty high. CVSS usually puts folks around the 20-22% mark.
Ed
These are the individual survival curves for vulnerabilities on the four categories of assets. Compare the half-lives (time to 50% closed) and/or the percentage remediated at one year.
Note: the dotted lines are end-of-life systems and the dashed lines are newly unsupported systems (as of January 2020, when we gathered our data). Clearly, older systems lift the remediation curve for Microsoft, and newer (supported) systems are remediated much faster.