This document summarizes the findings of Veracode's 10th annual State of Software Security report, based on scan data from more than 85,000 applications assessed between April 2018 and March 2019. Key findings: (1) most security flaws are eventually fixed, but fix times have held steady; (2) the likelihood that a flaw gets fixed declines the longer it stays open, producing growing "security debt"; (3) frequent scanning correlates with faster fix times and lower security debt, pointing to the benefits of DevSecOps practices. The data suggests that security-testing automation still lags DevOps adoption and that developers prioritize recent flaws over severe ones.
3. What is this research?
• Veracode State of Software Security (SoSS), Volume 10
• Largest quantitative study of application security findings
• Partnered with data scientists at the Cyentia Institute
4. Why and how?
• Why
– Gain insight into industry performance and the impact of DevSecOps on fix rates
– Provide data for customers to benchmark themselves against their peers
– Generate actionable advice for improving application security programs
• How
– Formulate questions that might be answerable given the available data
– Stand back and use science
5. Data sources
• Over 2,300 Veracode customers
• 12 months of application scan data
– April 1, 2018 – March 31, 2019
– Over 85,000 unique applications and 1.4 million individual assessments
8. Biases in any study of vulnerability data
• Selection bias
– Example: who our customers are, which applications they chose to analyze, etc.
• Type I (false positive) and Type II (false negative) experimental error
– Flaws reported that are not real, and real flaws that go undetected
• Capabilities bias (for lack of a better term)
– Example: we choose which flaw categories to scan for and how deeply, which frameworks/languages to support, etc.
9. Biases in interpretation (yours and ours)
• Put differently: drawing conclusions that may be convenient but incorrect
• Unavailable metadata
– Example: some stats aggregate modern and legacy dev practices because we don’t have data on an application’s age or maturity
• Attribution bias
– Example: the inclination to “blame” outcomes on things that seem relevant (e.g. developer skill) rather than other situational factors (e.g. release deadlines)
12. Survival analysis
• Studies duration / time to event
• Also known as time-to-event analysis, duration analysis, or reliability analysis
• Accounts for censored data (observation ends without the event being recorded), e.g.:
– Team stopped scanning an application
– Finding still open at the time the data was collected
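
Not from the deck: a minimal sketch of the idea, assuming Python's lifelines library and invented data. Each finding contributes a duration (days under observation) and an event flag; findings still open when observation ends are treated as censored rather than dropped.

    # Minimal sketch (illustrative data, not Veracode's actual analysis):
    # Kaplan-Meier estimate of the share of flaws still open after t days,
    # keeping unfixed findings as censored observations instead of discarding them.
    from lifelines import KaplanMeierFitter

    durations = [10, 45, 90, 30, 365, 200, 15, 120]  # days each finding was observed
    fixed     = [ 1,  1,  0,  1,   0,   1,  1,   0]  # 1 = fix recorded, 0 = censored

    kmf = KaplanMeierFitter()
    kmf.fit(durations, event_observed=fixed, label="open flaws")

    print(kmf.median_survival_time_)  # time by which half the flaws are fixed
    print(kmf.survival_function_)     # S(t): fraction of flaws still open at t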
25. What DevOps is and is not
• We use scan activity as an indicator that an organization may be following DevOps practices (see the sketch below)
• DevOps is not just automation (it is also culture and process), but automation is easier to measure
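
A hypothetical sketch of that indicator, using pandas; the thresholds and data here are illustrative, not Veracode's. Apps whose scans arrive on a tight, regular cadence look pipeline-driven; long or irregular gaps suggest ad hoc, manual testing.

    # Hypothetical sketch: bucket applications by the median gap between scans.
    import pandas as pd

    scans = pd.DataFrame({
        "app_id": ["A", "A", "A", "B", "B"],
        "scan_date": pd.to_datetime(["2019-01-01", "2019-01-02", "2019-01-03",
                                     "2019-01-01", "2019-03-01"]),
    })

    def cadence(dates):
        # Median gap in days between consecutive scans of one application.
        gap = dates.sort_values().diff().dt.days.median()
        if gap <= 1:
            return "daily"
        if gap <= 7:
            return "weekly"
        return "sporadic"

    print(scans.groupby("app_id")["scan_date"].apply(cadence))
    # A       daily
    # B       sporadic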
32. Our data suggests
• Security automation (as measured by scan frequency) continues to significantly lag the widespread and accelerating adoption of DevOps
• Developers do not prioritize fixes in a security-appropriate manner; recency appears to outweigh every other factor, including severity
• Incorporating daily application testing improves median time to remediation (MedianTTR) by about 3x relative to weekly testing (illustrated in the toy example below)
• Steady testing facilitates chipping away at security debt, while bursty testing allows security debt to balloon
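
To make the MedianTTR comparison concrete, a toy example with invented numbers shaped to echo the daily-vs-weekly finding; the real figures come from the report's survival analysis, not a calculation like this.

    # Illustrative only: median time to remediation (MedianTTR) by scan cadence.
    import pandas as pd

    findings = pd.DataFrame({
        "cadence": ["daily"] * 4 + ["weekly"] * 4,
        "days_to_fix": [5, 8, 12, 20,      # findings in apps scanned daily
                        20, 28, 32, 50],   # findings in apps scanned weekly
    })

    print(findings.groupby("cadence")["days_to_fix"].median())
    # daily     10.0  -> flaws close ~3x faster than the weekly group's 30.0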