State of Software Security - Enterprise Testing of Software Supply Chain

Download your copy at - http://goo.gl/ZaHPj


FEATURE SUPPLEMENT

Enterprise Testing of the Software Supply Chain

Feature Supplement of Veracode's State of Software Security Report

NOVEMBER 2012
Table of Contents

Introduction
Executive Summary
Enterprise Risk
Vendor Supplied Software Assessments
Security of Vendor Supplied Applications
State of Enterprise Testing Programs
Appendix

List of Figures

Figure 1: PwC Survey, Security Incidents Attributed to Customers, Partners and Suppliers
Figure 2: Distribution of Enterprises Requesting Assessments by Industry
Figure 3: Percentage of Enterprises Requesting Assessments from Software Vendors
Figure 4: Quocirca Survey, Frequency of Software Security Inquiries by Enterprise Customers and Auditors
Figure 5: Distribution of Vendor Supplied Applications by Business Criticality
Figure 6: Distribution of Vendor Supplied Applications by Application Purpose
Figure 7: Quarterly Trend of Number of Vendor Supplied Applications Assessed
Figure 8: Top Vulnerability Categories, Percentage of Affected Vendor Supplied Web Application Builds
Figure 9: Top Vulnerability Categories, Percentage of Affected Vendor Supplied Non-Web Application Builds
Figure 10: Distribution of Applications by Language and Platform
Figure 11: Compliance with Policies on First Submission
Figure 12: Compliance with Enterprise Policy by Application Purpose on First Submission
Figure 13: Number of Applications Assessed by an Enterprise Program
Figure 14: Number of Vendors Participating in an Enterprise Program
Figure 15: Achieving Compliance with Enterprise Policy
Figure 16: Number of Builds Submitted to Achieve Policy Compliance
Figure 17: Time Taken to Achieve Policy Compliance
Figure 18: Number of Builds Submitted for Non-Compliant Applications
Figure 19: Time Spent Out of Compliance

List of Tables

Table 1: Vulnerability Distribution on First Submission by Language
Table 2: Predefined Policy Requirements for Veracode Levels
Introduction

Veracode has been publishing State of Software Security (SOSS)1 reports since 2010. This year we began investigating our dataset from perspectives that are not routinely covered in our traditional SOSS reports, which allows us to extend our analysis to a variety of topical areas. The first feature supplement was published in April and focused on the vulnerabilities in software applications used by publicly traded companies.2 This is our second feature supplement for 2012.

The focus of this report is the state of enterprise programs that assess the security of software purchased from vendors (where an enterprise is defined as a company with over $500 million in annual revenue). Security experts have long advised enterprises to incorporate application security testing into their software procurement or vendor management activities. Yet only recently has the idea that software vulnerabilities contribute to IT supply chain risks garnered more media attention and enterprise interest. As a result, enterprises are looking for guidance on establishing application security programs to expose and manage the risks associated with vendor supplied software. While our SOSS reports have always analyzed vulnerability, remediation and compliance data across different supplier types (internally developed, commercial, open source and outsourced), this feature supplement extends our investigation to include:

• Software security testing program metrics (e.g. program participation rates)
• How different program approaches impact vendor compliance with application security policies.

This SOSS feature supplement draws on continuously updated information in Veracode's cloud-based application security services platform. Unlike a survey, the data comes from actual security analysis of web and non-web applications across industry verticals, languages and platforms. The data also represents multiple security testing methodologies (static binary, dynamic and manual) applied to a wide range of application types and programming languages. The resulting intelligence is unique in the breadth and depth it offers.

In order to focus only on enterprises with programs for vendor application security testing, we derived the data on which this report is based from 939 application builds submitted to the Veracode platform during an 18 month period from January 2011 to June 2012.3 Veracode analyzes the vendor submitted applications, attests to the applications' security posture to the requesting enterprise, and provides detailed, prioritized remediation guidance to the software vendor.

We believe this SOSS feature supplement presents some interesting findings and we hope you enjoy reading the report.

1 Previous volumes of State of Software Security reports are available at www.veracode.com/reports
2 Study of Software Related Cybersecurity Risks in Public Companies, Veracode, April 2012 (info.veracode.com/state-of-software-security-volume-4-supplement.html)
3 It should be noted that in any study of this size, sampling issues arise because of the nature of the way the data was collected. For example, it is important to remember that all applications in this study came from organizations that were motivated enough about application security to engage Veracode for an independent application security assessment. Care has been taken to only present comparisons where a statistically significant sample size was present.
Executive Summary

The House Select Committee on Intelligence recently recommended that U.S. companies refrain from purchasing telecommunications equipment from Chinese manufacturers. Among many reasons for the recommendation, the 60 page report4 cited:

• "(T)he threat posed to U.S. national-security interests by vulnerabilities in the telecommunications supply chain."
• "Vendors financing their own security evaluations create conflicts of interest that lead to skepticism about the independence and rigor of the result."

These statements have ignited a firestorm of discussion and media coverage about software supply chain vulnerabilities. However, the raging debate about Chinese cyber-spying and US protectionism does little to clarify the real issue, which is that enterprises assume too much risk when they implicitly trust their software providers to develop safe software.

This report analyzes the actual state of vendor application security testing programs currently being implemented by our enterprise customers (where an enterprise is defined as a company with over $500 million in annual revenue). In addition to our analysis of enterprise risks, the vendor assessment market and application vulnerabilities, we also examine the state of enterprise assessment programs. Specifically, our analysis of enterprise programs focuses on understanding how different enterprise approaches to implementing their programs impact metrics such as vendor participation, applications assessed, and compliance with application security policies.

Key Findings

Testing vendor applications is a growing trend in many industries.

The volume of vendor supplied application assessments continues to grow, with a 49% increase from the first quarter of 2011 to the second quarter of 2012 (Figure 7). Enterprises in many industries are starting to secure their software supply chains. In fact, Figure 2 shows 51% of the enterprises requesting assessments belong to industries other than Financial Services, Software/IT Services and Technology (i.e. the three industry segments that historically dominated vendor assessment requests in past SOSS reports). However, fewer than one in five of our enterprise customers have requested a code-level security test from at least one vendor (Figure 3). The low percentage of companies is an indication that formal vendor software testing programs are still in a relatively early stage of adoption.

4 Investigative Report on the U.S. National Security Issues Posed by Chinese Telecommunications Companies Huawei and ZTE (intelligence.house.gov/press-release/investigative-report-us-national-security-issues-posed-chinese-telecommunications)
Enterprises with a programmatic approach to vendor application security testing have more vendors and applications participating in their programs than enterprises with an ad-hoc approach.

We analyzed the program results experienced by our customers in terms of the number of participating applications (Figure 13) and vendors (Figure 14). We found that enterprises fell into two distinct groups, which aligned with their approach to implementing their programs:

• Ad-hoc approach: Enterprises that lacked a protocol for selecting applications for testing and appeared to request application security testing from vendors on a case by case basis. Additionally, the enterprise requestor often had limited business, contractual and technical details to respond to specific vendor questions and concerns, and thus appeared to lack a strong mandate from their business and procurement teams. As a result this group had fairly low numbers of vendors and applications participating in their programs (averaging 7 applications and 4 vendors).

• Programmatic approach: Enterprises that developed a formal protocol for selecting and requesting vendor applications for testing. Programmatic application testing programs tended to leverage best practices for defining an overall program, such as ensuring collaboration between security, business and procurement teams; specifying business, contractual and technical details as part of the policy; and providing a strong mandate for vendor application testing. This group enjoyed much higher participation levels (averaging 71 applications and 38 vendors).

Setting a less rigorous compliance policy that vendors perceive as achievable encourages higher vendor participation.

Figure 17 shows that in enterprises with a programmatic approach, 45% of vendor applications become compliant within one week, whereas only 28% of applications are compliant within one week in ad-hoc programs. Additionally, most of those applications achieved compliance upon first submission (Figure 16). These results indicate some enterprises with a programmatic approach chose to design policies that enable a significant portion of vendors to achieve compliance with relative ease. In other words, obtaining initial visibility into the state of vendor software security is more important for these enterprises than demanding compliance with a tough security policy.

Vendors are more successful at complying with application security policies defined by the enterprise than they are at meeting industry standards.

38% of vendor supplied applications complied with enterprise-defined policies (Figure 11). Vendors struggle to meet the more stringent requirements of policies guided by industry standards: only 10% of applications comply with the OWASP Top 10 and 30% with the CWE/SANS Top 25 (Figure 11). The results show that secure development practices built into the software development lifecycle (SDLC) are still not as widespread as they should be. However, the vendors that do strive to comply with industry standards are well positioned to comply with any enterprise-defined security policy.
With 62% of applications failing to reach compliance on first submission, procedures for managing non-compliant applications are an important aspect of an enterprise's security policy.

Figures 18 and 19 delve deeper into the group of non-compliant applications. Figure 18 shows large percentages of non-compliant applications with only one build submitted: 39% for enterprises with a programmatic approach and 57% for enterprises with an ad-hoc approach. Figure 18 also shows that 11% of applications remain out of compliance with enterprise policies in spite of vendors submitting new builds for testing (resubmission is clear evidence of vendor remediation efforts). Enterprises with a programmatic approach had only 20% of applications out of compliance for more than 24 weeks, compared with 39% of applications participating in ad-hoc programs (Figure 19). The results suggest that enterprises with a programmatic approach may do a better job ushering their vendors through the process than enterprises with an ad-hoc approach.

Recommendations

Clearly it is no longer acceptable to ignore the risk of security vulnerabilities in vendor supplied software. Enforcement of enterprise security policies conducted solely through vendor surveys is no longer sufficient to manage the vulnerabilities entering the enterprise through vendor software. Enterprises should consider several factors when developing their security policies for vendor applications, including vulnerability prevalence, the severity of software flaws and the business criticality of the applications. Other factors governing the vendor relationship are equally important, such as escalation procedures, product release timelines and mitigation acceptance processes. Enterprises that want to accelerate vendor participation in the early stages of their programs should not design security policies that expect perfection from vendor supplied software.

Successful management of vendor application security testing programs, like any other enterprise effort, depends on technology to automate specific tasks, processes to ensure broad adoption, and strong leaders who can drive the necessary collaboration between security, business and procurement teams. With those three elements in place, the enterprise can more effectively wield its purchasing leverage to drive early success in terms of vendor participation. However, enterprises should also demonstrate that they are serious about seeing significant application security improvements from vendors, documented through test results. Over time, enterprises should strengthen their vendor acceptance policies to reflect industry standard levels.

Specifically, enterprises should take the following steps to begin the process of creating a vendor application security testing program (where one does not exist) or refining existing programs to maximize vendor participation and compliance:

Step 1: Policy Definition
• Identify the business goals of application analysis.
• Determine the security testing types and products/services to be used.
• Document the analysis timeline and frequency of testing.
• Document vulnerability remediation expectations.
• Define an exception and escalation process for uncooperative vendors.
Step 2: Requirements Mandate to Vendors
• Introduce the security analysis mandate to all vendors.
• State the reason behind the mandate and the goals to be achieved.
• Introduce the analysis options to vendors.
• Confirm willingness and ability of the vendor to meet analysis and timeline requirements.

Step 3: Vendor Education and Commitment
• Provide participating vendors with written guidance.
• Address all questions and concerns in an open, responsive manner.
• Target a commitment from vendors to remediate in a timely manner.
• Educate vendors on all aspects of the analysis process, methodologies, IP protection, expectations and timelines.

Step 4: Communication and Execution
• Provide consistent project management (status meetings, reporting, etc.) in order to minimize delays in vendor compliance.
• Drive vendor participation and cooperativeness by providing timely responses and accurate results.
• Acknowledge that vendor participation hinges on assurances to protect their IP.

Step 5: Results Communication
• Accept summary test results and allow vendors to limit disclosure of vulnerability specifics.
• Provide remediation advice and human guidance to vendor teams in order to enable compliance efforts.
• Provide feedback mechanisms for the vendor team to communicate remediation plans, then republish their results.
Enterprise Risk

All enterprises assume some security risk by using applications purchased from software vendors. According to the PricewaterhouseCoopers (PwC) 2012 Global State of Information Security Survey, "Over the past 24 months, the number of security incidents attributed to customers, partners, and suppliers has nearly doubled."5

Figure 1: PwC Survey, Security Incidents Attributed to Customers, Partners and Suppliers
(Bar chart of security incidents attributed to customers and to partners or suppliers for 2009, 2010 and 2011. Source: PwC 2012 Global State of Information Security Survey, Question 22: "Estimated likely source of incident." Not all factors shown; totals do not add up to 100%.)

5 Third Party Risk Management, PricewaterhouseCoopers LLP, April 2012 (isaca-centralohio.org/archive/presentations/2012_04_PWC_TPRM.pdf)
When one envisions the complexity of the enterprise software supply chain, it is clear why this is the current state of affairs. Enterprises rely on an incredibly large software portfolio to conduct business. Some of Veracode's largest customers admit to purchasing business applications from well over 20,000 individual software vendors, ranging from the largest software vendors in the world to two-person companies. Most applications and software acquired by enterprises from their vendors are insecure, as our past reporting has shown. SOSS Volume 4 highlighted that only 38% of commercial non-web applications complied with the CWE/SANS Top 25, while only 14% of commercial web applications complied with the OWASP Top 10.

The same PwC study referenced above also showed that enterprises are starting to respond to these risks, but only 23.6% of respondents stated they have security procedures with which partners and suppliers must comply (and code-level application assessments may or may not have been included in the required procedures). The key barrier to adoption is the ability to enforce those security policies with actionable testing and remediation plans. Historically, security testing of vendor supplied applications has been limited to manual penetration testing by consultants or source code analysis tools used by internal teams (in the rare cases when a vendor supplies source code), which may cover only a fraction of the vendor software used by an enterprise. Enforcement of enterprise security policies is often conducted through vendor surveys, which is akin to trusting the vendor to attest to the security of their own code.

Additionally, the lack of efficient methods for testing the security of software obtained through mergers, acquisitions and procurement processes leaves a significant gap in the enterprise's ability to manage the business risks of the software supply chain. To fill this gap and obtain insight into the security posture of applications, regardless of their origin, more and more enterprises are relying on independent software testing programs, like the one Veracode launched in 2009. While vendors may not be pleased with discovering security issues in their software products, they do take actions to improve security in response to enterprise concerns.6

6 Security Manager's Journal: Security has to extend to your customers, by Mathias Thurman, Computerworld, October 22, 2012 (www.computerworld.com/s/article/9232556/Security_Manager_s_Journal_Security_has_to_extend_to_your_customers?taxonomyId=208)
Vendor Supplied Software Assessments

In this section we examine the assessment market to understand the organizational dynamics at play within the enterprise and the types of applications being assessed. While we acknowledge the Veracode customer base only represents a fraction of all organizations that depend on software supply chains, the data still provides some directional guidance for the market as a whole.

Enterprises with Assessment Programs

We begin by examining the distribution of enterprises with assessment programs by industry segment (Figure 2). The Financial Services, Software/IT Services and Technology industries account for 49% of enterprises that have requested application security assessments from their vendors. The remaining 51% of enterprises come from a broad spectrum of industries. The variety of industries represented here is a significant change from past volumes of SOSS. For example, in Volume 4 we reported that 93% of the enterprises were in either the Software/IT Services industry or the Financial Services industry.

Distribution of Enterprises Requesting Assessments by Industry
24% Other
21% Financial Services
14% Software and IT Services
14% Technology
6% Telecommunications
5% Healthcare
3% Business Services
3% Entertainment and Media
3% Government
2% Aerospace and Defense
2% Education
2% Utilities and Energy

Figure 2: Distribution of Enterprises Requesting Assessments by Industry
Figure 3 shows the percentage of enterprises (defined as companies with over $500 million in annual revenue) that have requested an application security assessment from at least one vendor. Figure 3 indicates that only 16% of enterprises (fewer than one in five) have begun requesting application security assessments from their vendors. This percentage is consistent with the statistics reported in the April 2012 SOSS supplement analysis of public companies.7 It is evident that formal vendor application security assessment programs are still in a relatively early stage of market adoption.

Percentage of Enterprises Requesting Assessments from Software Vendors
84% Tested no vendor supplied applications
16% Tested at least one vendor supplied application

Figure 3: Percentage of Enterprises Requesting Assessments from Software Vendors

This low percentage of assessment requests should be of concern when one considers that enterprises need to be proactive about application security, not only for their own benefit, but to assure customers and auditors as well. A survey of 100 US and UK enterprises conducted by Quocirca8 to assess the scale of the software security problem provides some insight. Figure 4 is based on Quocirca's research and is not part of Veracode's dataset. Figure 4 shows that an enterprise's customers and auditors often seek guarantees about the applications that underpin an enterprise's business processes. 82% of enterprises surveyed indicated that they get some level of inquiry about software security from auditors; 45% say it is a requirement. Similarly, 61% of enterprises surveyed indicated that they get some level of inquiry about software security from customers; 34% say it is a requirement. An example of a customer inquiry would be a customer of a payment processing service that must comply with the Payment Card Industry Data Security Standard (PCI-DSS). This customer would ask for guarantees about the security of the applications underpinning the payment processing service.

7 Study of Software Related Cybersecurity Risks in Public Companies, Veracode, April 2012 (info.veracode.com/state-of-software-security-volume-4-supplement.html)
8 "Outsourcing the problem of software security," Quocirca, 2-24-2012 (info.veracode.com/Quocirca_Outsourcing_Software_security.html)
Figure 4: Quocirca Survey, Frequency of Software Security Inquiries by Enterprise Customers and Auditors
(Stacked bar chart showing, for auditors and customers overall and in the USA and UK, whether software security inquiries are a requirement, made but not mandatory, made infrequently, or never made. Source: Quocirca.)
Distribution and Volume of Application Assessments

Next we examine the nature of the vendor supplied applications that are being assessed. Figure 5 shows that enterprises focus their attention on business critical applications, with 77% of applications classified as Very High or High business criticality. It is encouraging to see that Medium to Very Low criticality applications make up 23% of the assessed vendor applications, because it demonstrates that some enterprises are beginning to view their entire application portfolio as a source of risk that should be addressed. Data breach incidents are frequently multi-stage attacks, where vulnerabilities in non-critical applications become a launching point for penetration deeper into the enterprise. Proactively managing the security posture of lower criticality applications through policies and independent assessments should enable enterprises to lower the business risk of using these applications.

Distribution of Vendor Supplied Applications by Business Criticality
21% Very High
56% High
19% Medium
4% Low
<1% Very Low

Figure 5: Distribution of Vendor Supplied Applications by Business Criticality

Figure 6 shows that enterprises subject Operations and Financial applications to significant scrutiny. Typically these applications deal with personally identifiable information (PII) such as credit card or other financial information, or they access proprietary information that enables competitive advantage. It stands to reason that an enterprise would want to gauge the security of these vendor-supplied applications before deploying them. What is surprising is the relatively low percentage of security applications that have been tested, since enterprises depend on security applications to manage many other aspects of enterprise risk. As we will see in Figure 12, security applications can be as insecure as other software products.
Distribution of Vendor Supplied Applications by Application Purpose
48% Business and IT Operations
18% Financial Services
17% Other
9% Customer Support
4% Security
4% Web Infrastructure

Figure 6: Distribution of Vendor Supplied Applications by Application Purpose

On a positive note, the volume of vendor supplied application testing continues to grow, as seen in Figure 7. There has been a 49% increase from the first quarter of 2011 to the second quarter of 2012. The growth comes from enterprises in a wider range of industries beginning their programs (Figure 2) and from enterprises expanding the reach of their existing programs (Figure 13). One possible reason for the rapid growth is the realization that security assessments can be successfully implemented and seamlessly incorporated into procurement processes. Additionally, by leveraging several best practices compiled by analysts and early practitioners, enterprises can expect increased vendor participation and reduced risk as those vendors achieve policy compliance. Operations and Financial applications represent 66% of the assessments.

Figure 7: Quarterly Trend of Number of Vendor Supplied Applications Assessed
(Quarterly count of vendor supplied applications assessed, Q1 2011 through Q2 2012, showing an upward trend; p-value = 0.017.)
Security of Vendor Supplied Applications

In this section we analyze the vulnerability categories that affect the widest number of applications, the relative prevalence of prominent vulnerability categories across different languages, and application compliance with various security policies. Enterprises should consider both the prevalence and the severity of software flaws when developing their vendor acceptance policies. Enterprises typically use this information to develop their security policies, with the goal of encouraging vendors to remediate the flaws that most significantly impact the enterprise's security posture.

Vulnerability Prevalence

Figures 8 and 9 depict the most prevalent vulnerability categories as defined by the percentage of application builds containing one or more instances of each category. We examine web and non-web applications separately because the vulnerability categories that affect these two groups are very different. When one or more vulnerabilities are found in an application, that application is considered to have been affected by that vulnerability category. Since many applications have been analyzed across multiple builds, we treat each application build as an individual unit to be counted. Therefore if an application has one SQL injection vulnerability in build 1, then that build is counted towards the percentage affected by SQL injection. If no SQL injection vulnerabilities are found in build 2, then build 2 is counted towards the percentage not affected by SQL injection.

The figures also indicate the vulnerability categories that appear on the OWASP Top 10 (2010) list for web applications and the CWE/SANS Top 25 (2011) list for non-web applications. These lists represent broad consensus from security experts around the world who have shared their expertise about the severity of vulnerabilities that occur in web and non-web applications. Vulnerability severity depends on the likelihood of exploitation by malicious actors (i.e. the relative ease of finding the flaws and launching attacks to exploit them) and the potential business impact of the attacks exploiting the flaw. For example, SQL injection is often reported as the primary attack vector by organizations that investigate and track breaches. The 2012 Verizon Data Breach Investigations Report9 showed that when large organizations were breached using a hacking technique, SQL injection was used 14% of the time. This fact points to both the ease of identifying and launching attacks that exploit SQL injection vulnerabilities and the fact that those attacks often result in the exposure of large amounts of sensitive customer data.

9 2012 Data Breach Investigations Report, Verizon
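To make the build-counting rule described above concrete, the following is a minimal sketch (not Veracode's actual implementation; the BuildScan type and field names are hypothetical) of how per-category prevalence could be computed over a set of scan results: each build counts once toward every category in which its scan found at least one flaw.

import java.util.*;

// Minimal sketch of the prevalence metric described above: a build is
// "affected" by a category if its scan found one or more flaws in that
// category. Names (BuildScan, flawCategories) are illustrative only.
public class PrevalenceSketch {

    record BuildScan(String applicationId, int buildNumber, Set<String> flawCategories) {}

    static Map<String, Double> percentOfBuildsAffected(List<BuildScan> builds) {
        Map<String, Integer> affected = new HashMap<>();
        for (BuildScan build : builds) {
            for (String category : build.flawCategories()) {
                affected.merge(category, 1, Integer::sum);   // count each build once per category
            }
        }
        Map<String, Double> result = new HashMap<>();
        affected.forEach((category, count) ->
                result.put(category, 100.0 * count / builds.size()));
        return result;
    }

    public static void main(String[] args) {
        List<BuildScan> builds = List.of(
                new BuildScan("app-1", 1, Set.of("SQL Injection", "XSS")),
                new BuildScan("app-1", 2, Set.of("XSS")),            // SQL injection remediated in build 2
                new BuildScan("app-2", 1, Set.of("Information Leakage")));
        // app-1 build 1 counts toward SQL injection; app-1 build 2 does not.
        System.out.println(percentOfBuildsAffected(builds));
    }
}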
For web applications, Figure 8 shows that four of the top five flaw categories are also among the OWASP Top 10. Information leakage is the most prevalent vulnerability category, affecting 79% of vendor-supplied web application builds. Cross-site scripting (XSS) is next at 71%, followed by cryptographic issues and directory traversal, tied at 67% of application builds affected. SQL injection, a perennial favorite of attackers, is in eighth place, affecting 40% of vendor supplied web application builds. These percentages are higher than those reported across all web applications in Volume 4. For example, SOSS Volume 4 reported XSS at 68%, information leakage at 66%, directory traversal at 49% and SQL injection at 32%. The upcoming SOSS Volume 5 report will also show little variation from the web application percentages reported in Volume 4. The higher percentages seen in Figure 8 appear to imply that web application suppliers still have a long way to go in developing secure applications.

Top Vulnerability Categories: Percentage of Affected Vendor Supplied Web Application Builds
(The figure indicates which categories are in the OWASP Top 10.)
Information Leakage 79%
Cross-site Scripting (XSS) 71%
Cryptographic Issues 67%
Directory Traversal 67%
CRLF Injection 63%
Time and State 51%
Insufficient Input Validation 48%
SQL Injection 40%
API Abuse 35%
Credentials Management 34%
Encapsulation 23%
OS Command Injection 21%
Session Fixation 19%
Race Conditions 18%
Error Handling 11%

Figure 8: Top Vulnerability Categories, Percentage of Affected Vendor Supplied Web Application Builds
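As an illustration of why SQL injection remains so easy to find and exploit, the sketch below contrasts a query built by string concatenation with a parameterized query. It uses standard JDBC; the table and column names are hypothetical and the example is illustrative only, not drawn from any assessed application.

import java.sql.*;

// Illustrative only: the classic SQL injection pattern behind the 40% figure
// above, and the standard JDBC fix. Table and column names are hypothetical.
public class OrderLookup {

    // Vulnerable: attacker-controlled input is concatenated into the SQL text.
    // An input such as  ' OR '1'='1  returns every row in the table.
    static ResultSet findOrdersUnsafe(Connection conn, String customerId) throws SQLException {
        String sql = "SELECT * FROM orders WHERE customer_id = '" + customerId + "'";
        return conn.createStatement().executeQuery(sql);
    }

    // Safer: a parameterized query keeps the SQL structure fixed and sends the
    // input as data, so it cannot change the meaning of the statement.
    static ResultSet findOrdersSafe(Connection conn, String customerId) throws SQLException {
        PreparedStatement stmt =
                conn.prepareStatement("SELECT * FROM orders WHERE customer_id = ?");
        stmt.setString(1, customerId);
        return stmt.executeQuery();
    }
}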
For non-web applications, Figure 9 shows that five of the top six vulnerability categories are also among the CWE/SANS Top 25 most dangerous flaws. The most prevalent vulnerability category is cryptographic issues, affecting 62% of vendor-supplied non-web application builds (Figure 9). Cryptographic issues are present on both the OWASP and CWE/SANS standards. One reason cryptographic issues are a concern is that enterprises depend on encryption to protect data if a network breach has occurred. For example, algorithms that incorporate insufficient sources of entropy may be susceptible to attack. Alternatively, attackers may use hard-coded cryptographic keys to decrypt confidential information.

Error handling and directory traversal are next on our list, at 58% and 57% respectively. These percentages are also higher than those reported across all non-web applications in Volume 4 and those we will report in our upcoming Volume 5. Volume 4 reported cryptographic issues at 46%, error handling at 24% and directory traversal at 34%. Our upcoming Volume 5 report will show cryptographic issues at 47%, error handling at 23% and directory traversal at 37%. The higher percentages seen in Figure 9 demonstrate the need for vendors to continue to work towards developing more secure software.

Top Vulnerability Categories: Percentage of Affected Vendor Supplied Non-Web Application Builds
(The figure indicates which categories are in the CWE/SANS Top 25.)
Cryptographic Issues 62%
Error Handling 58%
Directory Traversal 57%
Numeric Errors 43%
Buffer Management Errors 42%
Buffer Overflow 41%
Time and State 35%
Dangerous Functions 27%
Untrusted Search Path 26%
OS Command Injection 24%
Information Leakage 23%
Credentials Management 15%
Race Conditions 15%
CRLF Injection 14%
Insufficient Input Validation 14%
Format String 12%
SQL Injection 12%

Figure 9: Top Vulnerability Categories, Percentage of Affected Vendor Supplied Non-Web Application Builds
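To make the two cryptographic weaknesses mentioned above concrete (hard-coded keys and insufficient entropy), here is a minimal Java sketch contrasting them with the conventional JCA approach. It is illustrative only and not a statement about any vendor's code; the class and method names are invented for the example.

import java.security.SecureRandom;
import java.util.Random;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.SecretKeySpec;

// Illustrative sketch of the two cryptographic issues called out above.
public class KeyHandling {

    // Weak pattern 1: a hard-coded key ships inside the binary, so anyone who
    // can read the application can decrypt whatever the key protects.
    static final byte[] HARD_CODED_KEY = "0123456789abcdef".getBytes();

    // Weak pattern 2: java.util.Random is predictable; keys or IVs derived
    // from it have far less entropy than their length suggests.
    static byte[] weakKeyMaterial() {
        byte[] key = new byte[16];
        new Random(System.currentTimeMillis()).nextBytes(key);
        return key;
    }

    // Conventional approach: generate keys with a cryptographically strong
    // source of randomness and keep them out of the code base entirely.
    static SecretKey strongAesKey() throws Exception {
        KeyGenerator generator = KeyGenerator.getInstance("AES");
        generator.init(256, new SecureRandom());
        return generator.generateKey();
    }

    static SecretKey wrapHardCoded() {
        return new SecretKeySpec(HARD_CODED_KEY, "AES"); // typically flagged as a cryptographic issue
    }
}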
Language and Platform Analysis

Figure 10 shows that web-centric languages such as Java, .NET and PHP represent 70% of the applications assessed. C/C++ accounts for 25% of vendor applications, indicating that commercial software continues to rely on C/C++ code. At first glance, the relatively large percentage of applications developed on iOS may appear surprising, since one would expect that vendors supplying mobile applications for enterprise app stores would offer both Android and iOS versions and enterprises would have security concerns about both platforms. Veracode's partnership with Good Technology to test iOS applications for inclusion in enterprise app stores accounts for this imbalance.

Distribution of Applications by Language and Platform
40% Java
25% C/C++
23% .NET
7% PHP
4% iOS
1% Android
<1% ColdFusion
<1% J2ME

Figure 10: Distribution of Applications by Language and Platform

Next we explore the vulnerability categories that affect the most applications upon first submission, by language family. The information in Table 1 is intended to help enterprises and vendors understand the likelihood of finding different vulnerability categories in Java, .NET and C/C++, which account for the largest number of applications in this dataset.10 Frequently vendors have many questions about what an assessment is likely to find when they are approached by enterprises to participate in application security programs. The answers to these questions depend on the language used to code the applications, not because one language is necessarily more secure than others, but because the types of vulnerabilities that occur vary by language. For example, logging is one of the many functions that is susceptible to CRLF injection, and Java applications tend to log much more than applications written in other languages. Thus one would expect to see a higher percentage of Java applications affected by CRLF injection vulnerabilities than other languages. Table 1 shows this to be true: CRLF injection affects 71% of Java applications, but only 41% of .NET applications. Additionally, CRLF injection does not make the Top 15 list for C/C++ applications.

10 The upcoming Veracode State of Software Security Report Volume 5 will contain more language analysis.
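The CRLF-in-logging point above can be illustrated with a short sketch: unsanitized user input written to a log can inject forged log lines via carriage return and line feed characters. The sketch uses java.util.logging; the class name and the sanitization shown are one common approach chosen for illustration, not a recommendation from this report.

import java.util.logging.Logger;

// Illustrative sketch of CRLF (log) injection, the category affecting 71% of
// Java applications in Table 1. Class and method names are hypothetical.
public class LoginAudit {

    private static final Logger LOG = Logger.getLogger(LoginAudit.class.getName());

    // Vulnerable: a username like "alice\nINFO: login succeeded for admin"
    // injects a forged entry into the audit log.
    static void recordFailureUnsafe(String username) {
        LOG.info("login failed for " + username);
    }

    // Safer: strip or encode CR and LF before the value reaches the log.
    static void recordFailureSafe(String username) {
        String sanitized = username.replaceAll("[\r\n]", "_");
        LOG.info("login failed for " + sanitized);
    }
}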
Code quality, cryptographic issues and directory traversal affect most vendor applications written in Java and .NET (the languages typically used to write web applications) on first submission. For Java, code quality tops the list with 86% of first submissions affected, while cryptographic issues, at 77%, claims the top spot for .NET. Suppliers of web-based applications should also take note of the prevalence of SQL injection and cross-site scripting, as enterprises are increasingly concerned about the ease of discovery and exploitation of these vulnerabilities. SQL injection affects 41% of Java applications and 32% of .NET applications upon first submission. Cross-site scripting is even more pervasive, affecting 49% and 43% of Java and .NET applications respectively.

Error handling affects the most C/C++ applications on first submission (87%). Buffer overflow flaws come in second at 75%, while buffer management errors and numeric errors are tied for third place at 74%. These results show that fairly simple programming mistakes continue to persist in C/C++ development.

Table 1: Vulnerability Distribution on First Submission by Language

Java:
Code Quality 86%
Cryptographic Issues 73%
Directory Traversal 73%
CRLF Injection 71%
Information Leakage 56%
Time and State 56%
Insufficient Input Validation 54%
Cross-site Scripting (XSS) 49%
Credentials Management 44%
API Abuse 42%
SQL Injection 41%
Encapsulation 26%
Session Fixation 25%
OS Command Injection 21%
Race Conditions 18%

.NET:
Cryptographic Issues 78%
Code Quality 75%
Directory Traversal 65%
Information Leakage 61%
Time and State 46%
Cross-site Scripting (XSS) 43%
CRLF Injection 41%
Insufficient Input Validation 34%
SQL Injection 32%
OS Command Injection 23%
Credentials Management 19%
Untrusted Search Path 18%
Error Handling 18%
Buffer Management Errors 6%
Buffer Overflow 6%

C/C++:
Error Handling 87%
Buffer Overflow 75%
Buffer Management Errors 74%
Numeric Errors 74%
Cryptographic Issues 66%
Directory Traversal 55%
Dangerous Functions 51%
Time and State 44%
Code Quality 40%
Untrusted Search Path 27%
Format String 24%
Race Conditions 23%
OS Command Injection 20%
API Abuse 13%
Information Leakage 11%
Compliance with Security Policies on First Submission

Figure 11 illustrates the compliance of vendor supplied applications, upon initial submission, against a variety of policies. Web applications are assessed against the OWASP Top 10. Non-web applications are assessed against the CWE/SANS Top 25. More details about how the Veracode platform determines policy compliance can be found in the Appendix.

38% of vendor supplied applications comply with the enterprise's security policy on the first submission. Complying with an enterprise security policy on the first submission is the goal for software vendors, as it means that no further remediation effort is required to be accepted by the enterprise. Yet with more than half of the applications failing to comply, our results show that secure software development practices built into the software development lifecycle (SDLC) are still not as widespread as they should be.

Vendor supplied applications have a lower compliance rate with industry standards: only 10% for the OWASP Top 10 and 30% for the CWE/SANS Top 25. These results suggest that enterprise policies are less stringent than standards put forth by the security industry. The results also imply that vendors striving to comply with industry standards are well positioned to comply with any enterprise-defined security policy. A Pearson's chi-squared analysis of the results shows that passing the customer (enterprise) policy and passing the CWE/SANS policy are highly correlated, with a p-value of < 0.001. It also shows that passing the customer policy and passing the OWASP policy are highly correlated, with a p-value of < 0.001. Thus, statistical analysis strongly supports the statement that if a build can pass OWASP or CWE/SANS, it will very likely comply with enterprise policies.

Compliance with Policies on First Submission
Enterprise Policy: 38% compliant, 62% out of compliance
CWE/SANS Top 25: 30% compliant, 70% out of compliance
OWASP Top 10: 10% compliant, 90% out of compliance

Figure 11: Compliance with Policies on First Submission
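For readers who want to reproduce this kind of check on their own compliance data, the following is a minimal sketch of a Pearson's chi-squared test of independence on a 2x2 table (pass/fail the enterprise policy versus pass/fail an industry standard). The counts shown are hypothetical, not the report's data, and the sketch computes only the test statistic; the p-value would come from the chi-squared distribution with one degree of freedom.

// Minimal sketch of a Pearson's chi-squared test of independence for a 2x2
// contingency table: rows = pass/fail enterprise policy, columns = pass/fail
// an industry standard (e.g. OWASP Top 10). Counts below are hypothetical.
public class ChiSquaredSketch {

    static double chiSquared(long[][] observed) {
        long rowTotal0 = observed[0][0] + observed[0][1];
        long rowTotal1 = observed[1][0] + observed[1][1];
        long colTotal0 = observed[0][0] + observed[1][0];
        long colTotal1 = observed[0][1] + observed[1][1];
        long grandTotal = rowTotal0 + rowTotal1;

        double[][] expected = {
                {(double) rowTotal0 * colTotal0 / grandTotal, (double) rowTotal0 * colTotal1 / grandTotal},
                {(double) rowTotal1 * colTotal0 / grandTotal, (double) rowTotal1 * colTotal1 / grandTotal}
        };

        double statistic = 0.0;
        for (int i = 0; i < 2; i++) {
            for (int j = 0; j < 2; j++) {
                double diff = observed[i][j] - expected[i][j];
                statistic += diff * diff / expected[i][j];   // sum of (O - E)^2 / E
            }
        }
        return statistic; // compare against the chi-squared distribution, df = 1
    }

    public static void main(String[] args) {
        // Hypothetical counts of application builds:
        //                 passes standard   fails standard
        // passes policy          80                60
        // fails policy           15               220
        long[][] observed = {{80, 60}, {15, 220}};
        System.out.printf("chi-squared statistic = %.1f%n", chiSquared(observed));
    }
}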
Compliance with Enterprise Policy by Application Purpose

We also explored how different application types performed against enterprise policy. Figure 12 shows compliance rates upon first submission for each application type. Customer Support and Security had the worst compliance rates; however, enterprises may be subjecting these applications to greater security scrutiny given their sensitivity.

Compliance with Enterprise Policy by Application Purpose on First Submission
Customer Support: 20% acceptable, 80% not acceptable
Security: 24% acceptable, 76% not acceptable
Business and IT Operations: 28% acceptable, 72% not acceptable
Other: 32% acceptable, 68% not acceptable
Financial: 41% acceptable, 59% not acceptable
Web Infrastructure: 65% acceptable, 35% not acceptable

Figure 12: Compliance with Enterprise Policy by Application Purpose on First Submission
State of Enterprise Testing Programs

In this section we analyze the results of vendor supplied application assessment programs experienced by the enterprise. To ensure the data was not skewed, results from the top 2% of enterprises were eliminated from the analysis as they were deemed to be outliers. The remaining enterprises fell into two distinct groups.

One group had fairly low numbers of vendors participating in their programs, and therefore fewer applications were assessed. The other group enjoyed much higher levels of vendor participation and program success. Figure 13 shows that the average number of applications assessed for group 1 enterprises is approximately 7, while the average number of applications assessed for group 2 enterprises is approximately 71. Figure 14 shows that the average number of vendors working with group 1 enterprises is approximately 4, while the average number of vendors working with group 2 enterprises is approximately 38.

The key difference between the groups was their approach to implementing and managing their programs. Those enterprises with an ad-hoc approach to requesting vendor assessments had less success, whereas those enterprises with a more programmatic approach, including the implementation of best practices offered by analyst firms and early adopters,11 experienced more success.

Enterprises with an ad-hoc approach lacked a protocol for selecting applications for testing and appeared to request application security testing from vendors on a case by case basis. Additionally, the enterprise requestor often had limited business, contractual and technical details to respond to specific vendor questions and concerns, and thus appeared to lack a strong mandate from their business and procurement teams.

Enterprises with a programmatic approach developed a formal protocol for selecting and requesting vendor applications for testing. Programmatic application testing programs tended to leverage best practices for defining an overall program, such as ensuring collaboration between security, business and procurement teams; specifying business, contractual and technical details as part of the policy; and providing a strong mandate for vendor application testing. Some of the best practices implemented by enterprises with a programmatic approach included:

• Collaborating with purchasing groups to include application security requirements in RFPs and contract language.
• Defining application security compliance policies that clearly outline required assessment methodologies, acceptable and non-acceptable flaw types, and the frequency of assessments.
• Setting realistic timelines for vendors to meet the policy.
• Creating well defined escalation procedures for vendors remaining out of compliance.
• Clearly delineating the consequences of non-compliance with defined policies (e.g. reduction in annual software maintenance fees, ceasing to do business with the vendor, etc.).

11 Secure Software Supply Chain Toolkit (info.veracode.com/vast-getting-started-toolkit.html)
Enterprises with a programmatic approach had approximately 10 times more vendors and applications participating in their programs than enterprises with an ad-hoc approach.

Figure 13: Number of Applications Assessed by an Enterprise Program
(Distribution of the number of applications assessed, comparing Group 1, enterprises with an ad-hoc approach, to Group 2, enterprises with a programmatic approach.)

Figure 14: Number of Vendors Participating in an Enterprise Program
(Distribution of the number of participating vendors, comparing Group 1, enterprises with an ad-hoc approach, to Group 2, enterprises with a programmatic approach.)
Achieving Compliance with Enterprise Policies

Next we investigated whether the differences in program approach extended to other metrics. Figure 15 shows that there is a significant difference between the two groups in the number of applications that eventually achieved policy compliance within the 18 month timeframe of our analysis. 52% of applications achieved policy compliance when enterprises had a programmatic approach, while only 34% of applications achieved policy compliance when enterprises had an ad-hoc approach. To delve deeper into this finding, we examine the group of compliant applications with Figures 16 and 17 and the group of non-compliant applications with Figures 18 and 19.

Achieving Compliance with Enterprise Policy
Enterprises with a Programmatic Approach: 52% compliant, 48% out of compliance
Enterprises with an Ad-Hoc Approach: 34% compliant, 66% out of compliance

Figure 15: Achieving Compliance with Enterprise Policy

First we examined the number of application builds vendors submitted on their path to compliance with enterprise policies. Figure 16 shows the percentage of applications that achieved compliance on the first build submitted, the second build submitted, and so on. With ad-hoc programs, 30% of vendor applications complied with enterprise policy on first submission, while 47% of vendor applications achieved compliance on first submission for enterprises with a programmatic approach.

Figure 16: Number of Builds Submitted to Achieve Policy Compliance
(Percentage of applications achieving compliance after 1, 2, 3, 4, or 5-10 builds. Most compliant applications achieved compliance on the first build: 47% for enterprises with a programmatic approach and 30% for enterprises with an ad-hoc approach; the percentages for additional builds are small.)
We also considered the amount of time taken for applications to achieve compliance. We found that submitted applications take a shorter amount of time to reach compliance for enterprises with a programmatic approach than for those with ad-hoc programs. Figure 17 shows that 45% of vendor applications are compliant within one week for enterprises with a programmatic approach, whereas 28% of applications are compliant within one week in ad-hoc programs. Once again, vendors working with enterprises using a programmatic approach outperform vendors working with enterprises with ad-hoc programs.

The time to reach compliance depends on many factors, including the strength of the security policy, the complexity of the remediations required, whether remediations can be incorporated into previously scheduled release cycles, and the number of times the application must be remediated and retested before compliance is reached. However, recall that Figure 11 shows that applications had an easier time complying with enterprise policies than with industry standards, indicating the relative weakness of enterprise policies compared to industry standards. A similar conclusion can be drawn here: the relative strength of the security policy is a significant factor in determining the difference in the results between ad-hoc and programmatic approaches. Enterprises with an ad-hoc approach appear to mandate stronger security policies, and therefore fewer vendor supplied applications achieve compliance. One area of future investigation would be to understand the relative strengths of the policies from enterprises in both groups and how the policies change over time. In particular, we will begin tracking whether enterprises do strengthen their security policies to reflect industry standard levels over time.

Figure 17: Time Taken to Achieve Policy Compliance*
(Percentage of applications achieving compliance within 0-1 week, 1-4 weeks, or 4-12 weeks. 45% of applications from enterprises with a programmatic approach were compliant within one week, compared with 28% for enterprises with an ad-hoc approach; the percentages for the longer time bands are small.)
* Slight differences between the total percentages in figures are due to rounding
Analysis of Non-Compliant Applications

Next we dive deeper into the group of applications that have yet to comply with enterprise policy. Figure 18 shows that 11% of vendors resubmitted new builds of applications for testing but are still out of compliance with enterprise policies. Resubmission is clear evidence that the vendor is taking the program seriously, is willing to take action to remediate the identified flaws, and is looking to validate improvements in their security posture. The Veracode platform can document improvements in an application's overall security quality score even when the application fails to comply with enterprise policy on subsequent scans. This is one area where the escalation and decision-making aspects of the enterprise's security policy come into play. Depending on the enterprise's goals and risk tolerance, the decision may be made to continue with acquiring the application or renewing the licensing agreement if the vendor's resubmission results show a positive trend and a good faith effort to improve their security posture.

Figure 18 also shows large percentages of non-compliant applications with only one build submitted: 39% for enterprises with a programmatic approach and 57% for enterprises with an ad-hoc approach. These large percentages illustrate the need for enterprise procedures for managing non-compliant applications.

Figure 18: Number of Builds Submitted for Non-Compliant Applications*
(Percentage of non-compliant applications by number of builds submitted: 1, 2, 3, 4, or 5-10 builds. 39% of non-compliant applications in programmatic programs and 57% in ad-hoc programs had only one build submitted; smaller percentages submitted two or more builds.)
* Slight differences between the total percentages in figures are due to rounding
Figure 19 shows the time spent out of compliance, which is the number of days between the vendor receiving non-compliant results from their first submission and the end of the dataset time period. For example, if a vendor received non-compliant results from their first submission on June 27, 2012 and the application remained in a non-compliant state until July 1, 2012, then the time spent out of compliance would be listed as 0-1 week. Similarly, if a vendor received non-compliant results from their first submission on March 15, 2012 and the application remained in a non-compliant state until July 1, 2012, then the time spent out of compliance would be listed as 13-24 weeks.

Figure 19 shows both groups of enterprises having a small percentage of vendors that received their first submission results in the last month of our research timeframe. These vendors may still be creating their initial remediation plans. Additionally, 13% of applications reside in the 5-12 weeks band for enterprises with a programmatic approach, compared with 7% of applications participating in ad-hoc programs. Some areas for future investigation include tracking how many of these applications eventually become compliant and determining whether applications that move into the longer 'out of compliance' timeframes have made security improvements.

Another interesting comparison between the two program types is in the number of applications that have spent more than 24 weeks out of compliance. Enterprises with a programmatic approach had only 20% of applications out of compliance for more than 24 weeks, compared with 39% of applications participating in ad-hoc programs. The results suggest that enterprises with a programmatic approach may do a better job ushering their vendors through the process. One potential reason for the difference is the clarity, or lack thereof, of the enterprise's non-compliance policy, or a lack of escalation procedures to drive compliance within a defined timeframe.

Figure 19: Time Spent Out of Compliance*
(Programmatic approach: 1% 0-1 week, 5% 1-4 weeks, 13% 4-12 weeks, 9% 12-24 weeks, 3% 24-36 weeks, 17% 36+ weeks. Ad-hoc approach: 2% 0-1 week, 3% 1-4 weeks, 7% 4-12 weeks, 16% 12-24 weeks, 16% 24-36 weeks, 23% 36+ weeks.)
* Slight differences between the total percentages in figures are due to rounding

The vendors with non-compliant applications may also be working with their customers on mitigation strategies to minimize vulnerability risks until the vendors are able to remediate critical flaws. The Veracode platform gives enterprises a workflow for accepting and tracking mitigation procedures proposed by vendors until such time as the flaws are remediated and the application becomes compliant with enterprise policy. The workflow data therefore can offer insights into how many applications include mitigations. That analysis shows 92% of non-compliant applications have at least one flaw mitigation procedure accepted by the requesting enterprise. These results show the importance of codifying the procedures for managing non-compliant applications so that vendors have clear guidelines.
Appendix

Understanding How the Veracode Platform Determines Policy Compliance

The Veracode platform automatically determines whether an application is compliant with the OWASP Top 10, the CWE/SANS Top 25, the assigned Veracode Level and the assigned enterprise policy.

Veracode looks for specific flaws enumerated by the CWE list and uses standardized mappings to determine whether a flaw belongs in the most recently published OWASP Top 10 and CWE/SANS Top 25 lists. Flaws discovered in an application are compared to the respective standards. If even a single flaw belonging to the standard is discovered, then the application is deemed out of compliance with the standard. For our report, web applications are assessed against the OWASP Top 10 while non-web applications are assessed against the CWE/SANS Top 25.

Veracode Levels are Veracode's independent standard for evaluating an application's software quality. Veracode Levels are assigned based on the business criticality of the application. Each Veracode Level provides a predefined security policy that aligns with the different levels of risk the organization is willing to accept for applications of varying business criticality. Table 2 shows the five Veracode Levels aligned with the five business criticality levels as defined by the National Institute of Standards and Technology (NIST). When an application is assigned a business criticality, the Veracode platform automatically assigns the appropriate Veracode Level as a predefined policy and determines whether the application is compliant with that Veracode Level. Each Veracode Level policy combines the severity of the flaws found in the application, the types of tests performed on the application and the application's overall security quality score. An application is deemed compliant when all three aspects of the Veracode Level policy are met.

Table 2: Predefined Policy Requirements for Veracode Levels

Business Criticality | Veracode Level | Flaw Severities Not Allowed in This Level | Required Test Methodologies   | Minimum Security Quality Score
Very High            | VL5            | Very High, High, Medium                   | Static and Manual             | 90
High                 | VL4            | Very High, High, Medium                   | Static                        | 80
Medium               | VL3            | Very High, High                           | Static                        | 70
Low                  | VL2            | Very High                                 | Static or Dynamic or Manual   | 60
Very Low             | VL1            | Not Applicable                            | Static or Dynamic or Manual   | Not Applicable
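To make the three-part Veracode Level check concrete, here is a minimal sketch of the logic described above and summarized in Table 2. The data structures and function are hypothetical illustrations, not Veracode's actual implementation or API:

```python
from dataclasses import dataclass
from typing import List, Optional, Set


@dataclass
class LevelPolicy:
    # Hypothetical representation of one row of Table 2
    disallowed_severities: Set[str]         # flaw severities not allowed at this level
    required_methodologies: List[Set[str]]  # acceptable test-method combinations
    minimum_score: Optional[int]            # None where "Not Applicable"


# VL4 (business criticality "High") from Table 2
VL4 = LevelPolicy(
    disallowed_severities={"Very High", "High", "Medium"},
    required_methodologies=[{"Static"}],
    minimum_score=80,
)


def is_level_compliant(flaw_severities, methods_used, score, policy):
    """All three aspects of the Veracode Level policy must be met."""
    no_disallowed_flaws = not (set(flaw_severities) & policy.disallowed_severities)
    methods_ok = any(required <= set(methods_used)
                     for required in policy.required_methodologies)
    score_ok = policy.minimum_score is None or score >= policy.minimum_score
    return no_disallowed_flaws and methods_ok and score_ok


# A statically scanned application with one High-severity flaw and a score of 85
print(is_level_compliant(["High", "Low"], ["Static"], 85, VL4))  # False: High is disallowed
```

A level such as VL2 ("Static or Dynamic or Manual") would be expressed here as a list of single-method alternatives, e.g. [{"Static"}, {"Dynamic"}, {"Manual"}].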
Veracode provides enterprises several options for defining custom policies, including:

• Disallowing specific flaw severities
• Disallowing specific types of flaws (specified by CWE number)
• Attaining a minimum security quality score
• Using predefined policies such as the OWASP Top 10, the CWE/SANS Top 25 or industry standards such as PCI
• Specifying application test methodologies and frequencies (e.g. monthly, quarterly, annually)
• Meeting timelines for remediating specified flaw severities

A custom enterprise policy may include any or all of these options. An application is deemed compliant when all aspects of the enterprise's custom policy are met.

Analyzing Compliance Metrics for Long-Standing Applications

Many of our long-standing vendor customers have applications that oscillate in and out of compliance with enterprise policies over long periods of time for a variety of reasons. For example, new functionality in a new application version may introduce new vulnerabilities, causing the application to drop out of compliance with the enterprise policy. Additionally, the enterprise policy may have changed over time. The Veracode platform may also have expanded the scanning engine to look for a broader range of flaws. In an effort to aggregate and normalize the impact of these various reasons for change on our metrics, we designed the current SOSS analysis to treat a subset of application policy state changes as if they were a first submission. In other words, SOSS queries look for 'compliant' to 'non-compliant' state changes that are caused by publishing the results of a new scan. The SOSS queries then treat the state-changing scan as a new release of an application (a "SOSS-release" if you will). This analysis methodology recognizes two sets of SOSS-releases: ones that achieve compliance within the SOSS timeframe and ones that do not, based upon all the possible reasons for state changes, from unacceptable to acceptable and from acceptable to unacceptable. Our metrics are designed to provide insight into 1) how long it takes to achieve compliance for the releases that do, and 2) for the releases that do not, how much time the vendor has had to remediate.

For example, consider a vendor that uploads application 153 version 1 and receives compliant results on the first scan, which was performed on March 15, 2012. The SOSS analysis would count that as one compliant SOSS-release and place it in the 0-1 week bucket in Figure 17: Time Taken to Achieve Policy Compliance. Suppose the vendor then uploads application 153 version 2 and receives non-compliant results on June 27th; the compliance status of application 153 has now changed from a compliant to a non-compliant state because of the most recent June 27th scan. This June 27th release is counted as Never Passed. Further, let us assume that the vendor does not make any further submissions before the time period for the SOSS volume terminates. For Volume 5, this date is June 30, 2012. The age assigned to the June 27th release is 0-1 week since it was only 3 days old when the SOSS analysis period terminated. SOSS analysis counts this result in the 0-1 week bucket in Figure 19: Time Spent Out of Compliance.

This analysis methodology aggregates and normalizes all the reasons why the state may have changed and focuses our metrics on when the vendor becomes aware of the policy compliance state change, i.e. when the vendor receives the results of a new scan.
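The SOSS-release derivation described above can be illustrated with a short sketch. This is only an assumption about how the compliant-to-non-compliant state-change detection might look, not the actual SOSS queries; the dates follow the application 153 example:

```python
from datetime import date

END_OF_VOLUME = date(2012, 6, 30)  # SOSS Volume 5 cutoff, per the example above


def soss_releases(scans):
    """Derive SOSS-releases from a chronological list of (scan_date, is_compliant).

    A new SOSS-release starts at the first scan and at every scan whose published
    results flip the application from a compliant to a non-compliant state."""
    releases = []
    previous_compliant = None
    for scan_date, is_compliant in scans:
        flipped_to_noncompliant = previous_compliant is True and not is_compliant
        if previous_compliant is None or flipped_to_noncompliant:
            releases.append({"start": scan_date, "compliant_on": None})
        if is_compliant and releases[-1]["compliant_on"] is None:
            releases[-1]["compliant_on"] = scan_date
        previous_compliant = is_compliant
    return releases


# Application 153: compliant on March 15, then a new version published
# non-compliant results on June 27 with no further scans before the cutoff.
history = [(date(2012, 3, 15), True), (date(2012, 6, 27), False)]
for release in soss_releases(history):
    if release["compliant_on"] is not None:
        days = (release["compliant_on"] - release["start"]).days
        print("compliant SOSS-release, time to compliance:", days, "days")
    else:
        print("Never Passed, age at cutoff:", (END_OF_VOLUME - release["start"]).days, "days")
```

Run against the example history, this yields one compliant SOSS-release aged 0 days and one Never Passed release aged 3 days at the cutoff, matching the 0-1 week buckets described in the text.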
Veracode, Inc.
65 Network Drive
Burlington, MA 01803
Tel +1.339.674.2500
Fax +1.339.674.2502
www.veracode.com

ABOUT VERACODE
Veracode is the only independent provider of cloud-based application intelligence and security verification services. The Veracode platform provides the fastest, most comprehensive solution to improve the security of internally developed, purchased or outsourced software applications and third-party components. By combining patented static, dynamic and manual testing, extensive eLearning capabilities, and advanced application analytics, Veracode enables scalable, policy-driven application risk management programs that help identify and eradicate numerous vulnerabilities by leveraging best-in-class technologies from vulnerability scanning to penetration testing and static code analysis. Veracode delivers unbiased proof of application security to stakeholders across the software supply chain while supporting independent audit and compliance requirements for all applications no matter how they are deployed, via the web, mobile or in the cloud. Veracode works with customers in more than 80 countries worldwide representing Global 2000 brands. For more information, visit www.veracode.com, follow on Twitter: @Veracode or read the Veracode Blog.

© 2012 Veracode, Inc. All rights reserved. All other brand names, product names, or trademarks belong to their respective holders.

SSSR/FEATURE/US/1112
