SAVI: Static-Analysis Vulnerability Indicator
James Walden and Maureen Doyle, Northern Kentucky University
Presented by: Asif Imran (MSSE0119), Jobaer Islam Khan (MSSE0109)
Addressed Problem
- Web applications are a frequent target of attackers
- They are the largest source of security vulnerabilities
- Identity theft, phishing, malware, etc. erode user trust and cause financial loss
Proposed Solution
Static analysis of source code to detect vulnerabilities in web applications.
SAVI: Static-Analysis Vulnerability Indicator
- Combines several static-analysis results
- Ranks the vulnerability of web applications
Sources of Vulnerability Counts
- Vulnerability repositories: National Vulnerability Database (NVD), Microsoft Security Bulletins, Drupal Security Advisories
- Output of static-analysis tools
- Output of security-focused dynamic-analysis tools
Note: each source type comprises many individual sources, with different vulnerability databases and analysis tools. An application's vulnerability history can be obtained from these reporting databases.
Vulnerability Detection Techniques
Static Analysis: static-analysis tools find an application's current vulnerabilities by evaluating its source code without executing it.
Advantages:
1. Finds vulnerabilities objectively
2. Finds vulnerabilities rapidly
Disadvantages:
1. Produces false negatives
2. Produces false positives
Example: Fortify SCA
- Reduces business risk by identifying vulnerabilities that pose the biggest threat
- Identifies and removes exploitable vulnerabilities quickly with a repeatable process
- Reduces development cost by identifying vulnerabilities early in the SDLC
- Educates developers in secure coding practices while they work
Vulnerability Detection Techniques [cont]
Dynamic Analysis: identifies vulnerabilities in running web applications.
Advantages:
1. Simulates a malicious user by attacking and probing
2. Independent of programming languages
Disadvantages:
1. Increased effort
2. False positives and false negatives
Example: Veracode-DA
False Positives and False Negatives
False negatives occur when tools do not report existing security bugs.
False positives occur when tools report vulnerabilities that do not exist.
Triaging: manually auditing source code to identify false positives.
By manually auditing enough results, a security team can estimate the rates at which false positives and false negatives occur for a given project and extrapolate the number of true positives from a set of raw results.
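The extrapolation step above can be sketched as a short calculation; the numbers and variable names below are hypothetical, not taken from the paper:

```ruby
# Hypothetical triage extrapolation: a tool reports 500 raw findings,
# and the team manually audits a random sample of 50, finding 30 of
# them to be false positives.
raw_findings = 500
sample_size  = 50
sample_false = 30

# Estimate the project's false-positive rate from the audited sample.
false_positive_rate = sample_false.to_f / sample_size

# Extrapolate the number of true positives across all raw results.
estimated_true_positives = (raw_findings * (1 - false_positive_rate)).round

puts estimated_true_positives  # => 200
```

The accuracy of this estimate depends on the audited sample being representative of the full result set.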
Methodology
Static analysis:
- Fast results
- Current bugs can be detected
- Repeatability
Vulnerability repository: the NVD is used to validate the predictions of static-analysis metrics.
Goal: measure the correlation between static-analysis results and the vulnerabilities later reported for the analyzed software.
Methodology [cont]
- Normalize vulnerability counts by code size
- SAVD (Static-Analysis Vulnerability Density)
- NVD vulnerability density
- Measure the correlation between SAVD and NVD density
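A minimal sketch of the normalization step, assuming the common definition of vulnerability density as findings per thousand lines of code (the exact normalization used in the paper is not reproduced here):

```ruby
# Assumed definition: SAVD = static-analysis findings per KLOC.
def savd(vulnerability_count, lines_of_code)
  vulnerability_count.to_f / (lines_of_code / 1000.0)
end

# Hypothetical example: 58 findings in a 29,000-LOC application.
puts savd(58, 29_000)  # => 2.0 findings per KLOC
```

Normalizing by code size lets applications of very different sizes be compared on the same scale.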
Methodology [cont]
Open-source applications as test cases (source code: PHP):
- DokuWiki: wiki
- MediaWiki: wiki
- phpBB: web forum
- phpMyAdmin: system administration
- SquirrelMail: email client
Methodology [cont]
- Fortify Source Code Analyzer (SCA); output in XML: vulnerability data
- Custom Ruby scripts used to convert the vulnerability data and line counts into a form that could be analyzed with statistical software
- Code size: 29,000 LOC <= code <= 162,000 LOC
- Analysis time: 180 seconds <= time <= 3,600 seconds
- Hardware: Core i5 processor and 8 Gbytes of RAM
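A hypothetical sketch of such a conversion script using Ruby's REXML library; the `Findings`/`Vulnerability` element names and `category` attribute are assumptions for illustration, not Fortify SCA's actual output schema:

```ruby
require "rexml/document"

# Assumed, simplified stand-in for Fortify SCA's XML report format.
xml = <<~XML
  <Findings>
    <Vulnerability category="SQL Injection"/>
    <Vulnerability category="Cross-Site Scripting"/>
    <Vulnerability category="SQL Injection"/>
  </Findings>
XML

# Count findings per category.
doc = REXML::Document.new(xml)
counts = Hash.new(0)
doc.elements.each("Findings/Vulnerability") do |v|
  counts[v.attributes["category"]] += 1
end

# Emit one CSV row per category for statistical software.
counts.each { |category, n| puts "#{category},#{n}" }
```

In practice the script would read the report file from disk and also merge in per-version line counts before emitting rows.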
Discussion
- Context-independent metric: the compared applications share the same data, functionality, and installation standards
- SAVI indicates postrelease vulnerability density
- SAVI lets organizations choose less vulnerable applications
- Further investigation is required to determine whether similar results hold for other application classes
Conclusion [cont]
SAVD for each application version correlated significantly with the NVD vulnerability density for that version's year and subsequent years. For example, a project's SAVD for 2009 correlated with its NVD density for 2010 and 2011. This result means that static-analysis tools indicate an application's postrelease vulnerability.
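The kind of check described above can be illustrated with a Pearson correlation between per-application densities; the function and the data points below are illustrative assumptions, not the paper's actual statistic or measurements:

```ruby
# Pearson correlation coefficient between two equal-length samples.
def pearson(xs, ys)
  n  = xs.size.to_f
  mx = xs.sum / n
  my = ys.sum / n
  cov = xs.zip(ys).sum { |x, y| (x - mx) * (y - my) }
  sx  = Math.sqrt(xs.sum { |x| (x - mx)**2 })
  sy  = Math.sqrt(ys.sum { |y| (y - my)**2 })
  cov / (sx * sy)
end

# Hypothetical per-application densities: SAVD in 2009 vs.
# NVD vulnerability density in 2010.
savd_2009 = [2.1, 0.8, 1.5, 3.0]
nvd_2010  = [1.9, 0.6, 1.2, 2.8]

puts pearson(savd_2009, nvd_2010).round(2)
```

A coefficient near 1 indicates that applications ranked as more vulnerable by static analysis also accumulated more reported vulnerabilities later.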