CVSS

Delivered at the RMSIG of aisa.org.au in July 2007 and at RUXCON 2K6

  • AISA Web Server
  • Accusations of subjectivity due to the lack of a consistent score. Vulnerability Disclosure Framework, p. 8, "Scoring": To protect the nation's critical information infrastructure, the Working Group concluded that reliable, consistent vulnerability scoring methods are essential. The Study Group evaluated alternative procedures actively employed by several stakeholders to categorize reported vulnerabilities, and found that existing vulnerability scoring methods vary widely. This diversity in the methods used to identify vulnerabilities and assign scoring metrics presents a risk: disagreements give malicious actors increased time to exploit the vulnerability, or increase the damages resulting from exploitation already under way. The NIAC therefore commissioned a research task to develop a consistent scoring methodology; the results of the Scoring Subgroup's work will be published separately when complete. P. 37: Support the development and use of a universally compatible vulnerability scoring methodology. When complete, such a scoring method should: employ standardized threat-scoring classification schemes structured around accepted criteria for assessing and evaluating vulnerabilities, so that private and public sector researchers share a common understanding of reported vulnerabilities; allow for local variations depending on impact, environment, culture, and the roles of those developing scores; permit ongoing adjustment of an assigned score or set of scores to reflect research results or the impact of confirmed exploitations or remediation efforts; and incorporate procedures for independent validation of the suitability of any score or set of scores assigned to a vulnerability, along with a means for improper results to be adjusted in a neutral manner.
  • Recommendations: support the use of CVSS by all Federal departments and agencies by calculating Environmental Metrics, and encourage DHS to promote the use of CVSS to the global community, including critical infrastructure owners outside the USA. The NIAC was appointed to identify an organization to function as the permanent home for CVSS, and appointed FIRST on 11 May 2005 on the strength of its significant technical expertise, its experience in managing vulnerabilities, and its global focus. The versioning was renamed along the way: CVSS, then an interim CVSS v1.1, then CVSS v2.
  • Base Metrics: "intrinsic to any given vulnerability that do not change over time or in different environments" – six metrics, scored by the vendor. Temporal Metrics: "characteristics of the vulnerability which evolve over the lifetime of the vulnerability" – three metrics, scored by the vendor and/or a FIRST member. Environmental Metrics: "contain those characteristics of a vulnerability which are tied to a specific implementation of the end user" – five metrics, scored by the end user.
  • Six metrics, scored by the vendor. Changes from CVSS v1: Authentication now accounts for multiple uses of the same credentials; Access Vector now covers Bluetooth, 802.11 wireless, etc.
  • Three Metrics Scored by Vendor and/or FIRST Member
  • Reduced Privileges of Running Process
  • Binary diff and fuzzing weren't considered by the Vulnerability Disclosure Framework either!
  • Jeff Jones compiles vulnerability statistics for Microsoft.
  • Vulnerability Disclosure Framework – Equal Involvement from All Parties
  • CVSS

    1. Common Vulnerability Scoring System – Christian Heinrich, AISA RMSIG, July 2007
    2. cmlh
       Currently: Security Researcher – Defeating Network Intrusion Detection/Prevention and Forensics – presented at RUXCON 2K5 and RUXCON 2K6
       Formerly: Security Manager – News Limited – DSD Gateway Certified Service Provider – Federal Government Endorsed Business
       Public profile on LinkedIn: http://www.linkedin.com/in/ChristianHeinrich
    3. Agenda
       1. History from the VDF to CVSS v2
       2. CVSS v2 from the End User's Perspective
       3. Caveats, Politics and other Traps :)
    4. Vulnerability Disclosure Framework
       National Infrastructure Advisory Council (NIAC) Vulnerability Disclosure Working Group (VDWG) – 13 Jan 2004
       Findings with existing methodologies from Microsoft, CERT, etc.:
       – Specific to Vendor x Product y, not Vendor z Product y
       – No consideration of the end user's environment or the timeline of the vulnerability
    5. CVSS to CVSS v2
       12 October 2004 – Vulnerability Scoring Working Sub-Group of the VDWG
       February 2005 – presented at RSA by Mike Schiffman (Cisco)
       11 May 2005 – NIAC appointed the Forum of Incident Response and Security Teams (FIRST); FIRST formed a Special Interest Group (CVSS-SIG)
       20 June 2007 – CVSS v2
    6. CVSS v2
    7. Base Metrics – intrinsic to any given vulnerability; they do not change over time or in different environments
       1. Access – from local console to remote network, via Bluetooth -> Internet
       2. "Technical" likelihood
       3. Authentication
       "Technical" impact to 4. Confidentiality, 5. Integrity and 6. Availability
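These six metrics feed the CVSS v2 base equation. Below is a minimal sketch in Python (a language chosen here purely for illustration; the deck contains no code), using the metric weights and base equation published in the CVSS v2 specification at first.org/cvss:

```python
# CVSS v2 base equation, with the weights from the v2 specification.
AV = {"L": 0.395, "A": 0.646, "N": 1.0}    # Access Vector: Local/Adjacent/Network
AC = {"H": 0.35, "M": 0.61, "L": 0.71}     # Access Complexity: High/Medium/Low
Au = {"M": 0.45, "S": 0.56, "N": 0.704}    # Authentication: Multiple/Single/None
CIA = {"N": 0.0, "P": 0.275, "C": 0.660}   # C/I/A impact: None/Partial/Complete

def base_score(av, ac, au, c, i, a):
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * Au[au]
    f = 1.176 if impact else 0.0           # f(Impact) is 0 when Impact is 0
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f, 1)

# The example vector from the Scoring slide, AV:L/AC:M/Au:N/C:N/I:P/A:C:
print(base_score("L", "M", "N", "N", "P", "C"))  # 5.4
```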
    8. Temporal Metrics – characteristics of the vulnerability which evolve over its lifetime
       1. Maturity of the exploit, i.e. proof of concept, worm, etc.?
       2. Is a patch and/or workaround available?
       3. Confidence in the report?
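These three metrics only ever discount the base score. A sketch of the temporal equation, again with the weights from the CVSS v2 specification:

```python
# CVSS v2 temporal equation: the base score scaled down by exploit
# maturity (E), remediation level (RL) and report confidence (RC).
E = {"U": 0.85, "POC": 0.9, "F": 0.95, "H": 1.0, "ND": 1.0}    # Unproven .. High / Not Defined
RL = {"OF": 0.87, "TF": 0.90, "W": 0.95, "U": 1.0, "ND": 1.0}  # Official Fix .. Unavailable
RC = {"UC": 0.90, "UR": 0.95, "C": 1.0, "ND": 1.0}             # Unconfirmed .. Confirmed

def temporal_score(base, e, rl, rc):
    return round(base * E[e] * RL[rl] * RC[rc], 1)

# The 5.4 base score above, with an unproven exploit, an official fix
# and a confirmed report:
print(temporal_score(5.4, "U", "OF", "C"))  # 4.0
```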
    9. Environmental Metrics – characteristics of a vulnerability which are tied to the end user's specific implementation
       1. Potential collateral damage to critical infrastructure?
       2. Total number of targets?
       "Business" impact to 3. Confidentiality, 4. Integrity and 5. Availability
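The environmental equation wraps these five metrics around the temporal result. A simplified sketch of the outer equation per the CVSS v2 specification; `adjusted_temporal` stands for the temporal score recomputed with the base C/I/A impacts reweighted by the CR/IR/AR security requirements (that inner substitution is omitted here for brevity):

```python
# Outer CVSS v2 environmental equation.
CDP = {"N": 0.0, "L": 0.1, "LM": 0.3, "MH": 0.4, "H": 0.5, "ND": 0.0}  # Collateral Damage Potential
TD = {"N": 0.0, "L": 0.25, "M": 0.75, "H": 1.0, "ND": 1.0}             # Target Distribution

def environmental_score(adjusted_temporal, cdp, td):
    # Collateral damage pushes the score toward 10; target distribution
    # then scales it by how much of the environment is affected.
    return round((adjusted_temporal + (10 - adjusted_temporal) * CDP[cdp]) * TD[td], 1)

# Low collateral damage, but every host in the environment affected:
print(environmental_score(4.0, "L", "H"))  # 4.6
```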
    10. Scoring
       Calculators are published via the "Scores and Calculators" page at http://www.first.org/cvss
       Presentation of Base Metrics: AV:[L,A,N]/AC:[H,M,L]/Au:[M,S,N]/C:[N,P,C]/I:[N,P,C]/A:[N,P,C]
       Presentation of Temporal Metrics: E:[U,POC,F,H,ND]/RL:[OF,TF,W,U,ND]/RC:[UC,UR,C,ND]
       Presentation of Environmental Metrics: CDP:[N,L,LM,MH,H,ND]/TD:[N,L,M,H,ND]/CR:[L,M,H,ND]/IR:[L,M,H,ND]/AR:[L,M,H,ND]
       Example Base Metrics: AV:L/AC:M/Au:N/C:N/I:P/A:C
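The vector shorthand above is straightforward to handle programmatically; a minimal sketch of a parser (the function name is illustrative, not part of any CVSS tooling):

```python
# Split a CVSS v2 vector such as "AV:L/AC:M/Au:N/C:N/I:P/A:C" into a
# metric -> value mapping.
def parse_vector(vector: str) -> dict:
    return dict(part.split(":", 1) for part in vector.split("/"))

print(parse_vector("AV:L/AC:M/Au:N/C:N/I:P/A:C"))
# {'AV': 'L', 'AC': 'M', 'Au': 'N', 'C': 'N', 'I': 'P', 'A': 'C'}
```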
    11. Caveats, Politics and other Traps :)
       Base Metrics
       – Vendor's "subjective" interpretation of the Base Metrics vs. the "independent" NIST National Vulnerability Database (NVD)
       – A vendor may publish the Base Score but withhold the Base Metrics: derive the possible Base Metrics from the Base Score with a fuzzer (see the sketch after slide 13)
       – Access Vector: the metric value with the highest numerical weight is scored, not the most common one; some attacks, e.g. XSS, consider only the web server, not the browser
       – Authentication can be "reduced" by certain implementations, e.g. token, S/KEY
       – Considerations towards the end user's environment: probability of deriving an authentication credential; range of the wireless network? What about a high-gain antenna? A Faraday cage?
    12. Caveats, Politics and other Traps :)
       Temporal Metrics
       – "Will this affect my network range?" No feed, real-time or otherwise, is provided
       – Doesn't consider the reduction in time due to "binary diff" and/or "fuzzing"
       Environmental Metrics
       – Target Distribution: map "connectivity" with active and passive discovery
       – Doesn't consider the cost to implement a patch and/or workaround, or the technical knowledge required for Attack Complexity
    13. Caveats, Politics and other Traps :)
       Scoring
       – Developing a "fuzzer" to derive all scores by calculating all numerical values
       – Rounding to "reduce" a score
       – Substitution: a different metric, yet the same score
       – Derive the possible metrics from a score
       Based on a CVSS v1 fuzzer. Expect an announcement from Jeff Jones (Microsoft). Come to the Security Interchange meeting later this year.
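A sketch of the brute-force idea behind slides 11 and 13: enumerate all 3^6 = 729 base-metric combinations, score each with the v2 base equation, and keep every vector whose rounded score matches the published one. The weights are from the CVSS v2 specification; the function names are illustrative:

```python
# Enumerate the base vectors that could lie behind a published score.
from itertools import product

AV = {"L": 0.395, "A": 0.646, "N": 1.0}
AC = {"H": 0.35, "M": 0.61, "L": 0.71}
Au = {"M": 0.45, "S": 0.56, "N": 0.704}
CIA = {"N": 0.0, "P": 0.275, "C": 0.660}

def base_score(av, ac, au, c, i, a):
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * Au[au]
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * (1.176 if impact else 0.0), 1)

def candidate_vectors(published_score):
    # Python's round() can differ from a given calculator at exact .05
    # boundaries, so treat matches as candidates, not proof.
    for av, ac, au, c, i, a in product(AV, AC, Au, CIA, CIA, CIA):
        if base_score(av, ac, au, c, i, a) == published_score:
            yield f"AV:{av}/AC:{ac}/Au:{au}/C:{c}/I:{i}/A:{a}"

print(list(candidate_vectors(5.4)))
```

That several distinct vectors can collapse to one rounded score is the same property the slide calls "substitution".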
    14. Caveats, Politics and other Traps :)
       Lack of representation:
       – No invitation to end users and little from security researchers (e.g. Schiffman)
       – No lesson learnt by CERT
       The horse has bolted – first impressions last:
       – Optional scores
       – Resistance from initial supporters such as Microsoft
       – CVE still in the process of reclassifying vulnerabilities to the updated schema
       Advocate it to your vendor, as it provides YOU with advantages by removing subjectivity from:
       – Prioritizing remediation regardless of vendor and/or product and/or technology
       – Objective vulnerability distribution studies
    15. Thanks
       John Greaves, David Palmer & Westpac, Chris Wood & Patchlink, David Reinhold, John Dale, John Frisken
