Tracking the Progress of an SDL Program: Lessons from the Gym
 

This presentation is from the 29 June 2009 OWASP Minneapolis-St. Paul (MSP) chapter meeting.

Cassio Goldschmidt of Symantec talked about defining consistent metrics for tracking security vulnerabilities throughout the security development lifecycle.

Usage Rights

CC Attribution-NonCommercial-NoDerivs License

  • Forcing muscle growth is a long process that requires high-intensity weight training and intense mental concentration. While the ultimate goal is often clear, one of the greatest mistakes bodybuilders consistently make is overlooking the importance of tracking their weight-lifting progress. Like a successful bodybuilding workout, a security development lifecycle program must consistently log simple-to-obtain yet meaningful metrics throughout the entire process. Good metrics must be free of subjectivity and clearly help decision makers identify areas that need improvement. In this pragmatic presentation we'll discuss the metrics used at Symantec, the world's largest security ISV, to classify and meaningfully compare security vulnerabilities found in different phases of the SDL by different teams working in different locations on different products. We'll also discuss how to easily give decision makers different views of the same data, how to verify whether the process is indeed catching critical vulnerabilities internally, and how the numbers compare with the competition.
  • Cassio Goldschmidt is senior manager of the product security team under the Office of the CTO at Symantec Corporation. In this role he leads efforts across the company to ensure the secure development of software products. His responsibilities include managing Symantec's internal secure software development process, training, threat modeling, and penetration testing. Cassio's background includes over 13 years of technical and managerial experience in the software industry. During the seven years he has been with Symantec, he has helped to architect, design, and develop several top-selling product releases, conducted numerous security classes, and coordinated various penetration tests. Cassio is also internationally known for leading the OWASP chapter in Los Angeles. Cassio represents Symantec on the SAFECode technical committee and works with (ISC)2 on the development of the CSSLP certification. He holds a bachelor's degree in computer science from Pontifícia Universidade Católica do Rio Grande do Sul, a master's degree in software engineering from Santa Clara University, and a Master of Business Administration from the University of Southern California.

Presentation Transcript

  • Tracking the Progress of an SDL Program: Lessons from the Gym
    Cassio Goldschmidt
    June 29th, 2009
  • Introduction
    2
  • Who am I?
    Cassio Goldschmidt
    Sr. Manager, Product Security
    Chapter Leader, OWASP Los Angeles
    Education
    MBA, USC
    MS Software Engineering, SCU
    BS Computer Science, PUCRS
    Certified Software Sec. Lifecycle Professional – CSSLP, (ISC)2
    When I’m not in the office…
    Volleyball (Indoor, Beach)
    Coding
    Gym…
    3
  • Typical Project Lifecycle
    4
    DESIGN
    CODE
    TEST
    SUPPORT
  • What your workout looks like
    5
    May 13th Workout
    Exercise: Pile Squat
    Repetitions: 35
    Weight: 20 lbs
    Exercise: Barbell Squat
    Repetitions: 35
    Weight: 150 lbs
    Exercise: Rev. Curl
    Repetitions: 20
    Weight: 25 lbs
  • What your METRICS should look like
    6
    May 13th Sec. Metrics
    Exercise type:
    CWE
    Exercise: Pile Squat
    Repetitions: 35
    Weight: 20 lbs
    Exercise: Barbell Squat
    Repetitions: 35
    Weight: 150 lbs
    Exercise: Rev. Curl
    Repetitions: 20
    Weight: 25 lbs
  • What your METRICS should look like
    7
    May 13th Sec. Metrics
    Number of Reps:
    Number of Findings
    CWE: 79 - XSS
    Repetitions: 35
    Weight: 20 lbs
    Exercise: Barbell Squat
    Repetitions: 35
    Weight: 150 lbs
    Exercise: Rev. Curl
    Repetitions: 20
    Weight: 25 lbs
  • What your METRICS should look like
    8
    May 13th Sec. Metrics
    Exercise Intensity:
    CVSS
    CWE: 79 - XSS
    Findings: 10
    Weight: 20 lbs
    Exercise: Barbell Squat
    Repetitions: 35
    Weight: 150 lbs
    Exercise: Rev. Curl
    Repetitions: 20
    Weight: 25 lbs
  • What your METRICS should look like
    9
    May 13th Sec. Metrics
    CWE: 20 – Input Val
    Findings: 1
    CVSS: 8.6
    DESIGN
    Threat Model
    CWE: 79 - XSS
    Findings: 3
    CVSS:
    TEST
    Pen Test
    CWE: 314
    Findings: 1
    CVSS: 2.3
    Support
    Vul. Mgmt
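The slide above logs each finding with its SDL phase, activity, CWE, and CVSS score — the security analogue of a workout log. As a minimal sketch (field names are illustrative, not Symantec's actual schema), such a log could be structured like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Finding:
    phase: str             # DESIGN, CODE, TEST, or SUPPORT
    activity: str          # e.g. "Threat Model", "Pen Test", "Vul. Mgmt"
    cwe: int               # CWE identifier ("exercise type")
    findings: int          # number of findings ("repetitions")
    cvss: Optional[float]  # CVSS base score ("intensity"), if scored

# The slide's example log, one record per finding
# (the XSS entry's CVSS was left blank on the slide):
log = [
    Finding("DESIGN", "Threat Model", 20, 1, 8.6),
    Finding("TEST", "Pen Test", 79, 3, None),
    Finding("SUPPORT", "Vul. Mgmt", 314, 1, 2.3),
]
```

Records in this shape can later be grouped by phase, CWE, or product for the different decision-maker views the talk describes.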
  • Common Weakness Enumeration
  • Common Weakness Enumeration: What is it?
    A common language for describing software security weaknesses
    Maintained by the MITRE Corporation with support from the National Cyber Security Division (DHS).
    Hierarchical
    Each individual CWE represents a single vulnerability type
    Deeper levels of the tree provide a finer granularity
    Higher levels provide a broad overview of a vulnerability
    11
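Because CWE is hierarchical, findings logged at fine granularity can be rolled up to broader categories for a higher-level view. A minimal sketch, assuming a tiny hand-coded fragment of the tree (real parent links come from MITRE's published CWE data; these edges are examples only):

```python
# Illustrative child -> parent edges of the CWE hierarchy.
# The real hierarchy is distributed by MITRE as XML; do not
# treat these particular edges as authoritative.
CWE_PARENT = {
    79: 74,    # XSS -> Injection
    89: 74,    # SQL Injection -> Injection
    74: 707,   # Injection -> a broader top-level category
}

def roll_up(cwe: int) -> int:
    """Walk up the (example) tree to report at a coarser granularity."""
    while cwe in CWE_PARENT:
        cwe = CWE_PARENT[cwe]
    return cwe
```

With a mapping like this, XSS and SQL-injection findings can be reported separately to engineers but aggregated into one injection bucket for executives.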
  • Common Weakness Enumeration: Portion of CWE structure
    12
  • Common Weakness Enumeration: What data is available for each CWE?
    Weakness description
    Applicable platforms and programming languages
    Common Consequences
    Likelihood of Exploit
    Coding Examples
    Potential Mitigations
    Related Attacks
    Time of Introduction
    Taxonomy Mapping
    13
    Link to CWE Page on XSS
  • Common Weakness Enumeration: How useful is this information?
    14
    Pie Chart showing the frequency of CWEs
    found in penetration tests
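A frequency breakdown like the pie chart described above falls out of simply counting CWE IDs across findings. A sketch with made-up data:

```python
from collections import Counter

# CWE IDs of findings from penetration tests (illustrative data only).
pen_test_findings = [79, 79, 89, 79, 22, 89]

freq = Counter(pen_test_findings)
# most_common() yields (cwe, count) pairs sorted by frequency --
# exactly the slices needed for a frequency pie chart.
print(freq.most_common())  # [(79, 3), (89, 2), (22, 1)]
```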
  • Common Vulnerability Scoring System
  • Common Vulnerability Scoring System (CVSS): What is it?
    16
    Objective (and “perfect enough”) metric
    A universal way to convey vulnerability severity
    Can be used for competitive analysis
    CVSS scores range between 0.0 and 10.0
    Can also be expressed as high, medium, or low
    Composed of 3 vectors
    Base
    Represents general vulnerability severity: intrinsic and immutable
    Temporal
    Time-dependent qualities of a vulnerability
    Environmental
    Qualities of a vulnerability specific to a particular IT environment
  • 17
    Common Vulnerability Scoring System (CVSS)BASE Vector
    Exploitability
    Impact
    Sample Score: 7.5
    Sample Vector: (AV:N/AC:L/Au:N/C:P/I:P/A:P)
    Every CVSS score should be accompanied by the corresponding vector
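The base score is fully determined by the vector, so it can be recomputed mechanically. A sketch using the CVSS v2 base equations and metric weights (taken from the CVSS v2 specification), which reproduces the sample score above:

```python
# CVSS v2 base-metric weights, per the CVSS v2 specification.
WEIGHTS = {
    "AV": {"L": 0.395, "A": 0.646, "N": 1.0},
    "AC": {"H": 0.35, "M": 0.61, "L": 0.71},
    "Au": {"M": 0.45, "S": 0.56, "N": 0.704},
    "C": {"N": 0.0, "P": 0.275, "C": 0.660},
    "I": {"N": 0.0, "P": 0.275, "C": 0.660},
    "A": {"N": 0.0, "P": 0.275, "C": 0.660},
}

def cvss2_base(vector: str) -> float:
    """Compute the CVSS v2 base score from a vector
    like (AV:N/AC:L/Au:N/C:P/I:P/A:P)."""
    m = dict(part.split(":") for part in vector.strip("()").split("/"))
    w = {k: WEIGHTS[k][m[k]] for k in WEIGHTS}
    impact = 10.41 * (1 - (1 - w["C"]) * (1 - w["I"]) * (1 - w["A"]))
    exploitability = 20 * w["AV"] * w["AC"] * w["Au"]
    f_impact = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

print(cvss2_base("AV:N/AC:L/Au:N/C:P/I:P/A:P"))  # 7.5, matching the sample
```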
  • 18
    Common Vulnerability Scoring System (CVSS)The Calculator
  • Training and Metrics
  • Training and Metrics: A special activity in the SDL
    20
    • Security training is what food is to a workout
    • Same workout metrics do not apply
    • Quality of your intake affects overall performance
    • Staff needs ongoing training
  • Training and Metrics: Security Learning Process
    21
  • Training and Metrics: Security Learning Process
    22
    Understand who is the audience
    • Previous knowledge about secure coding and secure testing
    • Programming languages in use
    • Supported platforms
    • Type of product
  • Training and Metrics: Security Learning Process
    23
    Train everyone involved in the SDL
    • Developers: Secure Coding, Threat Model
    • QA: Security Testing, Tools
    • Managers: Secure Development Lifecycle (also known as Symmunize)
  • Training and Metrics: Security Learning Process
    24
    Quality Assurance - Capture the flag
    • Use Beta software
    • Approximately 3 hours long
    • Top 3 finders receive prizes and are invited to explain to the rest of the group which techniques and tools they used to find the vulnerabilities
  • Training and Metrics: Security Learning Process
    25
    Post-Class Survey
    • Anonymous
    • Metrics
    • Class content
    • Instructor knowledge
    • Exercises
  • Training and Metrics: Security awareness is more than training
    26
    Knowledge Sharing Activities
    Tech Exchanges
    Cutting Edge
    CTO Newsletter Articles
  • Conclusions and final thoughts
  • Why This Approach Makes Sense
    28
    DESIGN
    CODE
    TEST
    SUPPORT
    • Compare Apples to Apples
    • Quantify results in a meaningful way to “C” executives
    • Past results can be used to explain impact of new findings
    • Can be simplified to a number from 1-10 or semaphore (green, yellow and red).
    • Can be used for competitive analysis
    • Harder to game CVSS
    • CWE can be easily mapped to different taxonomies
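The green/yellow/red simplification mentioned above can be as simple as bucketing the worst open finding's score. The 4.0 and 7.0 cut-offs below follow the commonly used low/medium/high CVSS bands; they are a team's choice, not something the talk or the CVSS standard mandates:

```python
def semaphore(max_cvss: float) -> str:
    """Map the worst open finding's CVSS score to a traffic-light status.

    Thresholds follow the common low (<4.0) / medium / high (>=7.0)
    CVSS severity bands; adjust to taste.
    """
    if max_cvss >= 7.0:
        return "red"
    if max_cvss >= 4.0:
        return "yellow"
    return "green"

print(semaphore(8.6))  # red
print(semaphore(2.3))  # green
```

This gives executives a single glanceable status per product while the underlying CWE/CVSS records remain available for drill-down.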
  • Thank You!
    Cassio Goldschmidt
    cassio_goldschmidt@symantec.com
    cassio@owasp.org
    Copyright © 2007 Symantec Corporation. All rights reserved.  Symantec and the Symantec Logo are trademarks or registered trademarks of Symantec Corporation or its affiliates in the U.S. and other countries.  Other names may be trademarks of their respective owners.
    This document is provided for informational purposes only and is not intended as advertising.  All warranties relating to the information in this document, either express or implied, are disclaimed to the maximum extent allowed by law.  The information in this document is subject to change without notice.