Application Assessment Techniques

OWASP Northern Virginia

August 6th, 2009
Agenda
•   Background
•   Common Pitfalls in Application Assessment
•   Moving Beyond
    – Threat Modeling
    – Code Review
    – Dynamic Testing
•   Presenting Results
•   Questions / Panel Discussion




Background
•   Dan Cornell
    – Principal at Denim Group www.denimgroup.com
    – Software Developer: MCSD, Java 2 Certified Programmer
    – OWASP: Global Membership Committee, Open Review Project, SA Chapter Lead


•   Denim Group
    – Application Development
        • Java and .NET
    – Application Security
        • Assessments, penetration tests, code reviews, training, process consulting




How Not To Do It




How Not To Do It
•   Q: What are you all doing to address application security concerns in
    your organization?
•   A: We bought “XYZ Scanner”
•   Q: Okay… Are you actually using it?
•   A: We ran some scans
•   Q: And how did that go?
•   A: Oh we found some stuff…
•   Q: How did you address those issues?
•   A: I think we sent the report to the developers. Not sure what they did
    with them. I guess I ought to check in on that…



Goals of Application Assessment
•   Vary by organization, by application and by assessment

•   Determine the security state of an application
•   Characterize risk to executives and decision makers
•   Prove a point
•   Set the stage for future efforts




Common Pitfalls in Application Assessment




Common Pitfalls in Application Assessment
•   Ad hoc approach
    – Non-repeatable, non-comprehensive
•   Reliance on automated tools
    – Can only find a subset of vulnerabilities – false negatives
    – Even the good tools need tuning to reduce false positives
•   Current commercial tools are biased
    – Rulesets and capabilities typically over-focused on web applications
•   Too focused on one approach
    – Static and dynamic testing have different strengths
    – Economic concerns constrain the amount of testing that can be performed – make
      the most of the time you have




Moving Beyond
•   Automated versus Manual
•   Threat Modeling
•   Dynamic Testing
•   Source Code Review




Automated Versus Manual




Automated Versus Manual
•   Automated tools are great at:
     – Consistency - not getting tired
     – Data flow analysis
•   Automated tools are terrible for:
     – Understanding business context
•   Manual testing is great at:
     – Identifying business logic flaws
•   Manual testing is terrible for:




Threat Modeling
•   Provides high-level understanding of the system
    – Useful for creating a structured test plan
•   Provides application context
    – Crucial for characterizing results
•   Complementary with Abuse Cases




Threat Modeling Approach
•   Establish scope and system boundaries
•   Decompose the system into a Data Flow Diagram (DFD)
•   Assign potential threats based on asset types
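
A rough sketch of the decomposition step, assuming a plain Java representation (none of these element or class names come from the presentation; they only illustrate the shape of the data the later threat-assignment step walks over):

    // Minimal sketch: capturing DFD elements for STRIDE-per-element analysis.
    // The example elements describe a hypothetical web application.
    import java.util.List;

    public class DfdSketch {

        // The four standard DFD element types used in STRIDE-per-element
        enum ElementType { EXTERNAL_INTERACTOR, PROCESS, DATA_FLOW, DATA_STORE }

        // One node or edge in the decomposed system
        record Element(String name, ElementType type, boolean crossesTrustBoundary) {}

        public static void main(String[] args) {
            List<Element> model = List.of(
                new Element("Browser user",     ElementType.EXTERNAL_INTERACTOR, true),
                new Element("Web application",  ElementType.PROCESS,             false),
                new Element("Login request",    ElementType.DATA_FLOW,           true),
                new Element("Account database", ElementType.DATA_STORE,          false)
            );
            model.forEach(e -> System.out.printf("%-20s %s%n", e.name(), e.type()));
        }
    }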




Threat Model Example




Mapping Threats to Asset Types
Threat Type                  External Interactor   Process   Data Flow   Data Store
S – Spoofing                 Yes                   Yes
T – Tampering                                      Yes       Yes         Yes
R – Repudiation              Yes                   Yes                   Yes
I – Information Disclosure                         Yes       Yes         Yes
D – Denial of Service                              Yes       Yes         Yes
E – Elevation of Privilege                         Yes
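
Because the mapping above is mechanical, it can be encoded as data and used to generate a repeatable per-element threat checklist. A minimal Java sketch (the class and enum names are illustrative, not part of the original material; the mapping values mirror the table):

    // Minimal sketch: the STRIDE-per-element mapping from the table above,
    // encoded as data to produce a checklist per DFD element type.
    import java.util.EnumMap;
    import java.util.EnumSet;
    import java.util.Map;

    public class StridePerElement {

        enum ElementType { EXTERNAL_INTERACTOR, PROCESS, DATA_FLOW, DATA_STORE }

        enum Threat { SPOOFING, TAMPERING, REPUDIATION, INFO_DISCLOSURE,
                      DENIAL_OF_SERVICE, ELEVATION_OF_PRIVILEGE }

        // Which STRIDE threats apply to which element type, as in the table
        static final Map<ElementType, EnumSet<Threat>> MAPPING = new EnumMap<>(Map.of(
            ElementType.EXTERNAL_INTERACTOR,
                EnumSet.of(Threat.SPOOFING, Threat.REPUDIATION),
            ElementType.PROCESS,
                EnumSet.allOf(Threat.class),
            ElementType.DATA_FLOW,
                EnumSet.of(Threat.TAMPERING, Threat.INFO_DISCLOSURE, Threat.DENIAL_OF_SERVICE),
            ElementType.DATA_STORE,
                EnumSet.of(Threat.TAMPERING, Threat.REPUDIATION,
                           Threat.INFO_DISCLOSURE, Threat.DENIAL_OF_SERVICE)
        ));

        public static void main(String[] args) {
            // Print the structured, repeatable checklist for each element type
            MAPPING.forEach((type, threats) -> System.out.println(type + " -> " + threats));
        }
    }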



Threat Modeling
•   Result is a structured, repeatable list of threats to check
     – Strength is to find known problems repeatably
•   Augment with Abuse Cases
     – “What could go wrong” scenarios
     – More creative and unstructured




Dynamic, Static and Manual Testing
Source Code Review
•   Advantages
•   Disadvantages
•   Approaches




Static Analysis Advantages
•   Have access to the actual instructions the software will be executing
     – No need to guess or interpret behavior
     – Full access to all the software’s possible behaviors
•   Remediation is easier because you know where the problems are
Static Analysis Disadvantages
•   Require access to source code or at least binary code
     – Typically need access to enough software artifacts to execute a build
•   Typically require proficiency running software builds
•   Will not find issues related to operational deployment environments
Approaches
•   Run automated tools with default ruleset
     – Provides a first-cut look at the security state of the application
     – Identify “hot spots”
•   Craft custom rules specific to the application
     –   3rd party code
     –   Break very large applications into manageable chunks
     –   Application-specific APIs – sources, sinks, filter functions
     –   Compliance-specific constructs
•   This is an iterative process
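
To make “application-specific APIs” concrete, the hypothetical class below shows the three shapes a custom rule has to model: an in-house source wrapper, a filter function, and an in-house sink. It assumes the pre-Jakarta Servlet and JDBC APIs on the classpath, and every method name here is invented for illustration only:

    // Hypothetical application-specific APIs that default rulesets miss.
    // Custom rules would model these as a source, a filter, and a sink.
    public class CustomRuleTargets {

        // Custom SOURCE: hides the framework call (request.getParameter)
        // behind an in-house wrapper, so taint must be declared explicitly.
        static String readRequestField(javax.servlet.http.HttpServletRequest req, String name) {
            return req.getParameter(name);
        }

        // Custom FILTER: in-house validation routine; a custom rule marks
        // data passing through here as sanitized to cut false positives.
        static String restrictToDigits(String value) {
            return value != null && value.matches("\\d{1,9}") ? value : "0";
        }

        // Custom SINK: in-house data access helper that builds SQL; without
        // a custom rule, tainted data reaching it is never reported.
        static void runAccountQuery(java.sql.Connection conn, String accountId)
                throws java.sql.SQLException {
            try (java.sql.Statement stmt = conn.createStatement()) {
                stmt.execute("SELECT * FROM accounts WHERE id = " + accountId); // intentionally unsafe
            }
        }
    }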




Approaches
•   Auditing results from an automated scan
    – Typically must sample for larger applications (or really bad ones)
    – Many results tend to cluster on a per-application basis – coding idioms for error
      handling, resource lifecycle
•   Manual review
    – Must typically focus the effort for economic reasons
    – Hot spots from review of automated results
    – Security-critical functions from review of automated results – encoding,
      canonicalization
    – Security-critical areas
    – Startup, shutdown
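
As an illustration of “security-critical functions,” the hypothetical helpers below are the kind of code a manual reviewer spends focused time on, since a subtle bug in either affects every caller (names and behavior are illustrative, not from the presentation):

    import java.io.File;
    import java.io.IOException;

    // Hypothetical security-critical helpers worth focused manual review.
    public class SecurityCriticalHelpers {

        // Output encoding: reviewers verify every HTML-significant character
        // is handled, not just the obvious angle brackets.
        static String encodeForHtml(String input) {
            StringBuilder out = new StringBuilder(input.length());
            for (char c : input.toCharArray()) {
                switch (c) {
                    case '<'  -> out.append("&lt;");
                    case '>'  -> out.append("&gt;");
                    case '&'  -> out.append("&amp;");
                    case '"'  -> out.append("&quot;");
                    case '\'' -> out.append("&#39;");
                    default   -> out.append(c);
                }
            }
            return out.toString();
        }

        // Canonicalization: resolve the path before checking it, so "../"
        // sequences cannot escape the permitted base directory.
        static File resolveWithinBase(File baseDir, String userSuppliedName) throws IOException {
            File candidate = new File(baseDir, userSuppliedName).getCanonicalFile();
            if (!candidate.getPath().startsWith(baseDir.getCanonicalPath() + File.separator)) {
                throw new IOException("Path escapes base directory");
            }
            return candidate;
        }
    }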




Dynamic Testing
•   Advantages
•   Disadvantages
•   Approaches




Dynamic Analysis Advantages
•   Only requires a running system to perform a test
•   No requirement to have access to source code or binary code
•   No need to understand how to write software or execute builds
     – Tools tend to be more “fire and forget”
•   Tests a specific, operational deployment
     – Can find infrastructure, configuration and patch errors that Static Analysis tools will
       miss
Dynamic Analysis Disadvantages
•   Limited scope of what can be found
     – Application must be footprinted to find the test area
     – That can cause areas to be missed
     – You can only test what you have found
•   No access to actual instructions being executed
     – Tool is exercising the application
     – Pattern matching on requests and responses
Approaches
•   Where possible/reasonable confirm findings of the source code review
•   Determine if mitigating factors impact severity
     – WAFs, SSO, etc
     – Be careful with this
•   Look at things easiest to test on a running application
     – Macro error handling
     – Authentication and authorization implementation
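
A minimal sketch of those two quick checks against a running application, assuming placeholder URLs (nothing here comes from the presentation): does a protected page demand authentication, and does a deliberately malformed request produce a generic error rather than implementation detail?

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Minimal sketch of two quick dynamic checks; URLs are placeholders.
    public class DynamicProbe {

        public static void main(String[] args) throws IOException {
            // Authorization: an unauthenticated request to a protected resource
            // should be rejected or redirected to a login page, not served.
            int status = statusOf("https://app.example.com/admin/users");
            System.out.println("Unauthenticated /admin/users -> HTTP " + status
                    + (status == 200 ? "  (possible access control gap)" : ""));

            // Macro error handling: a deliberately bad parameter should yield a
            // generic error page, not a stack trace or database error message.
            HttpURLConnection conn = (HttpURLConnection)
                    new URL("https://app.example.com/report?id='").openConnection();
            System.out.println("Forced error -> HTTP " + conn.getResponseCode());
        }

        private static int statusOf(String url) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setInstanceFollowRedirects(false); // a 302 to the login page is the expected outcome
            return conn.getResponseCode();
        }
    }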




Bringing Approaches Together
•   These approaches feed one another
    – Valuable to be able to re-run tools and iterate between static and dynamic testing
•   Results must be communicated in the context of the Threat Model
    – Severity, compliance implications, etc




Presenting Results




Presenting Results
•   Universal developer reaction:
     – “That’s not exploitable”
     – “That’s not the way it works in production”
•   Demonstrations of attacks can inspire comprehension
     – This can be a trap – often demonstrating exploitability of a vulnerability takes longer
       than fixing the vulnerability
•   Properly characterize mitigating factors
     – Often deployed incorrectly
     – Code has a tendency to migrate from application to application
•   Risk is important – so is the level of effort required to fix




Questions?
Dan Cornell
dan@denimgroup.com
Twitter: @danielcornell

(210) 572-4400

Web: www.denimgroup.com
Blog: denimgroup.typepad.com




