Application Assessment Techniques

OWASP Northern Virginia

August 6th, 2009
This talk will review a number of application assessment techniques and discuss the types of security vulnerabilities they are best suited to identify as well as how the different approaches can be used in combination to produce more thorough and insightful results. Code review will be compared to penetration testing and the capabilities of automated tools will be compared to manual techniques. In addition, the role of threat modeling and architecture analysis will be examined. The goal is to illuminate assessment techniques that go beyond commodity point-and-click approaches to web application or code scanning.

From the OWASP Northern Virginia meeting August 6, 2009.

  1. Application Assessment Techniques – OWASP Northern Virginia – August 6th, 2009
  2. Agenda • Background • Common Pitfalls in Application Assessment • Moving Beyond – Threat Modeling – Code Review – Dynamic Testing • Presenting Results • Questions / Panel Discussion
  3. Background • Dan Cornell – Principal at Denim Group www.denimgroup.com – Software Developer: MCSD, Java 2 Certified Programmer – OWASP: Global Membership Committee, Open Review Project, SA Chapter Lead • Denim Group – Application Development • Java and .NET – Application Security • Assessments, penetration tests, code reviews, training, process consulting
  4. How Not To Do It
  5. How Not To Do It • Q: What are you all doing to address application security concerns in your organization? • A: We bought “XYZ Scanner” • Q: Okay… Are you actually using it? • A: We ran some scans • Q: And how did that go? • A: Oh we found some stuff… • Q: How did you address those issues? • A: I think we sent the report to the developers. Not sure what they did with it. I guess I ought to check in on that…
  6. Goals of Application Assessment • Vary by organization, by application and by assessment • Determine the security state of an application • Characterize risk to executives and decision makers • Prove a point • Set the stage for future efforts
  7. Common Pitfalls in Application Assessment
  8. Common Pitfalls in Application Assessment • Ad hoc approach – Non-repeatable, non-comprehensive • Reliance on automated tools – Can only find a subset of vulnerabilities (false negatives) – Even the good tools need tuning to reduce false positives • Current commercial tools are biased – Rulesets and capabilities typically over-focused on web applications • Too focused on one approach – Static and dynamic testing have different strengths – Economic concerns constrain the amount of testing that can be performed, so make the most of the time you have
  9. Moving Beyond • Automated versus Manual • Threat Modeling • Dynamic Testing • Source Code Review
  10. Automated Versus Manual
  11. Automated Versus Manual • Automated tools are great at: – Consistency – not getting tired – Data flow analysis • Automated tools are terrible for: – Understanding business context • Manual testing is great at: – Identifying business logic flaws • Manual testing is terrible for:
  12. Threat Modeling • Provides high-level understanding of the system – Useful for creating a structured test plan • Provides application context – Crucial for characterizing results • Complementary with Abuse Cases
  13. Threat Modeling Approach • Establish scope and system boundaries • Decompose the system into a Data Flow Diagram (DFD) • Assign potential threats based on asset types
  14. Threat Model Example
  15. Mapping Threats to Asset Types
      Threat Type                 | External Interactor | Process | Data Flow | Data Store
      S – Spoofing                | Yes                 | Yes     |           |
      T – Tampering               |                     | Yes     | Yes       | Yes
      R – Repudiation             | Yes                 | Yes     |           | Yes
      I – Information Disclosure  |                     | Yes     | Yes       | Yes
      D – Denial of Service       |                     | Yes     | Yes       | Yes
      E – Elevation of Privilege  |                     | Yes     |           |
  16. Threat Modeling • Result is a structured, repeatable list of threats to check – Strength is to find known problems repeatably • Augment with Abuse Cases – “What could go wrong” scenarios – More creative and unstructured
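The following is a minimal sketch, not from the original slides, of how the STRIDE-per-element mapping above can be turned into the structured, repeatable checklist the slide describes. It is written in Java to match the development stacks mentioned earlier; the DFD element names are purely hypothetical.

    import java.util.EnumSet;
    import java.util.LinkedHashMap;
    import java.util.Map;

    // Sketch only: derive a repeatable threat checklist from a DFD using the
    // STRIDE-per-element mapping shown on the "Mapping Threats to Asset Types" slide.
    public class StrideChecklist {

        enum Threat { SPOOFING, TAMPERING, REPUDIATION,
                      INFORMATION_DISCLOSURE, DENIAL_OF_SERVICE, ELEVATION_OF_PRIVILEGE }

        enum ElementType { EXTERNAL_INTERACTOR, PROCESS, DATA_FLOW, DATA_STORE }

        // Mirrors the mapping table: which threat types apply to which asset types
        static final Map<ElementType, EnumSet<Threat>> MAPPING = Map.of(
            ElementType.EXTERNAL_INTERACTOR,
                EnumSet.of(Threat.SPOOFING, Threat.REPUDIATION),
            ElementType.PROCESS,
                EnumSet.allOf(Threat.class),
            ElementType.DATA_FLOW,
                EnumSet.of(Threat.TAMPERING, Threat.INFORMATION_DISCLOSURE, Threat.DENIAL_OF_SERVICE),
            ElementType.DATA_STORE,
                EnumSet.of(Threat.TAMPERING, Threat.REPUDIATION,
                           Threat.INFORMATION_DISCLOSURE, Threat.DENIAL_OF_SERVICE));

        public static void main(String[] args) {
            // Hypothetical DFD elements for a simple web application
            Map<String, ElementType> dfd = new LinkedHashMap<>();
            dfd.put("Browser user", ElementType.EXTERNAL_INTERACTOR);
            dfd.put("Web application", ElementType.PROCESS);
            dfd.put("Login request over HTTP", ElementType.DATA_FLOW);
            dfd.put("Accounts database", ElementType.DATA_STORE);

            // One test-plan entry per (element, applicable threat) pair
            dfd.forEach((name, type) ->
                MAPPING.get(type).forEach(threat ->
                    System.out.println("Check " + threat + " against: " + name)));
        }
    }

The output is exactly the kind of structured, repeatable list of checks the slide refers to, which can then be prioritized or extended with abuse cases.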
  17. Dynamic, Static and Manual Testing
  18. Source Code Review • Advantages • Disadvantages • Approaches
  19. Static Analysis Advantages • Have access to the actual instructions the software will be executing – No need to guess or interpret behavior – Full access to all the software’s possible behaviors • Remediation is easier because you know where the problems are
  20. Static Analysis Disadvantages • Require access to source code or at least binary code – Typically need access to enough software artifacts to execute a build • Typically require proficiency running software builds • Will not find issues related to operational deployment environments
  21. Approaches • Run automated tools with default ruleset – Provides a first-cut look at the security state of the application – Identify “hot spots” • Craft custom rules specific to the application – 3rd party code – Break very large applications into manageable chunks – Application-specific APIs – sources, sinks, filter functions – Compliance-specific constructs • This is an iterative process
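As a concrete illustration of the source/sink/filter vocabulary in the slide above, here is a minimal Java sketch, not taken from the talk, of the kind of data flow an automated tool traces and the kind of application-specific filter function a custom rule would need to know about. All class and method names are hypothetical.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    // Sketch only: the source -> sink data flow a static analysis tool traces,
    // plus an application-specific filter a custom rule would treat as a sanitizer.
    public class AccountSearch {

        // SOURCE: attacker-controlled input; in a web application this would be
        // something like an HTTP request parameter
        static String taintedSource(String[] args) {
            return args.length > 0 ? args[0] : "";
        }

        // SINK: tainted data concatenated into dynamic SQL - the flow a tool should flag
        static ResultSet vulnerableQuery(Connection conn, String accountName) throws SQLException {
            Statement stmt = conn.createStatement();
            return stmt.executeQuery(
                "SELECT * FROM accounts WHERE name = '" + accountName + "'"); // injectable
        }

        // Remediation: parameterized query, easy to point to because the static
        // finding includes the exact file and line
        static ResultSet safeQuery(Connection conn, String accountName) throws SQLException {
            PreparedStatement ps = conn.prepareStatement("SELECT * FROM accounts WHERE name = ?");
            ps.setString(1, accountName);
            return ps.executeQuery();
        }

        // FILTER: an application-specific sanitizer; a custom rule teaches the tool
        // that flows passing through here are cleansed, reducing false positives
        static String filterAccountName(String input) {
            return input.replaceAll("[^A-Za-z0-9_-]", "");
        }
    }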
  22. Approaches • Auditing results from an automated scan – Typically must sample for larger applications (or really bad ones) – Many results tend to cluster on a per-application basis – coding idioms for error handling, resource lifecycle • Manual review – Must typically focus the effort for economic reasons – Hot spots from review of automated results – Security-critical functions from review of automated results – encoding, canonicalization – Security-critical areas – startup, shutdown
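To make the clustering point concrete, the short sketch below, which is not from the talk, shows the kind of application-wide error-handling idiom that causes automated results to pile up; auditing one representative instance often disposes of a large batch of findings at once. The helper class is entirely hypothetical.

    import java.io.PrintWriter;

    // Sketch only: a shared error-handling helper repeated across a codebase.
    // Every call site shows up as a separate tool finding, but they all share the
    // same root cause, so one audit decision can cover the whole cluster.
    public class ErrorHelper {

        public static void handle(Exception e, PrintWriter out) {
            e.printStackTrace();          // failure is logged to stderr and otherwise swallowed
            out.println(e.getMessage());  // implementation details echoed back to the client
        }
    }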
  23. Dynamic Testing • Advantages • Disadvantages • Approaches
  24. Dynamic Analysis Advantages • Only requires a running system to perform a test • No requirement to have access to source code or binary code • No need to understand how to write software or execute builds – Tools tend to be more “fire and forget” • Tests a specific, operational deployment – Can find infrastructure, configuration and patch errors that Static Analysis tools will miss
  25. Dynamic Analysis Disadvantages • Limited scope of what can be found – Application must be footprinted to find the test area – That can cause areas to be missed – You can only test what you have found • No access to actual instructions being executed – Tool is exercising the application – Pattern matching on requests and responses
  26. Approaches • Where possible/reasonable, confirm findings of the source code review • Determine if mitigating factors impact severity – WAFs, SSO, etc. – Be careful with this • Look at things easiest to test on a running application – Macro error handling – Authentication and authorization implementation
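Below is a minimal sketch, not part of the original presentation, of how a finding from code review might be confirmed against a running deployment and how macro error handling can be checked from the outside. The URL, parameter, and payload are hypothetical, and a probe like this should only be run against systems you are authorized to test.

    import java.io.BufferedReader;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.net.URLEncoder;

    // Sketch only: dynamically confirm a suspected SQL injection finding and look
    // for implementation details leaking through error handling.
    public class DynamicProbe {

        public static void main(String[] args) throws Exception {
            // Payload aimed at the injectable query identified during code review
            String payload = URLEncoder.encode("' OR '1'='1", "UTF-8");
            URL url = new URL("http://test.example.com/accounts?name=" + payload);

            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");

            int status = conn.getResponseCode();
            InputStream body = conn.getErrorStream() != null
                    ? conn.getErrorStream() : conn.getInputStream();

            StringBuilder response = new StringBuilder();
            BufferedReader reader = new BufferedReader(new InputStreamReader(body));
            for (String line; (line = reader.readLine()) != null; ) {
                response.append(line).append('\n');
            }
            reader.close();

            // Crude checks for macro error handling problems: stack traces or raw
            // database errors echoed back to the client
            boolean leaksDetails = response.indexOf("SQLException") >= 0
                    || response.indexOf("at java.") >= 0;

            System.out.println("HTTP status: " + status);
            System.out.println("Response leaks implementation details: " + leaksDetails);
        }
    }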
  27. Bringing Approaches Together • These approaches feed one another – Valuable to be able to re-run tools and iterate between static and dynamic testing • Results must be communicated in the context of the Threat Model – Severity, compliance implications, etc.
  28. Presenting Results
  29. Presenting Results • Universal developer reaction: – “That’s not exploitable” – “That’s not the way it works in production” • Demonstrations of attacks can inspire comprehension – This can be a trap – often demonstrating exploitability of a vulnerability takes longer than fixing the vulnerability • Properly characterize mitigating factors – Often deployed incorrectly – Code has a tendency to migrate from application to application • Risk is important – so is the level of effort required to fix
  30. Questions? Dan Cornell – dan@denimgroup.com – Twitter: @danielcornell – (210) 572-4400 – Web: www.denimgroup.com – Blog: denimgroup.typepad.com