Creating Practical Security Test-Cases for Web Applications


This is the talk given at the StarEast 2009 software quality conference on May 7th, 2009.


  1. Creating Practical Security Test-Cases for Web Applications
     Rafal M. Los, HP ASC Sr. Security Solutions Expert
     7 May 2009
  2. Agenda
     • Understanding the QA/Security Relationship
     • Negative Testing 360°
     • Building Negative Tests
     • Implementation and Execution
     • Looking Ahead
  3. Agenda: Understanding the QA/Security Relationship · Negative Testing 360° · Building Negative Tests · Implementation and Execution · Looking Ahead
  4. Background
     Why do QA teams care about security?
     • Traditionally, security is left to the security team
     • Security issues must be addressed throughout the SDL
     • QA teams add the missing element
     QA teams are crucial to security
     • They understand application test-cases
     • They understand application workflows
     • Security is a natural extension of quality
  5. QA – Security Relationship
     Similarities – core principles
     • Testing web application logic
     • Functional testing on live code
     • Specific data-sets used
     Differences – diverging goals
     • Stress-test vs. break test
     • Positive vs. negative data sets
     • Reinforcing the positive vs. uncovering the negative
  6. The “Hacker” Mindset
     Why would anyone want to break an application?
     • Fun
     • Malice
     • Profit
     • Attack users
     • Attack systems
     Mentality difference
     • QA asks: How does it perform?
     • Hacker asks: How can I break it?
  7. Whose Problem is Security?
     Many components to the security “problem”
     • Policy
     • Development frameworks/standards
     • Audit
     • Metrics
     Security is a pillar of overall quality
     • Does it function?
     • Does it perform?
     • Is it secure?
  8. Agenda: Understanding the QA/Security Relationship · Negative Testing 360° · Building Negative Tests · Implementation and Execution · Looking Ahead
  9. Negative Testing Overview
     What is negative testing?
     • Testing for unintended features
     • Testing using unintended data sets
     • Testing for unintended logic flow
     “Negative testing involves understanding the application, and finding ways to manipulate the code to perform in ways that create unintended exposures.”
  10. Negative Testing Overview
     Selection bias: confirmation bias
     • Testing to confirm desired results
     • Testing using known, desired data and flows
     • Testing which completely misses the point…
     “…confirmation bias is a tendency to search for or interpret new information in a way that confirms one's preconceptions and to avoid information and interpretations which contradict prior beliefs.”
  11. Negative Testing Mindset
     Traditional QA: proving the positive
     • Prove certain activity functions as defined by the business case
     • Requirements are easily defined in application flow and function
     Negative testing: finding the negative
     • Find negative (unintended) functions/results
     • No way to clearly define “bad stuff” as a requirement to test against
  12. Negative Testing - Data
     Types of negative data depend on purpose
     • Exploit a client
       • Client-side script or technology
     • Corrupt or crash a system
       • Database control characters
       • Non-native character sets, system characters
       • System commands
     • Retrieve data from the system
       • Database queries, control language
       • System commands
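To make the categories above concrete, a minimal sketch might organize negative data by attacker purpose. All payload strings here are classic illustrative examples, not an exhaustive attack dictionary, and the structure is my own rather than anything prescribed by the talk:

```python
# Illustrative catalog of negative test data, grouped by attacker purpose.
# Payloads are textbook examples only; real testing needs tuned data sets.
NEGATIVE_DATA = {
    "exploit_client": [
        "<script>alert(1)</script>",       # client-side script injection (XSS)
        "<img src=x onerror=alert(1)>",
    ],
    "corrupt_or_crash": [
        "\x00",                            # NUL control character
        "A" * 10000,                       # oversized input (overflow probe)
        "%00%ff%fe",                       # encoded / non-native bytes
    ],
    "retrieve_data": [
        "' OR '1'='1' --",                 # SQL injection probe
        "'; SELECT * FROM users; --",
        "| cat /etc/passwd",               # system command injection probe
    ],
}

def payloads_for(purpose):
    """Return the negative data set for a given testing purpose."""
    return NEGATIVE_DATA.get(purpose, [])
```

Grouping by purpose keeps each test run focused: a tester exercising a search field against a database back-end would pull only the `retrieve_data` set rather than spraying everything everywhere.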
  13. Negative Testing - Flow
     The goal is to manipulate application logic
     Identify “breakable” application logic
     • Create a race condition
     • Break application control-flow
     • Force an out-of-process action
     • Inject a rogue process
     Test-cases are based on proper application logic flows
     Requires in-depth knowledge of application flow
  14. Negative Testing - Tools
     Tools are an integral part of negative testing
     • Manual tools
       • Flow diagrams
       • Data sets
       • Logic charts
     • Automated tools
       • Black-box scanners
       • “Manu-matic” framework tools
  15. Negative Testing - Tools
     Automated tools *cannot* perform all testing
     • Workflow-based vulnerabilities
     • Complex attacks
       • Multi-stage
       • Business logic
     Human beings must…
     • Analyze the application logic and data
     • Guide tools
     • Interpret results
  16. Agenda: Understanding the QA/Security Relationship · Negative Testing 360° · Building Negative Tests · Implementation and Execution · Looking Ahead
  17. Building the Test
     Phase 1: Mapping the Application
       • Business logic
       • Application flow
       • Application surface
       • Application entry-points
     Phase 2: Tools-Based Testing
       • Tools-generated data-set testing
       • Automated crawler-based tests
       • Known defects
     Phase 3: Manual-Based Testing
       • Workflow defect tests
       • Business logic tests
       • Complex attacks
     Phase 4: Analysis & Correlation
       • Analyze results of automated & manual testing
       • Correlate Phase 2/Phase 3 results
  18. Building Data-Negative Tests
     All possible inputs: letters, numbers, special characters, control characters
     → Test unknown situational data (unknown impact)
     → Situational refinement
       • Database → SQL injection (SQLi)
       • Client-side → cross-site scripting (XSS)
       • XML database → XPath injection
     → Allowed (positive) characters are filtered out
     → Case-specific malicious data: cross-site scripting, SQL injection, overflows
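The situational-refinement step above can be sketched as a simple mapping from the technology behind an input to the case-specific attack classes worth trying. The category names below are my own labels mirroring the slide's funnel, not an established taxonomy:

```python
# Map an input's back-end technology ("sink") to the case-specific attack
# classes suggested by the refinement step. Categories follow the funnel:
# database -> SQLi, client-side -> XSS, XML database -> XPath injection.
SINK_TO_ATTACKS = {
    "database":    ["sql_injection"],
    "client_side": ["cross_site_scripting"],
    "xml_db":      ["xpath_injection"],
}

def refine(sink_type, default=("overflow",)):
    """Narrow 'all possible inputs' down to case-specific malicious tests.

    Unknown sinks fall back to generic probes (here, overflow tests),
    since the impact of negative data against them is unknown.
    """
    return SINK_TO_ATTACKS.get(sink_type, list(default))
```

The point of the mapping is economy: instead of firing every payload at every field, the tester spends the refined, case-specific data where it can actually do damage.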
  19. Building Data-Negative Tests
     Manual human testing
     • Must build test data sets manually
     • Sniper approach (can be precise)
     • Often very slow, methodical
     • Identifies false-positives
     Tools-based testing
     • Builds test data sets automatically
     • Shotgun approach (not precise)
     • Ability to be extremely fast
     • Trouble with false-positives
  20. Negative Data Sets
     Facts about negative data
     • Negative data sets are best generated by tools if the tester is not a security expert
     • Many pre-built negative data sets already exist
       • Sla.ckers.org – XSS cheat-sheet
     • Tools can point → click → test
       • Black-box testing tools save time & effort
       • Humans must analyze results
     • Must mix positive/negative data for completeness
       • Workflows often require good data to proceed
       • Automated negative-data testing fails without good data
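Because workflows often require good data to proceed, one common way to mix positive and negative data (a sketch of the idea, not a technique prescribed by the talk) is to keep every field valid except the one under test, so a rejected request can be traced to exactly one negative input:

```python
def mixed_test_cases(positive_record, negative_values):
    """Generate test records that are valid except for one negative field.

    positive_record: dict of field -> known-good value, so the workflow
                     itself can proceed past validation.
    negative_values: dict of field -> list of negative payloads to try.
    Yields (field, payload, record) so any failure maps to one input.
    """
    for field, payloads in negative_values.items():
        for payload in payloads:
            record = dict(positive_record)   # start from all-positive data
            record[field] = payload          # swap in exactly one negative value
            yield field, payload, record
```

With this one-field-at-a-time mutation, an automated run never fails merely because several required fields were garbage at once, which is exactly the failure mode the slide warns about.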
  21. Flow Analysis Testing
     Can a process step be bypassed?
     • Step 1: Verify identity → Step 2: Request quote → Step 3: Receive quote → Step 4: Submit for purchase
     • e.g., skip identity verification and submit a quote for someone else
     Can a process step be injected?
     • Step 1: Verify identity → Step 2: Request quote → Step 3: Receive quote → [Injected: Modify quote] → Step 4: Submit for purchase
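The bypass check in the diagram can be expressed as a comparison between the defined step order and the sequence the application actually let a tester execute. The step names follow the quote example above; the function itself is a hypothetical harness helper, not part of any real tool:

```python
# Expected control-flow for the quote example (names are illustrative).
EXPECTED_FLOW = ["verify_identity", "request_quote",
                 "receive_quote", "submit_purchase"]

def skipped_steps(expected, observed):
    """Return expected steps the application allowed the tester to bypass.

    observed: the sequence of steps the tester executed and the application
    accepted. Any expected step missing before a later accepted step is a
    candidate control-flow defect (e.g., purchase without identity check).
    """
    observed_set = set(observed)
    missing = [s for s in expected if s not in observed_set]
    if not observed:
        return []
    # Only steps that should have preceded an accepted later step count.
    last = max(expected.index(s) for s in observed if s in expected)
    return [s for s in missing if expected.index(s) < last]
```

If `submit_purchase` succeeds while `verify_identity` never ran, the function reports the bypassed step; a run through the full, proper flow reports nothing.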
  22. Flow Analysis Testing
     Manual human testing
     • Can analytically identify specific weak points
     • Readily distinguishes between success/failure
     • Often very slow, methodical
     • Ability to tailor testing to the situation/process
     Tools-based testing
     • Attacks every point; cannot distinguish between them
     • Difficulty distinguishing success/failure
     • Ability to be extremely fast
     • Cannot think, and therefore has limited abilities
  23. Flow Analysis Testing
     Facts about flow analysis testing
     • The tester must understand application flow
       • Start from the proper application flow, then turn it into negative paths
       • “Random manipulation” rarely works
     • Focus on application control-points
       • Key points in application logic
     • Don’t leave your testing to tools only
       • Most tools can’t identify control points or dive deep into flows
       • A human analyst has an obvious advantage: critical thinking
  24. Agenda: Understanding the QA/Security Relationship · Negative Testing 360° · Building Negative Tests · Implementation and Execution · Looking Ahead
  25. Negative Testing Process
     Analyze requirements → Execute Phase 1 → Execute Phase 2 → Execute Phase 3 → Execute Phase 4
     New functionality discovered? → return to analyzing requirements and repeat
  26. Testing Negative Data
     1. Identify all visible inputs (data “sources”)
        i. Input positive data; analyze behavior
        ii. Input negative data; analyze behavior
     2. Identify all hidden fields (data “sources”)
        i. Input positive data; analyze behavior
        ii. Input negative data; analyze behavior
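The loop above, over both visible inputs and hidden fields, can be sketched as a driver around whatever submit hook the test harness provides. `submit_fn` and the returned behavior strings are assumptions for illustration, not a real API:

```python
def probe_inputs(visible, hidden, positive, negative, submit_fn):
    """Run the positive-then-negative probe over every discovered input.

    visible/hidden: lists of field names found in the page (including
                    hidden form fields, which attackers can edit freely).
    positive/negative: the payloads to send for each pass.
    submit_fn(field, value) -> observed behavior (hypothetical harness hook).
    Returns (field, kind, payload, behavior) tuples for later analysis.
    """
    results = []
    for field in list(visible) + list(hidden):
        for kind, payload in (("positive", positive), ("negative", negative)):
            behavior = submit_fn(field, payload)              # input the data
            results.append((field, kind, payload, behavior))  # analyze later
    return results
```

Recording the positive baseline next to each negative result is what makes the "analyze behavior" steps possible: a deviation only means something relative to how the field behaved with good data.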
  27. Testing Negative Flow
     …as we’ve learned, this will be manual work
     • Map out all control-flows
     • Identify a potentially weak logic element
     • Walk the positive-control flow path
       • Ensure the proper positive path is understood
     • Map possible negative-control flow paths
     • Execute the negative-control flow paths
       • Analyze the difference between positive/negative attempts
     • Repeat as necessary to adjust/adapt until satisfied
       • Attempt at least 3–5 loop repetitions
  28. Identify Weaknesses
     How do you identify a weakness/defect?
     • Undesired application reaction
       • Crash?
       • Skipped control step?
     • Disclosure of unintended data
       • Debug information
       • Internal data
       • Controlled data
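Part of the "undesired reaction" check can be automated by scanning responses for the disclosure signals listed above. The marker list here is purely illustrative; a real harness needs signatures tuned to the application's stack:

```python
# Illustrative signatures of unintended disclosure; tune per application.
LEAK_MARKERS = [
    "stack trace",
    "Traceback (most recent call last)",   # Python debug output
    "ORA-",                                # Oracle error codes
    "SQLSyntaxErrorException",             # JDBC SQL errors
    "DEBUG",                               # leftover debug logging
]

def classify_response(body, expected_ok):
    """Flag responses that suggest a weakness/defect.

    Returns 'crash' for an empty response, 'disclosure' when a leak marker
    appears, 'unexpected' when the normal success text is missing,
    and 'ok' otherwise. expected_ok is text the positive path normally shows.
    """
    if not body:
        return "crash"
    for marker in LEAK_MARKERS:
        if marker in body:
            return "disclosure"
    if expected_ok not in body:
        return "unexpected"
    return "ok"
```

A classifier like this only triages; per the earlier slides, a human still has to confirm whether a flagged "disclosure" is a real defect or a false positive.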
  29. Agenda: Understanding the QA/Security Relationship · Negative Testing 360° · Building Negative Tests · Implementation and Execution · Looking Ahead
  30. Looking Ahead
     Addressing “deep” defects
     • Workflow-based security defects
     • Traditionally cannot be scanned for with automated tools
     Analysis of defects
     • When is a critical defect… not?
     • QA expertise → contextualized defects
  31. Questions?
     Rafal M. Los
     • Security Strategist
     • Application Security Specialist
     • Following the White Rabbit: http://www.communities.hp.com/securitysoftware/blogs/rafal
     • Digital Security SoapBox: http://preachsecurity.blogspot.com/
     • Email: Rafal@hp.com
     • Direct: (404) 606-6056
