OWASP-Italy-DayIII-Aaron_Visaggio.ppt

  1. Automatic Generation of Test Cases for Web Application Security: a Software Engineering Perspective
     • Corrado Aaron Visaggio
     • Prof. of Software Testing and Database
     • Dept. of Engineering, University of Sannio
  2. Agenda
     • Software Testing Automation
     • Web Application Testing
     • Web Pen Tools Taxonomy
     • Web Pen Testing Automation
  3. Introduction and Goals
     • Web app pen testing currently has no relation to the traditional software system testing process
     • Web app pen testing could benefit from automation (more effectiveness, lower costs)
     • Software engineering research should point out directions for evolving the approaches used in web app pen testing
     • The goal: establish a road map for web app pen test automation
       - Where we are, where we are going, what we have, what we need.
  4. What is Test Automation
     • Software test automation activities can be performed in three different scopes:
       - Enterprise-oriented test automation, where the major focus is to automate an enterprise-oriented test process so that it can be used and reused to support different product lines and projects in an organization.
       - Product-oriented test automation, where test automation activities focus on a specific software product line to support its related testing activities.
       - Project-oriented test automation, where the test automation effort is aimed at a specific project and its test process.
  5. Automating Software Testing: Frameworks (1/2)
     • Modular Framework
       - The modular framework is the natural progression from Record-and-Playback
       - It seeks to minimize repetition of code by grouping similar actions into "modules" (e.g., login)
       - Test data lives in the script or in an internal table
     • Data-Driven and Keyword-Driven Frameworks
       - These frameworks are similar in that the data is separated from the test script
       - The script is just a "driver", or delivery mechanism, for the data
       - In keyword-driven testing, both the navigation data and the test data are contained in the data source
       - In data-driven testing, only the test data is contained in the data source
       - Both aim at validating the application at the level of its functionality (a minimal sketch follows below)
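A minimal data-driven sketch of the idea above: the test logic is written once and an external data source drives every iteration. The login() function, the credential table, and the expected results are hypothetical placeholders, not part of the original slides.

```python
# Data-driven testing sketch: one script, many data rows.
import csv
import io

# Stand-in for the system under test (hypothetical).
def login(username: str, password: str) -> bool:
    return username == "admin" and password == "s3cret"

# In a real framework the data source would be an external CSV/Excel file;
# it is inlined here so the sketch stays self-contained.
TEST_DATA = """username,password,expected
admin,s3cret,True
admin,wrong,False
guest,guest,False
"""

def run_data_driven_tests() -> None:
    for row in csv.DictReader(io.StringIO(TEST_DATA)):
        expected = row["expected"] == "True"
        actual = login(row["username"], row["password"])
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status}: login({row['username']!r}, ...) -> {actual}")

if __name__ == "__main__":
    run_data_driven_tests()
```

In a keyword-driven variant, the data source would also carry the navigation steps (e.g., "open page", "fill field", "click button"), not just the input values.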
  6. Automating Software Testing: Frameworks (2/2)
     • Model-Based Testing
       - Model-based testing is software testing in which test cases are derived, in whole or in part, from a model that describes some (usually functional) aspects of the system.
       - Model-based testing for complex software systems is still an evolving field.
       - It usually relies on intermediate models of the system, such as a control/data flow graph or a finite state machine (see the sketch below).
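A small sketch of the finite-state-machine case: the application's navigation is modeled as an FSM and test sequences are derived by enumerating paths from the start page to terminal pages. The model below is a hypothetical example, not taken from the slides.

```python
# Model-based testing sketch: derive test sequences from an FSM of page navigation.
from typing import Dict, List, Optional

# States are pages; transitions are labeled with the action that triggers them.
FSM: Dict[str, Dict[str, str]] = {
    "login":   {"submit_credentials": "home"},
    "home":    {"open_search": "search", "open_profile": "profile"},
    "search":  {"run_query": "results"},
    "results": {},
    "profile": {},
}

def derive_test_sequences(state: str, path: Optional[List[str]] = None) -> List[List[str]]:
    """Depth-first enumeration of action sequences; each complete path is a test case."""
    path = path or []
    transitions = FSM.get(state, {})
    if not transitions:                      # terminal page: the path is a test case
        return [path]
    sequences: List[List[str]] = []
    for action, target in transitions.items():
        sequences.extend(derive_test_sequences(target, path + [action]))
    return sequences

if __name__ == "__main__":
    for seq in derive_test_sequences("login"):
        print(" -> ".join(seq))
```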
  7. Automating Software Testing: Techniques, Models, and Tools
     • The main benefits:
       - Reduce manual software testing operations and eliminate redundant testing efforts
       - Produce more systematic, repeatable software tests and generate more consistent testing results
       - Speed up the software testing process and reduce testing cost and time during the software life cycle
       - Increase the quality and effectiveness of the software test process by achieving pre-defined test adequacy criteria with better test coverage in a limited schedule
  8. Software Testing Tools
  9. Web Application Testing
     • White box testing:
       - The Web Application Test Model (WATM) proposed by Liu et al. is based on an Object Model (client-server exchange of pages) and a Structure Model (which captures data flow through four graphs: CFG, ICFG, OCFG, CCFG)
       - Ricca and Tonella propose a Navigational Model to capture the control flow, including static pages, dynamic pages, and server pages
       - A test case for a Web application is defined as a sequence of pages to be visited, plus the input values to be provided to pages containing forms (a minimal sketch follows below)
       - Coverage criteria: branch coverage, node coverage, path coverage
       - Data flow coverage criteria: all-uses, all-defs, all-def-use
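A small sketch of that test-case notion: a test case is a sequence of pages plus the input values for pages containing forms, and node coverage is measured against the navigation model. The page names and inputs are hypothetical examples.

```python
# Web test cases as page sequences plus form inputs, with node coverage measurement.
from dataclasses import dataclass, field
from typing import Dict, List, Set

@dataclass
class WebTestCase:
    pages: List[str]                                                   # pages visited, in order
    inputs: Dict[str, Dict[str, str]] = field(default_factory=dict)    # page -> form field values

# Nodes of a hypothetical navigation model.
NAV_MODEL_NODES: Set[str] = {"index.php", "login.php", "search.php", "results.php", "admin.php"}

def node_coverage(test_suite: List[WebTestCase]) -> float:
    """Fraction of navigation-model nodes touched by at least one test case."""
    visited = {page for tc in test_suite for page in tc.pages}
    return len(visited & NAV_MODEL_NODES) / len(NAV_MODEL_NODES)

if __name__ == "__main__":
    suite = [
        WebTestCase(["index.php", "login.php", "search.php", "results.php"],
                    {"login.php": {"user": "alice", "pwd": "x"},
                     "search.php": {"q": "owasp"}}),
    ]
    print(f"node coverage: {node_coverage(suite):.0%}")
```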
  10. Web Application Testing
     • Black box strategies:
       - Di Lucca et al. exploit decision tables as a combinatorial model for representing the behavior of the Web application and producing test cases
       - Each response is associated with a specific condition
       - Each unique combination of conditions and actions is a variant, represented as a single row in the table (see the decision-table sketch below)
       - Andrews et al. make use of Finite State Machines (FSMs) for modeling software behavior and deriving test cases from them
       - The test generation process comprises two phases:
         - In the first phase, the Web application is modeled by a hierarchical collection of FSMs, where the bottom-level FSMs are formed by Web pages and parts of Web pages, while a top-level FSM represents the whole application
         - In the second phase, test cases are generated from this representation
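A minimal decision-table sketch in the spirit of the approach above: every unique combination of conditions (a variant) becomes one test case with an expected action. The conditions and actions are hypothetical examples, not from the slides.

```python
# Decision-table test generation sketch: one test case per variant (table row).
from itertools import product

CONDITIONS = ["valid_user", "valid_password"]

def expected_action(valid_user: bool, valid_password: bool) -> str:
    # Expected behaviour for each variant of the decision table.
    return "show_home_page" if valid_user and valid_password else "show_error_page"

def generate_test_cases():
    """Yield (variant, expected action) for every row of the decision table."""
    for values in product([True, False], repeat=len(CONDITIONS)):
        variant = dict(zip(CONDITIONS, values))
        yield variant, expected_action(**variant)

if __name__ == "__main__":
    for variant, action in generate_test_cases():
        print(variant, "->", action)
```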
  11. Web Pen Testing: a taxonomy of tools
     • Gray box testing:
       - User session testing: approaches based on data captured in user sessions collect user interactions with the Web server and transform them into test cases
       - Captured data about user sessions can be transformed into a set of HTTP requests, each one providing a separate test case (a replay sketch follows below)
       - Advantages:
         - Test cases are generated without analyzing the internal structure of the Web application, which reduces costs
         - The technique is less dependent on the heterogeneous and fast-changing technologies used by Web applications
       - Drawback: the effectiveness of user session techniques depends on the set of user session data collected
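A sketch of user-session-based test generation: a few captured requests (here, hypothetical access-log style entries) are turned into an ordered set of replayable test cases. The log entries, the host name, and the dry_run default are illustrative assumptions.

```python
# User-session testing sketch: each captured HTTP request becomes one test case.
import urllib.parse
import urllib.request

# Hypothetical captured session: (method, path, parameters).
CAPTURED_SESSION = [
    ("GET",  "/index.php",  {}),
    ("POST", "/login.php",  {"user": "alice", "pwd": "x"}),
    ("GET",  "/search.php", {"q": "owasp"}),
]

def replay(base_url: str, session, dry_run: bool = True) -> None:
    """Replay the captured requests in order; with dry_run, only print the test cases."""
    for method, path, params in session:
        data = urllib.parse.urlencode(params).encode() if method == "POST" else None
        query = "?" + urllib.parse.urlencode(params) if method == "GET" and params else ""
        url = base_url + path + query
        if dry_run:
            print(f"test case: {method} {url} data={data}")
            continue
        req = urllib.request.Request(url, data=data, method=method)
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(method, url, "->", resp.status)

if __name__ == "__main__":
    replay("http://example.test", CAPTURED_SESSION)
```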
  12. Web Pen Testing Tools: a taxonomy (1/2)
     • Source code analyzers
       - Basic static analyzers run simple text-based searches for strings and patterns in source-code files
       - Dynamic analyzers first attempt to construct all possible runtime function call stacks; next, they attempt to determine whether a call (to strcpy, for example) can be reached by data received as input from the user or the environment
     • Web application scanners typically record a good HTTP transaction, then attempt to inject malicious payloads into subsequent transactions and watch for indications of success in the resulting HTTP response (a toy sketch follows below)
     • Database scanners typically act as an SQL client and perform various database queries to analyze the database's security configuration
     • Binary analysis tools attempt to "fuzz" the input parameters, looking for signs of an application crash, common vulnerability signatures, and other improper behavior
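A toy illustration of the scanner behaviour described above: take a known-good request, substitute attack payloads into one parameter, and look for error signatures in the response. The target URL, payloads, and signatures are hypothetical; a real scanner is far more sophisticated.

```python
# Minimal "inject payload, watch the response" scanner sketch.
import urllib.error
import urllib.parse
import urllib.request

PAYLOADS = ["' OR '1'='1", "<script>alert(1)</script>"]
ERROR_SIGNATURES = ["SQL syntax", "ODBC", "<script>alert(1)</script>"]

def scan_parameter(url: str, params: dict, target_param: str) -> None:
    for payload in PAYLOADS:
        mutated = dict(params, **{target_param: payload})
        full_url = url + "?" + urllib.parse.urlencode(mutated)
        try:
            with urllib.request.urlopen(full_url, timeout=5) as resp:
                body = resp.read().decode(errors="replace")
        except urllib.error.URLError as exc:
            print(f"request failed for payload {payload!r}: {exc}")
            continue
        hits = [s for s in ERROR_SIGNATURES if s in body]
        if hits:
            print(f"possible vulnerability with payload {payload!r}: matched {hits}")

if __name__ == "__main__":
    # Hypothetical known-good transaction recorded beforehand.
    scan_parameter("http://example.test/search.php", {"q": "owasp"}, "q")
```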
  13. Web Pen Testing Tools: a taxonomy (2/2)
     • Configuration analysis tools operate against the application configuration files, host settings, or Web/application server configuration
     • Proxies. Security analysts typically use Web proxies to intercept Web traffic dynamically. Proxies sit between the tester's Web browser and the Web server hosting the application. Most Web proxies let you trap the HTTP request (after it leaves the browser) and the response (before it returns to the browser); a bare-bones sketch follows below
     • Runtime analysis tools essentially act like profilers and intercept function calls as they occur. They can also be configured to log parameter values, as well as record two different sessions (say, one as an administrator and the other as a regular user) and then compare the two
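A bare-bones forwarding proxy illustrating the "trap the request, trap the response" idea: it logs each request line and response status while relaying plain HTTP traffic. This is only a sketch (no HTTPS/CONNECT support, no request editing); the listen port is an arbitrary assumption.

```python
# Minimal logging HTTP proxy sketch (plain HTTP GET only).
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class LoggingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # A browser configured to use a proxy sends the absolute URL as the path.
        print("intercepted request:", self.command, self.path)
        try:
            with urllib.request.urlopen(self.path, timeout=10) as upstream:
                body = upstream.read()
                print("intercepted response:", upstream.status, "bytes:", len(body))
                self.send_response(upstream.status)
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
        except Exception as exc:
            self.send_error(502, f"upstream error: {exc}")

if __name__ == "__main__":
    # Point the browser's HTTP proxy setting at 127.0.0.1:8081 to see traffic logged.
    HTTPServer(("127.0.0.1", 8081), LoggingProxy).serve_forever()
```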
  14. Three Levels of Web Application Security Defects
     • Domain level: the user process allows illegal access to sensitive resources; for instance, when web pages that should be accessed over HTTPS can also be reached through a plain HTTP connection (a small check is sketched below)
     • Design level: the design itself exposes security bugs; an example is the SQL injection vulnerability
     • Technology level: the bug is due to the specific technology (programming language, DBMS, frameworks, APIs, and so forth)
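A small check for the domain-level defect mentioned above: pages that should only be served over HTTPS must not be reachable over plain HTTP. The list of sensitive paths and the host are hypothetical examples.

```python
# Domain-level check sketch: is HTTPS enforced for sensitive pages?
import urllib.error
import urllib.request

SENSITIVE_PATHS = ["/account", "/admin", "/checkout"]

def check_https_enforcement(host: str) -> None:
    for path in SENSITIVE_PATHS:
        url = f"http://{host}{path}"
        try:
            # The default opener follows redirects, so the final URL tells us
            # whether the server upgraded the connection to HTTPS.
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.geturl().startswith("https://"):
                    print(f"OK:   {url} redirects to HTTPS")
                else:
                    print(f"FAIL: {url} is served over plain HTTP")
        except urllib.error.URLError as exc:
            print(f"INFO: {url} not reachable ({exc})")

if __name__ == "__main__":
    check_https_enforcement("example.test")
```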
  15. Automating Web App Pen Testing (1/2) – Next Steps of Research
     • A holistic process that integrates all the phases:
       - modeling attacks,
       - executing the pen test,
       - interpreting the pen test results
     • Modeling attacks:
       - Model the attack (misuse cases, threat modeling, abuse cases)
       - Model the web application (WATM, Navigational Model, FSMs)
       - Link attack models with web app models (a sketch follows below)
     • Executing the pen test:
       - Standard definitions of scripts for managing the vulnerability lifecycle
       - Adequacy criteria for test selection
       - Integrating semantics (thesauri, dictionaries, heuristics)
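One way to picture "link attack models with web app models": each misuse case (attack pattern) is associated with the navigation-model pages whose characteristics make them candidate targets, producing a concrete pen-test work list. All names and predicates below are hypothetical illustrations.

```python
# Sketch: linking an attack model (misuse cases) to a web application model (pages).
from dataclasses import dataclass
from typing import List

@dataclass
class Page:
    name: str
    has_form: bool
    queries_db: bool

# Hypothetical navigation model of the application under test.
NAVIGATION_MODEL = [
    Page("index.php",  has_form=False, queries_db=False),
    Page("login.php",  has_form=True,  queries_db=True),
    Page("search.php", has_form=True,  queries_db=True),
]

# Attack model: misuse case -> predicate selecting the pages it applies to.
ATTACK_MODEL = {
    "SQL injection": lambda p: p.has_form and p.queries_db,
    "Cross-site scripting": lambda p: p.has_form,
}

def pen_test_targets() -> List[str]:
    """Cross the two models to obtain concrete (attack, page) pen-test targets."""
    targets = []
    for attack, applies in ATTACK_MODEL.items():
        for page in NAVIGATION_MODEL:
            if applies(page):
                targets.append(f"{attack} on {page.name}")
    return targets

if __name__ == "__main__":
    print("\n".join(pen_test_targets()))
```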
  16. Automating Web App Pen Testing (2/2) – Next Steps of Research
     • Interpretation of test results:
       - Decision models
         - Mapping vulnerabilities to countermeasures (a minimal sketch follows below)
         - Impact model of security bugs
       - Test process models
         - Stop criteria
         - Quality model for pen testing
         - Cost model for pen testing
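A minimal decision-model sketch for interpreting pen-test results: detected vulnerabilities are mapped to countermeasures and ranked by an impact score. The catalogue and scores are hypothetical placeholders, not a proposed standard.

```python
# Decision-model sketch: vulnerability -> (countermeasure, impact score).
COUNTERMEASURES = {
    "SQL injection":          ("use parameterized queries", 9),
    "Cross-site scripting":   ("encode output, validate input", 7),
    "HTTP on sensitive page": ("force HTTPS with redirects", 6),
}

def prioritize(findings):
    """Return (vulnerability, countermeasure, impact) tuples sorted by impact."""
    ranked = [(v, *COUNTERMEASURES.get(v, ("no mapping available", 0))) for v in findings]
    return sorted(ranked, key=lambda item: item[2], reverse=True)

if __name__ == "__main__":
    for vuln, fix, impact in prioritize(["Cross-site scripting", "SQL injection"]):
        print(f"impact {impact}: {vuln} -> {fix}")
```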
  17. Conclusion and future work
     • Web pen testing could benefit from advances in software testing automation
     • At the same time, the scientific and industrial communities should focus on some aims:
       - Developing models and integrating them into the design of web applications
       - Realizing a shared framework for security knowledge
       - Defining proper test models
