  1. CSCE 522 Secure Software Development Best Practices
  2. Reading
     • This lecture:
       • Jan Jürjens, Towards Development of Secure Systems using UMLsec, http://citeseer.ist.psu.edu/536233.html
  3. Application of Touchpoints
     [Diagram: the software security touchpoints mapped onto SDLC artifacts (Requirements and Use Cases, Architecture and Design, Test Plans, Code, Tests and Test Results, Feedback from the Field): 1. Code Review (Tools), 2. Risk Analysis, 3. Penetration Testing, 4. Risk-Based Security Tests, 5. Abuse Cases, 6. Security Requirements, 7. Security Operations, plus External Review.]
  4. Design Flaws
     • Account for roughly 50% of security problems
     • Need: explicitly identify risk
     • Quantify impact: tie technology issues and concerns to the business
     • Continuous risk management
  5. Security Risk Analysis
     • Risk analysis: identifying and ranking risks
     • Risk management: performing risk analysis exercises, tracking risks, mitigating risks
     • Need: understanding of business impact
  6. Risk Analysis
     • Address risk as early as possible, at the requirements level
     • Impact:
       • Legal and/or regulatory risk
       • Financial or commercial considerations
       • Contractual considerations
     • Requirements fall into “must-haves,” “important-to-haves,” and “nice-but-unnecessary-to-haves” (see the ranking sketch below)
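A minimal sketch (an illustration, not material from the lecture) of how such a ranking can be quantified: each risk gets a likelihood and a business-impact score, and exposure = likelihood × impact orders the list. The risk names and the 1-5 scales below are assumptions.

```python
# Illustrative sketch: ranking requirements-level risks by exposure.
# The 1-5 scales and the example risks are assumptions, not lecture content.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (negligible) .. 5 (severe business/legal impact)

    @property
    def exposure(self) -> int:
        return self.likelihood * self.impact

risks = [
    Risk("Unencrypted customer data in transit (regulatory)", 4, 5),
    Risk("Session fixation in the web front end (commercial)", 3, 4),
    Risk("Verbose error pages leak stack traces (contractual)", 4, 2),
]

# Highest exposure first: these map to the "must-have" mitigations.
for r in sorted(risks, key=lambda r: r.exposure, reverse=True):
    print(f"{r.exposure:>2}  {r.name}")
```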
  7. Basic Risk Analysis
     • Tailored for specific vulnerabilities
     • High-level overview
     • Meaningful results
     • Cross-tier analysis: different trust zones
     • Use of deployment patterns
     • Decomposing the software on a component-by-component basis
  8. Risk Analysis Practice
     • Often performed in an ad hoc manner
     • Does not scale; neither repeatable nor consistent
     • Depends on the knowledge and expertise of the analyst
     • Results are difficult to compare
  9. Attack Resistance Analysis
     • Uses information about known attacks, attack patterns, and vulnerabilities (known problems):
       • Identify general flaws: using the secure-design literature and checklists
       • Map attack patterns: based on abuse cases and attack patterns
       • Identify risks in the architecture: using checklists
       • Understand and demonstrate the viability of known attacks
  10. Ambiguity Analysis
     • Discover new risks
     • Parallel activities of team members → unify understanding
       • Each analyst keeps a private list of possible flaws
       • Then the team describes together how the system works
     • Needs a team of experienced analysts
  11. Weakness Analysis
     • Understanding the impact of external software dependencies
       • Middleware
       • Outside libraries
       • Distributed code
       • Services
       • Physical environment
       • Etc.
  12. Application of Touchpoints
     [Same touchpoints roadmap diagram as slide 3.]
  13. Misuse Cases
     • Software development: making software do something
       • Describes features and functions
       • Assumes everything goes right
     • Also need: security, performance, reliability
       • Service level agreements are legally binding
     • How to model non-normative behavior in use cases?
       • Think like a bad guy
  14. Software Vendor Accountability
     • Proper implementation of security features
     • Looking for known security flaws
     • Passing third-party validation
     • Source code analysis
  15. Checking for Known Vulnerabilities
     • Needs tool support (see the sketch below)
     • Possible attacks and attack types
     • How the software behaves if something goes WRONG
     • What motivates an attacker?
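One common way such a tool works is to compare the components an application uses against a list of publicly known vulnerable versions. The sketch below is a hypothetical illustration: the package names, versions, and advisories are made up, and a real tool would consult an actual vulnerability database.

```python
# Sketch of a known-vulnerability check: compare an application's
# dependencies against a local advisory list. All names, versions, and
# advisories are hypothetical.
KNOWN_VULNERABLE = {
    ("examplelib", "1.2.0"): "buffer overflow in parser (hypothetical advisory)",
    ("webtoolkit", "0.9.1"): "authentication bypass (hypothetical advisory)",
}

app_dependencies = [("examplelib", "1.2.0"), ("webtoolkit", "1.0.0")]

for name, version in app_dependencies:
    advisory = KNOWN_VULNERABLE.get((name, version))
    if advisory:
        print(f"FLAG {name} {version}: {advisory}")
    else:
        print(f"ok   {name} {version}")
```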
  16. Misuse Cases
     • Extend use case diagrams
     • Represent actions the system should prevent
     • Represent together:
       • Desired functionality
       • Undesired actions
     • Security is an emergent property → must be built in from the ground up
     • Make trade-offs explicit
  17. Misuse Cases
     • Analyze system design and requirements
       • Assumptions
       • Failure of assumptions
       • Attack patterns
     • Software that is used is also going to be attacked
     • What can a bad guy do, and how should the system react to malicious use?
  18. Misuse Case Development
     • Team work between software developers and security experts
     • Identifying and documenting threats
     • Creating anti-requirements: how the system can be abused
     • Creating an attack model (see the sketch below)
       • Select attack patterns relevant to the system
       • Include anyone who can gain access to the system
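As one possible illustration (not a format prescribed by the lecture), an anti-requirement and its attack model can be recorded as structured data so developers and security experts capture the same fields; the schema and the example misuse case below are assumptions.

```python
# Hypothetical schema for documenting a misuse case / anti-requirement.
# Field names and the example content are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class MisuseCase:
    threat: str                     # what the bad guy tries to do
    actor: str                      # anyone who can gain access to the system
    anti_requirement: str           # how the system could be abused
    attack_patterns: list[str] = field(default_factory=list)
    mitigations: list[str] = field(default_factory=list)

brute_force = MisuseCase(
    threat="Gain access to another user's account",
    actor="External attacker with network access to the login page",
    anti_requirement="Login accepts unlimited password guesses",
    attack_patterns=["password brute force", "credential stuffing"],
    mitigations=["rate limiting / account lockout", "logging and alerting"],
)
print(brute_force)
```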
  19. Application of Touchpoints
     [Same touchpoints roadmap diagram as slide 3.]
  20. Software Testing
     • Verifies that the application fulfills its functional requirements
     • Dynamic, functional tests late in the SDLC
     • Contextual information
  21. Security Testing
     • Look for unexpected but intentional misuse of the system
     • Must test for all potential misuse types, using:
       • Architectural risk analysis results
       • Abuse cases
     • Verify that (see the test sketch below):
       • All intended security features work (white hat)
       • Intentional attacks cannot compromise the system (black hat)
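A minimal sketch of the two kinds of checks, written as pytest-style tests against a made-up authenticate() function with a lockout policy: the first verifies that an intended security feature works (white hat), the second verifies that a brute-force misuse attempt cannot succeed (black hat). The application code here is purely illustrative.

```python
# Sketch of white-hat vs. black-hat security tests (pytest style).
# `authenticate` and the lockout policy are hypothetical application code.
LOCK_THRESHOLD = 5
_failed: dict[str, int] = {}

def authenticate(user: str, password: str) -> bool:
    if _failed.get(user, 0) >= LOCK_THRESHOLD:
        return False                      # account locked
    ok = (user, password) == ("alice", "correct-horse")
    if not ok:
        _failed[user] = _failed.get(user, 0) + 1
    return ok

def test_valid_login_works():             # white hat: the feature works as intended
    assert authenticate("alice", "correct-horse")

def test_brute_force_is_locked_out():     # black hat: misuse cannot succeed
    for _ in range(LOCK_THRESHOLD):
        assert not authenticate("alice", "wrong-guess")
    # Even the correct password is rejected once the account is locked.
    assert not authenticate("alice", "correct-horse")

if __name__ == "__main__":                # can also be run directly
    test_valid_login_works()
    test_brute_force_is_locked_out()
    print("both checks passed")
```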
  22. Penetration Testing
     • Testing for a negative: what must not exist in the system
     • Difficult: how do you prove “non-existence”?
     • If penetration testing does not find errors, then:
       • We can conclude that under the given circumstances no security faults occurred
       • This gives little assurance that the application is immune to attacks
     • Can become a mere feel-good exercise
  23. Penetration Testing Today
     • Often performed
     • Applied to finished products
     • Outside → in approach
     • Late SDLC activity
     • Limitation: too little, too late
  24. Late-Lifecycle Testing
     • Limitations:
       • Design and coding errors are discovered too late
       • Higher cost than detection earlier, at the design level
       • Options to remedy discovered flaws are constrained by both time and budget
     • Advantage: evaluates the system in its final operating environment
  25. Success of Penetration Testing
     • Depends on the skill, knowledge, and experience of the tester
     • Important: interpretation of the results
     • Disadvantages of penetration testing:
       • Often used as an excuse to declare victory and go home
       • Everyone looks good after negative test results
  26. Application of Touchpoints
     [Same touchpoints roadmap diagram as slide 3.]
  27. Software Testing
     • Running a program or system with the intent of finding errors
     • Evaluating the capability of the system and determining whether it meets its requirements
     • Physical processes vs. software
     • Testing purposes:
       • To improve quality
       • For Verification & Validation (V&V)
       • For reliability estimation
  28. Quality Assurance
     • External quality: correctness, reliability, usability, integrity
     • Internal (engineering) quality: efficiency, testability, documentation, structure
     • Future (adaptability) quality: flexibility, reusability, maintainability
  29. Correctness Testing
     • Black box:
       • Test data are derived from the specified functional requirements, without regard to the final program structure
       • Also called data-driven, input/output-driven, or requirements-based testing
       • Functional testing
       • No implementation details of the code are considered
  30. Correctness Testing
     • White box:
       • The structure of the software under test is visible to the tester
       • Testing plans are based on the details of the software implementation
       • Test cases are derived from the program structure
       • Also called glass-box testing, logic-driven testing, or design-based testing (see the contrast sketch below)
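To make the contrast concrete (an illustration, not from the slides): for a hypothetical withdraw() function, the black-box test is derived only from the stated requirement, while the white-box test is chosen specifically to exercise the overdraft branch visible in the implementation.

```python
# Illustrative black-box vs. white-box tests for a hypothetical function.
def withdraw(balance: float, amount: float) -> float:
    """Requirement: return the new balance; reject overdrafts and
    non-positive amounts by raising ValueError."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    if amount > balance:                 # overdraft branch
        raise ValueError("insufficient funds")
    return balance - amount

# Black box: derived from the requirement only (input/output behaviour).
assert withdraw(100.0, 30.0) == 70.0

# White box: chosen to cover the overdraft branch seen in the code.
try:
    withdraw(10.0, 20.0)
except ValueError:
    pass
else:
    raise AssertionError("overdraft branch was not exercised")
```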
  31. Performance Testing
     • Goals: bottleneck identification, performance comparison and evaluation, etc.
     • Driven by explicit or implicit requirements
     • "Performance bugs" are often design problems
     • Test: usage, throughput, stimulus-response time, queue lengths, etc. (see the timing sketch below)
     • Resources to be tested: network bandwidth requirements, CPU cycles, disk space, disk access operations, memory usage, etc.
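A minimal sketch of a stimulus-response-time measurement; handle_request is a stand-in for the system under test, and a real performance test would also drive concurrency and track throughput, queue lengths, and resource usage.

```python
# Sketch: measure stimulus-response time for a placeholder request handler.
import statistics
import time

def handle_request(payload: str) -> str:   # stand-in for the system under test
    time.sleep(0.001)                      # simulate work
    return payload.upper()

samples = []
for i in range(200):
    start = time.perf_counter()
    handle_request(f"request-{i}")
    samples.append(time.perf_counter() - start)

print(f"mean {statistics.mean(samples) * 1000:.2f} ms")
print(f"p95  {sorted(samples)[int(0.95 * len(samples))] * 1000:.2f} ms")
```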
  32. Reliability Testing
     • Probability of failure-free operation of a system
     • Dependable software does not fail in unexpected or catastrophic ways
     • Difficult to test
     • (See the later lecture on software reliability; a commonly used model is sketched below)
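For reference ahead of that lecture, a commonly used model (assuming a constant failure rate λ) expresses the probability of failure-free operation over an interval of length t as:

```latex
R(t) = P(\text{no failure in } [0, t]) = e^{-\lambda t},
\qquad \text{MTTF} = \int_0^\infty R(t)\, dt = \frac{1}{\lambda}
```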
  33. Security Testing
     • Test: finding flaws in the software that can be exploited by attackers
     • Quality, reliability, and security are tightly coupled
     • Testing software behavior
       • Need: a risk-based approach using system architecture information and an attacker model
  34. Behavior in the Presence of Malicious Attack
     • What happens when the software fails?
       • Safety-critical systems
     • Track risk over time
     • Security is relative to:
       • The information and services protected
       • The skills and resources of adversaries
       • The cost of protection
     • System vulnerabilities
  35. Malicious Input
     • Software takes input
     • Can the input be trusted?
       • Malformed or malicious input may lead to security compromise
       • What is the input?
         • Data vs. control (see the sketch below)
     • Attacker toolkit
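A minimal sketch of the data-vs-control distinction, using SQL injection as the example: the same attacker-controlled string is harmless when passed through the data channel (a bound query parameter) and dangerous when spliced into the control channel (the SQL text). It uses Python's built-in sqlite3 module; the table and values are illustrative.

```python
# Sketch: untrusted input as data vs. control (SQL injection example).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "nobody' OR '1'='1"            # attacker-controlled input

# BAD: input spliced into the control channel (the query text itself).
unsafe = conn.execute(
    f"SELECT secret FROM users WHERE name = '{malicious}'").fetchall()
print("unsafe query leaks:", unsafe)       # returns every row

# GOOD: input kept in the data channel via a bound parameter.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)).fetchall()
print("parameterized query:", safe)        # returns nothing
```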
  36. Application of Touchpoints
     [Same touchpoints roadmap diagram as slide 3.]
  37. Traditional Software Development
     • No information security consideration
     • Highly distributed among business units
     • Lack of understanding of technical security risks
  38. Don’t stand so close to me
     • Best practices:
       • A manageable number of simple activities
       • Should be applied throughout the software development process
     • Problem:
       • Software developers lack security domain knowledge → limited to functional security
       • Information security professionals lack an understanding of software → limited to reactive security techniques
  39. Deployment and Operations
     • Configuration and customization of the software application’s deployment environment
     • Activities at the:
       • Network-component level
       • Operating-system level
       • Application level
  40. Deployment and Operations
     • Configuration and customization of the software application’s deployment environment
     • Fine-tuning security functionality
     • Evaluating the entire system’s security properties
     • Applying additional security capabilities if needed
  41. Who are the attackers?
     • Amateurs: regular users who exploit the vulnerabilities of the computer system
       • Motivation: easy access to vulnerable resources
     • Crackers: attempt to access computing facilities for which they do not have authorization
       • Motivation: enjoyment of the challenge, curiosity
     • Career criminals: professionals who understand the computer system and its vulnerabilities
       • Motivation: personal gain (e.g., financial)
  42. Attacker’s Knowledge
     • Insider
       • Understands organizational data, architecture, procedures, etc.
       • May understand the software application
       • Has physical access
     • Outsider
       • May not understand organizational information
       • May have software-specific expertise
       • Uses tools and other resources
  43. Vulnerability Monitoring
     • Identify security weaknesses
     • Methods:
       • Automated tools
       • Human walk-through
       • Surveillance
       • Audit
       • Background checks
  44. System Security Vulnerability
     • Software installation
       • Default values
       • Configurations and settings
     • Monitoring usage
       • Changes and new resources
       • Regular updates
     • Tools
       • Look for known vulnerabilities
  45. Red Team
     • An organized group of people attempting to penetrate the security safeguards of the system
     • Assesses the security of the system → drives future improvement
     • Requested or permitted by the owner to perform the assessment
     • Wide coverage: computer systems, physical resources, programming languages, operational practices, etc.
  46. Next Class
     • Malicious code
