3. Secure Design Principles and Process


Slide notes:
  • Have each person read a principle to the class, describe it in their own words, then possibly discuss it further as a class.
  • The Organization for Internet Safety (OISafety) refers to this role as a "finder".

    1. 1. Course 1: Overview of Secure Programming, Section 3 <ul><li>Pascal Meunier, Ph.D., M.Sc., CISSP </li></ul><ul><li>May 2004; updated August 12, 2004 </li></ul><ul><li>Developed thanks to support and contributions from Symantec Corporation, support from the NSF SFS Capacity Building Program (Award Number 0113725) and the Purdue e-Enterprise Center </li></ul><ul><li>Copyright (2004) Purdue Research Foundation. All rights reserved. </li></ul>
    2. 2. Course 1 Learning Plan <ul><li>Security overview and patching </li></ul><ul><li>Public vulnerability databases and resources </li></ul><ul><li>Secure software engineering </li></ul><ul><li>Security assessment and testing </li></ul><ul><li>Shell and environment </li></ul><ul><li>Resource management </li></ul><ul><li>Trust management </li></ul>
    3. 3. Secure Software Engineering: Learning Objectives <ul><li>Understand and apply secure design principles </li></ul><ul><li>Understand and specify security requirements </li></ul><ul><li>Understand how internal processes affect security and code quality </li></ul>
    4. 4. Secure Software Engineering: Parts <ul><li>Art or Science? </li></ul><ul><li>Assurance </li></ul><ul><li>Design Principles </li></ul><ul><li>Requirements </li></ul><ul><li>Processes </li></ul><ul><li>Role of Cryptography </li></ul>
    5. 5. Secure Programming: Art or Science? <ul><li>Artisanal work </li></ul><ul><ul><li>Individual work </li></ul></ul><ul><ul><li>Completely dependent on individual’s skills and organization </li></ul></ul><ul><li>Engineering Science </li></ul><ul><ul><li>Objective </li></ul></ul><ul><ul><ul><li>Independent from one individual’s perception </li></ul></ul></ul><ul><ul><ul><li>Does not require truly unique skills </li></ul></ul></ul><ul><ul><li>Reproducible </li></ul></ul><ul><ul><li>Predictable </li></ul></ul><ul><ul><li>Systematic </li></ul></ul>
    6. 6. Software Engineering <ul><li>Aims to be a science, but is most often ad-hoc </li></ul><ul><ul><li>Uses reproducible processes to provide guarantees, controls, measurements </li></ul></ul><ul><li>Capability Maturity Models: </li></ul><ul><ul><li>Characterize an organization’s processes </li></ul></ul><ul><ul><li>Levels of maturity </li></ul></ul><ul><ul><ul><li>Level 1: Ad-hoc, individual effort and heroics (Art) </li></ul></ul></ul><ul><ul><ul><li>Level 2: Repeatable </li></ul></ul></ul><ul><ul><ul><li>Level 3: Defined </li></ul></ul></ul><ul><ul><ul><li>Level 4: Managed </li></ul></ul></ul><ul><ul><ul><li>Level 5: Optimizing (Science) </li></ul></ul></ul>
    7. 7. Results from Current Software Engineering Methods <ul><li>>1000 new vulnerabilities reported every year </li></ul><ul><li>About 50% of vulnerabilities are commonly repeated mistakes </li></ul><ul><li>About 25% could be avoided by considering secure design principles </li></ul><ul><li>Need new methods </li></ul><ul><li>“We can't solve problems by using the same kind of thinking we used when we created them.” (Albert Einstein) </li></ul>
    8. 8. Exercise <ul><li>Discuss whether you see secure programming as an art or a science. </li></ul><ul><li>Are both possible? </li></ul><ul><ul><li>Maybe neither is done if qualified people can't be found! </li></ul></ul><ul><li>Which is easier? </li></ul><ul><li>Which is preferable? </li></ul><ul><ul><li>As a customer </li></ul></ul><ul><ul><li>As a project manager </li></ul></ul><ul><ul><li>As a developer </li></ul></ul>
    9. 9. Secure Software Engineering: Parts <ul><li>Art or Science </li></ul><ul><li>Assurance </li></ul><ul><li>Design Principles </li></ul><ul><li>Requirements </li></ul><ul><li>Processes </li></ul><ul><li>Role of Cryptography </li></ul>
    10. 10. Assurance <ul><li>Axiom: It is impossible to demonstrate with absolute certainty that a moderately complex application doesn't have any vulnerabilities. </li></ul><ul><li>Second Best: We can provide assurance that an application was designed, implemented, tested in rigorous ways (and by skilled people) that decrease the chances of having vulnerabilities and other defects. </li></ul><ul><ul><li>e.g., training in secure programming provides assurance </li></ul></ul><ul><ul><li>software engineering processes designed for assurance </li></ul></ul>
    11. 11. How do you measure assurance? <ul><li>International Standard: Common Criteria </li></ul><ul><li>Defines Evaluation Assurance Levels (EALs) 1-7 </li></ul><ul><li>EALs 3-4 commonly requested by governments and security-demanding organizations </li></ul><ul><li>EAL 4 evaluation typically costs $1 million </li></ul><ul><li>High assurance (EALs 5-7) is out of the scope of this tutorial </li></ul><ul><li>This section provides an overview of selected topics related to assurance </li></ul>
    12. 12. Question <ul><li>What role does assurance play in computer security? </li></ul><ul><li>It protects vendors that get sued </li></ul><ul><li>It's how you deal with customers afraid of the latest vulnerability </li></ul><ul><li>It describes how the application of specific software engineering practices lowers risks </li></ul>
    13. 13. Question <ul><li>What role does assurance play in computer security? </li></ul><ul><li>It protects vendors that get sued </li></ul><ul><li>It's how you deal with customers afraid of the latest vulnerability </li></ul><ul><li>It describes how the application of specific software engineering practices lowers risks (correct) </li></ul>
    14. 14. Question <ul><li>What is the name of the international standard that specifies assurance levels? </li></ul><ul><li>Common Criteria </li></ul><ul><li>Frequent Criteria </li></ul><ul><li>The EAL </li></ul>
    15. 15. Question <ul><li>What is the name of the international standard that specifies assurance levels? </li></ul><ul><li>Common Criteria (correct) </li></ul><ul><li>Frequent Criteria </li></ul><ul><li>The EAL </li></ul>
    16. 16. Secure Software Engineering: Parts <ul><li>Art or Science </li></ul><ul><li>Assurance </li></ul><ul><li>Design Principles </li></ul><ul><li>Requirements </li></ul><ul><li>Processes </li></ul><ul><li>Capability Maturity Models </li></ul><ul><ul><li>Change Control </li></ul></ul><ul><ul><li>Vulnerability Response </li></ul></ul><ul><li>Role of Cryptography </li></ul>
    17. 17. List of Design Principles <ul><li>Least Privilege </li></ul><ul><li>Fail Safe Defaults </li></ul><ul><li>Economy of Mechanism </li></ul><ul><li>Complete Mediation </li></ul><ul><li>Open Design </li></ul><ul><li>Separation of Privilege </li></ul><ul><li>Least Common Mechanism </li></ul><ul><li>Psychological Acceptability </li></ul>Reference: Saltzer and Schroeder (1974) http://www.cs.virginia.edu/~evans/cs551/saltzer/
    18. 18. 1. Least Privilege <ul><li>&quot;A subject should only be given those privileges it needs in order to complete its task.&quot; </li></ul><ul><li>Access control problem: how closely can (role-based, etc...) access control or capabilities match the needed privileges? At what cost? </li></ul><ul><li>Weakness of some Microsoft Applications: IIS 5 runs under the Local System account, equivalent to root privileges. Apache may run as “nobody” under UNIX; under Windows the equivalent procedure is possible but convoluted (and rarely done). </li></ul>
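The classic UNIX expression of least privilege is to perform the one operation that needs root, then permanently drop to an unprivileged account. A minimal Python sketch of this pattern (the `nobody` account and the `bind_to_port_80` helper are illustrative assumptions, not from the original slides):

```python
import os
import pwd

def drop_privileges(username="nobody"):
    """Drop root privileges after privileged setup (e.g., binding port 80).

    Returns True if privileges were dropped, False if there was nothing
    to drop (already unprivileged).
    """
    if os.geteuid() != 0:
        return False                # already unprivileged; nothing to do
    entry = pwd.getpwnam(username)  # assumed unprivileged account
    os.setgroups([])                # drop supplementary groups first
    os.setgid(entry.pw_gid)         # group before user, or setgid would fail
    os.setuid(entry.pw_uid)         # irreversible: we cannot regain root
    return True

# Typical server flow:
# sock = bind_to_port_80()   # hypothetical helper; needs root
# drop_privileges()          # everything after this runs as 'nobody'
```

Note the ordering: supplementary groups and the group ID must be dropped while still root, because after `setuid` the process no longer has the privilege to change them.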
    19. 19. Compartmentalization <ul><li>Technique for separating code into distinct parts, so that each part runs with least privilege. </li></ul><ul><ul><li>if one part is compromised, the others are still OK </li></ul></ul><ul><ul><li>Example: Separating a user interface from the program running with special privileges (e.g., root) </li></ul></ul><ul><ul><ul><li>Good implementation examples in Linux </li></ul></ul></ul><ul><ul><ul><li>Bad idea: Windows task bar tray icons running with Local System privileges </li></ul></ul></ul><ul><ul><ul><ul><li>Secunia advisory SA10949, Dell TrueMobile WLAN card utility </li></ul></ul></ul></ul><ul><ul><ul><ul><ul><li>tray icon launches help with SYSTEM privileges </li></ul></ul></ul></ul></ul><ul><ul><ul><ul><ul><li>can be exploited to execute arbitrary code with SYSTEM privileges </li></ul></ul></ul></ul></ul>
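The separation described above can be sketched as two cooperating processes: an unprivileged frontend that can only ask narrow questions of a separate worker process holding the sensitive state. A hedged Python illustration (`SECRET_KEY` and the yes/no protocol are invented for the example; in a real design the secret would be loaded only inside the worker process):

```python
from multiprocessing import Process, Pipe

SECRET_KEY = "hunter2"  # only the privileged worker should consult this

def privileged_worker(conn):
    """Compartmentalized part: holds the secret, answers narrow requests."""
    while True:
        request = conn.recv()
        if request is None:
            break
        # Answer a yes/no question instead of ever handing over the secret.
        conn.send(request == SECRET_KEY)
    conn.close()

def frontend_check(password):
    """Unprivileged UI part: never handles SECRET_KEY directly."""
    parent, child = Pipe()
    worker = Process(target=privileged_worker, args=(child,))
    worker.start()
    parent.send(password)   # narrow request across the process boundary
    ok = parent.recv()      # only a boolean comes back
    parent.send(None)       # tell the worker to shut down
    worker.join()
    return ok
```

Even if the frontend is compromised, the attacker can only use the same narrow question-and-answer interface, not read the secret out of the worker's memory.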
    20. 20. 2. Fail-Safe Defaults <ul><li>&quot;Unless a subject is given explicit access to an object, it should be denied access to that object&quot; </li></ul><ul><li>Apache access control through .htaccess: the first (default) rule is “deny from all”, followed by specific “allow from ...” rules </li></ul><ul><li>Tied with the issue of failing “safe” vs failing “functional” </li></ul><ul><ul><li>Switches that fail open as hubs under unusual circumstances </li></ul></ul><ul><ul><li>“Brittle” or catastrophic failures vs graceful ones </li></ul></ul>
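A fail-safe default is easy to express in code: grant access only when an explicit allow rule exists, so unknown users, unknown resources, and lookup failures all fall through to denial, mirroring Apache's "deny from all" followed by specific "allow" lines. A small Python sketch (the rule table is hypothetical):

```python
# Explicit allow rules; everything else is implicitly denied.
ALLOW_RULES = {
    ("alice", "/reports"),
    ("bob", "/reports"),
}

def is_allowed(user, resource):
    # Fail-safe default: anything not explicitly allowed is denied,
    # including unknown users, unknown resources, and typos in either.
    return (user, resource) in ALLOW_RULES
```

The inverse design, a deny list, fails "functional": forgetting a rule silently grants access instead of silently refusing it.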
    21. 21. 3. Economy of Mechanism <ul><li>“Security mechanisms should be as simple as possible” </li></ul><ul><li>Complex mechanisms may not be correctly: </li></ul><ul><ul><li>understood </li></ul></ul><ul><ul><li>modeled </li></ul></ul><ul><ul><li>configured </li></ul></ul><ul><ul><li>implemented </li></ul></ul><ul><ul><li>used </li></ul></ul><ul><li>Complex mechanisms may engender partial implementations and compatibility problems </li></ul>
    22. 22. Example Failed Economy of Mechanism <ul><li>IPSEC: Can do almost everything to secure TCP/IP but is monstrously complex </li></ul><ul><ul><li>Sub-protocols with their own headers, which may be nested... </li></ul></ul><ul><ul><li>Every vendor’s implementation is slightly different and often incompatible with the others </li></ul></ul><ul><ul><li>Design by political committee tries to be everything to everyone </li></ul></ul><ul><li>People switch to SSL VPNs </li></ul><ul><ul><li>Proven, robust </li></ul></ul><ul><ul><li>Much simpler </li></ul></ul><ul><ul><li>Compatible </li></ul></ul>
    23. 23. 4. Complete Mediation <ul><li>“All accesses to objects must be checked to ensure that they are allowed” </li></ul><ul><li>Performance vs security issue </li></ul><ul><ul><li>Results of access check are often cached </li></ul></ul><ul><ul><li>What if permissions have changed since the last check? </li></ul></ul><ul><ul><li>Mechanisms to invalidate or flush caches after a change are often missing </li></ul></ul><ul><li>Architecture issue </li></ul><ul><ul><li>Capability granting and management </li></ul></ul><ul><ul><ul><li>How did a capability given to Alice end up in Malory's hands? </li></ul></ul></ul>
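The caching pitfall above can be made concrete: a store that re-checks permissions on every read honors revocation immediately, while one that caches its first decision keeps serving data to a revoked user. A Python sketch (the class and field names are invented for illustration):

```python
class MediatedStore:
    """Checks permission on every access: complete mediation."""
    def __init__(self):
        self.readers = {"alice"}   # current access-control state
        self.data = "payroll"

    def read(self, user):
        if user not in self.readers:      # re-checked on every call
            raise PermissionError(user)
        return self.data

class CachingStore(MediatedStore):
    """Flawed variant: caches the first decision and never re-checks."""
    def __init__(self):
        super().__init__()
        self._cache = {}

    def read(self, user):
        if user not in self._cache:
            self._cache[user] = user in self.readers  # checked once, ever
        if not self._cache[user]:
            raise PermissionError(user)
        return self.data          # stale: ignores later revocation
```

After revoking `alice`, `MediatedStore.read("alice")` raises `PermissionError`, but a `CachingStore` that already served her once keeps doing so; that is exactly the missing cache-invalidation mechanism the slide warns about.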
    24. 24. Example Failure of Complete Mediation <ul><li>Access is checked only when opening a file, which returns a file descriptor </li></ul><ul><li>UNIX: forked and exec’ed processes inherit file descriptors </li></ul><ul><li>Even if processes call setuid to relinquish access to a high privilege, there may remain open files that stay open even if the process should now not be able to access them. </li></ul><ul><li>Emacs used to have this problem </li></ul>
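The file-descriptor example can be demonstrated directly: the permission check happens once, at `open()` time, so an already-open descriptor keeps working even after filesystem permissions are revoked. A short Python demonstration (POSIX semantics assumed):

```python
import os
import tempfile

# A file descriptor acts like a capability: the access check happens at
# open() time only. Revoking filesystem permission later does not close it.
fd, path = tempfile.mkstemp()
os.write(fd, b"sensitive")
os.close(fd)

f = open(path, "rb")      # access checked here, once
os.chmod(path, 0o000)     # "revoke" all filesystem permissions

data = f.read()           # still succeeds: the open descriptor survives
f.close()
os.chmod(path, 0o600)     # restore permissions so cleanup can proceed
os.unlink(path)
```

This is why a process that opens files and then calls `setuid` to shed privilege must also close (or avoid leaking) descriptors it should no longer hold, as in the Emacs and xinetd cases.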
    25. 25. Another example: CAN-2002-0871 <ul><li>xinetd 2.3.4 leaks file descriptors for the signal pipe to services that are launched by xinetd, which could allow those services to cause a denial of service via the pipe. </li></ul>
    26. 26. 5. Open Design <ul><li>“The security of a mechanism should not depend on the secrecy of its design or implementation.” </li></ul><ul><li>If the details of the mechanism leak (through reverse engineering, dumpster diving or social engineering), then it is a catastrophic failure for all the users at once. </li></ul><ul><li>If the secrets are abstracted from the mechanism, e.g., inside a key, then leakage of a key only affects one user. </li></ul><ul><li>This does not mean you should reveal source code! </li></ul>
    27. 27. Example Failure of Open Design <ul><li>Electronic voting machines! Diebold voting machines source code analysis: </li></ul><ul><ul><li>Passwords embedded in the source code. </li></ul></ul><ul><ul><li>Unauthorized privilege escalation and other vulnerabilities </li></ul></ul><ul><ul><li>Incorrect use of cryptography </li></ul></ul><ul><ul><li>Undetected, unlimited votes by voters </li></ul></ul><ul><ul><li>Insider threats - company workers or election officials can alter voters' ballot choices without their knowledge </li></ul></ul><ul><ul><li>(source: Kohno, Stubblefield, Rubin and Wallach, 2003 Johns Hopkins University) </li></ul></ul>
    28. 28. Notes on Open Design <ul><li>Some hackers would rather have the binary than the source code when designing exploits </li></ul><ul><li>Obscurity is OK if the design is secure regardless </li></ul><ul><ul><li>If details are leaked, the software is still secure </li></ul></ul>
    29. 29. 6. Separation of Privilege <ul><li>“A system should not grant permission based on a single condition.” </li></ul><ul><li>Removes a single point of failure </li></ul><ul><li>Example: two-factor authentication </li></ul><ul><ul><li>Requiring both biometric and token recognition systems reduces risks </li></ul></ul><ul><li>Analogous to the separation of duty: </li></ul><ul><ul><li>By requiring multiple factors, collusion becomes necessary, and risks due to bribery (compromise of one factor) are reduced </li></ul></ul><ul><ul><li>Dual-signature checks </li></ul></ul>
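The sudo-style grant on the next slide can be sketched as two independent conditions, both of which must hold: group membership and knowledge of the password. Compromising either factor alone is insufficient. A Python illustration (the user and password tables are invented for the example):

```python
import hmac

AUTHORIZED_GROUP = {"alice"}                      # condition 1: membership
PASSWORDS = {"alice": "s3cret", "carol": "pw"}    # condition 2: password

def may_run_as_root(user, password):
    # Both independent conditions must hold; failing either one denies,
    # so bribing or compromising a single factor is not enough.
    in_group = user in AUTHORIZED_GROUP
    # compare_digest avoids leaking information through timing.
    knows_pw = hmac.compare_digest(PASSWORDS.get(user, ""), password)
    return in_group and knows_pw
```

Here `carol` knows a valid password but is not in the authorized group, so she is denied, just as sudo denies users outside `wheel` even with the right password.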
    30. 30. Successful Example <ul><li>UNIX: sudo allows the execution of commands with root privileges </li></ul><ul><li>Possible only if </li></ul><ul><ul><li>user knows the appropriate password and </li></ul></ul><ul><ul><li>user is already member of an authorized group (e.g., wheel) </li></ul></ul><ul><ul><li>(This example is from Bishop M., &quot;Computer Security: Art and Science&quot;) </li></ul></ul>
    31. 31. Notes on the Separation of Privilege <ul><li>Often confused with the principle of least privilege </li></ul><ul><ul><li>e.g., OpenSSH has a &quot;UsePrivilegeSeparation&quot; option which really is an implementation of least privilege, in two parts of the code. </li></ul></ul><ul><ul><li>Compartmentalization is the technique used to separate code so that the principle of least privilege can be applied on these parts. </li></ul></ul>
    32. 32. 7. Least Common Mechanism <ul><li>“Mechanisms used to access resources should not be shared” </li></ul><ul><li>Concept: You have two different services, of different priorities and value, provided to two different sets of users. The more they share resources, the more likely one can influence the other to: </li></ul><ul><ul><li>Transmit forbidden data (covert channels issue) </li></ul></ul><ul><ul><li>Limit availability (denial of service) </li></ul></ul>
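One common remedy is simply to stop sharing the mechanism: give each service its own bounded pool, so exhausting one service's resources cannot starve the other. A Python sketch (the pool sizes and class names are illustrative):

```python
import threading

class ServicePool:
    """A private, bounded worker pool for one service. Giving FTP and the
    Web service separate pools means exhausting one cannot starve the
    other, unlike a shared thread pool."""
    def __init__(self, slots):
        self._sem = threading.BoundedSemaphore(slots)

    def try_acquire(self):
        # Non-blocking: returns False when this service's pool is full.
        return self._sem.acquire(blocking=False)

    def release(self):
        self._sem.release()

ftp_pool = ServicePool(slots=2)
web_pool = ServicePool(slots=2)
```

With the shared-pool design from the slide, an attacker who opens idle passive-FTP connections consumes slots the Web service also depends on; with separate pools the damage is confined to FTP.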
    33. 33. Failed Example of Least Common Mechanism <ul><li>Microsoft NT architecture: FTP and Web services on the same computer share a common thread pool. Exhausting the FTP thread pool will cause failed connection requests for the Web service. </li></ul><ul><li>CVE-1999-1148 IIS processes passive FTP connection requests by assigning a thread to each port waiting for a client to connect </li></ul><ul><ul><li>What if they never connect? </li></ul></ul>
    34. 34. 8. Psychological Acceptability <ul><li>Security mechanisms should not make the resource more difficult to access than if the security mechanism were not present. </li></ul><ul><li>Example: a commercial in which users have gone bald (losing all other hair as well) in order to comply with a biometric authentication mechanism requesting hair samples. </li></ul><ul><li>Problem: Users look for ways to defeat the mechanisms and “prop the doors open” </li></ul><ul><li>In practice, difficulty proportionate to the value of the protected asset is accepted </li></ul>
    35. 35. Example mechanism defeated for convenience's sake <ul><li>Trusted hosts -- if you’re logged into host 'A', then you automatically have access to host 'B' </li></ul><ul><li>.rhosts mechanism bypasses the password security check </li></ul><ul><li>A .rhosts file in the / directory allows remote root access without a password </li></ul><ul><li>Authentication is based on IP addresses, which can be mapped to a different host through ARP poisoning </li></ul>
    36. 36. Recap of Design Principles <ul><li>Least Privilege </li></ul><ul><li>Fail Safe Defaults </li></ul><ul><li>Economy of Mechanism </li></ul><ul><li>Complete Mediation </li></ul><ul><li>Open Design </li></ul><ul><li>Separation of Privilege </li></ul><ul><li>Least Common Mechanism </li></ul><ul><li>Psychological Acceptability </li></ul>Instructor: This list is to help with the exercise on the next slide
    37. 37. Exercise <ul><li>Divide the class into 8 groups </li></ul><ul><li>Each group will &quot;own&quot; one of the 8 design principles, and identify a product instance where it applies (not one already presented). The group will present their findings to the class, and whether the principle was applied correctly. </li></ul>Instructor: Allow 15-20 minutes for the groups to discuss, then have them present their findings in no more than 3 minutes each.
    38. 38. Secure Software Engineering: Parts <ul><li>Art or Science </li></ul><ul><li>Assurance </li></ul><ul><li>Design Principles </li></ul><ul><li>Requirements </li></ul><ul><li>Processes </li></ul><ul><li>Role of Cryptography </li></ul>
    39. 39. Security Requirements <ul><li>No requirements == you're done </li></ul><ul><li>Security functional requirements </li></ul><ul><ul><li>e.g., Security audit data generation </li></ul></ul><ul><ul><li>Access controls </li></ul></ul><ul><ul><li>Preserve integrity of transaction X </li></ul></ul><ul><ul><li>Fail secure </li></ul></ul><ul><ul><li>Threat modeling can be used to determine security functional requirements </li></ul></ul><ul><li>Security assurance requirements </li></ul><ul><ul><li>e.g., Developer vulnerability analysis (an EAL 2 assurance component) </li></ul></ul>
    40. 40. Exercise <ul><li>Go to http://csrc.nist.gov/cc/ </li></ul><ul><ul><li>Click on the most recent cc version on the left </li></ul></ul><ul><ul><li>Download the pdf &quot;Part 2: Security functional requirements&quot; </li></ul></ul><ul><li>Instructions to the class: </li></ul><ul><ul><li>Observe that functional requirements are divided into several different &quot;classes&quot; (11 classes in version 2.2). </li></ul></ul><ul><ul><li>Observe that classes are further sub-divided into &quot;families&quot; </li></ul></ul><ul><ul><li>Read &quot;Part 1: Scope&quot;; learn the meaning of the acronyms TOE, ST, PP </li></ul></ul><ul><ul><li>What are TSFs? (Hint: look at Figure 1.1) </li></ul></ul><ul><ul><li>Pick a functional requirement family you like in the list; read its definition. </li></ul></ul>Instructor: Allow 10-15 minutes for students to read the document and let them present a requirement in no more than 2 minutes.
    41. 41. Exercise Question <ul><li>Briefly explain a requirement family to the class. </li></ul><ul><ul><li>What is its purpose? </li></ul></ul><ul><ul><li>Which threats could it address, if you were performing a threat modeling exercise? </li></ul></ul>
    42. 42. Secure Software Engineering: Parts <ul><li>Art or Science </li></ul><ul><li>Assurance </li></ul><ul><li>Design Principles </li></ul><ul><li>Requirements </li></ul><ul><li>Processes </li></ul><ul><li>Role of Cryptography </li></ul>
    43. 43. Processes <ul><li>Provide guarantees </li></ul><ul><li>Areas using processes: </li></ul><ul><ul><li>Development environment </li></ul></ul><ul><ul><li>Vulnerability response </li></ul></ul><ul><ul><li>Production network </li></ul></ul><ul><ul><li>Engineering network </li></ul></ul><ul><ul><ul><li>What assurances do you have about the safety of your code against unwarranted changes? </li></ul></ul></ul><ul><ul><ul><li>Who can get version control software accounts? </li></ul></ul></ul><ul><ul><ul><li>Are the access controls configured for security? </li></ul></ul></ul>
    44. 44. Development Environment <ul><li>Processes that we follow affect the security of the code/applications that we develop. </li></ul><ul><ul><li>Security measures in the development environment are an EAL 3 assurance component </li></ul></ul><ul><ul><li>Do we lock our workstations when we leave? </li></ul></ul><ul><ul><li>Does the workstation lock automatically? </li></ul></ul><ul><ul><li>Do we hire convicted felons to work on our source code? </li></ul></ul><ul><ul><li>Is there a change control process? </li></ul></ul><ul><ul><li>How are backups handled? Are they secure? </li></ul></ul>
    45. 45. Development Tools <ul><li>Can you trust the compiler? The linker? </li></ul><ul><li>Perhaps the compiler on your workstation has been modified to insert a trojan... </li></ul><ul><ul><li>See &quot;Reflections on Trusting Trust&quot; by Ken Thompson </li></ul></ul>
    46. 46. Change Control <ul><li>Do we have authentication and access control for who can submit source code? </li></ul><ul><li>Is there a QA process for code acceptance? </li></ul><ul><li>Are changes justified and linked to a mandate (requirement, specification, bug report)? </li></ul><ul><li>Is the integrity of the installer guaranteed? </li></ul><ul><li>Are the source code repository and version control system secure? </li></ul><ul><ul><li>Real life example: trojan in OpenBSD OpenSSH code (2002) </li></ul></ul>
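One concrete answer to "is the integrity of the installer guaranteed?" is to publish a digest out-of-band (for example on a signed release page) and verify it before running the installer. A hedged Python sketch (the helper names are invented; real release processes would also use code signing):

```python
import hashlib

def sha256_of(path):
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def installer_is_intact(path, published_digest):
    # Compare against a digest obtained out-of-band (e.g., from the
    # vendor's signed release page), never one shipped next to the file:
    # an attacker who can replace the installer can replace that too.
    return sha256_of(path) == published_digest
```

A trojaned download, like the 2002 OpenSSH incident, fails this check as long as the reference digest travels over a channel the attacker does not control.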
    47. 47. Vulnerability Response Process <ul><li>Goals </li></ul><ul><ul><li>Protect customers </li></ul></ul><ul><ul><li>Protect sales </li></ul></ul><ul><ul><li>Produce a timely and robust patch </li></ul></ul><ul><ul><li>Present a consistent and accurate vendor viewpoint </li></ul></ul><ul><ul><li>Respond in a coordinated manner </li></ul></ul><ul><li>Actors: </li></ul><ul><ul><li>Researcher </li></ul></ul><ul><ul><li>Vendor </li></ul></ul><ul><ul><li>3rd Party Coordinator </li></ul></ul><ul><ul><li>Arbitrator </li></ul></ul>
    48. 48. Vulnerability Lifecycle
    49. 49. Researcher <ul><li>Finds and reports vulnerabilities to: </li></ul><ul><ul><li>Vendors </li></ul></ul><ul><ul><li>3 rd party coordinators </li></ul></ul><ul><ul><li>Public forums </li></ul></ul><ul><ul><ul><li>full-disclosure </li></ul></ul></ul><ul><ul><ul><li>Bugtraq lists </li></ul></ul></ul><ul><li>May create proof-of-concept code </li></ul>
    50. 50. Researcher Motivation <ul><li>Improving security </li></ul><ul><li>Career advancement </li></ul><ul><li>Gratification </li></ul><ul><li>Curiosity </li></ul><ul><li>Vendetta </li></ul><ul><li>Other reasons known only to themselves </li></ul>
    51. 51. Vendor <ul><li>The software provider </li></ul><ul><ul><li>Commercial entity (Symantec) </li></ul></ul><ul><ul><li>Open Source Software community </li></ul></ul><ul><li>Responsible for fixing vulnerabilities </li></ul>
    52. 52. 3 rd Party Coordinator <ul><li>Liaison between Vendors and Researchers </li></ul><ul><ul><li>CERT </li></ul></ul><ul><li>Alternative to contacting vendor </li></ul><ul><ul><li>Vendor is unavailable, unknown or unresponsive </li></ul></ul><ul><ul><li>Existing hostilities between vendor and researcher </li></ul></ul><ul><ul><li>Vulnerability affects multiple vendors </li></ul></ul><ul><ul><ul><li>Protocol issues </li></ul></ul></ul><ul><ul><ul><li>Common components </li></ul></ul></ul>
    53. 53. Arbitrator <ul><li>Used when Researcher and Vendor cannot agree </li></ul><ul><li>Acceptable to both parties </li></ul><ul><li>Independent and impartial </li></ul>
    54. 54. Process <ul><li>Vulnerability reported </li></ul><ul><li>Reported to product(s) primary and secondary contacts </li></ul><ul><li>Vulnerability is evaluated </li></ul><ul><li>Evaluation results reported to researcher </li></ul><ul><li>Fix timeline identified </li></ul><ul><li>PR / IT / Support notified </li></ul><ul><li>Fix created and distributed </li></ul><ul><li>Advisory written and posted </li></ul>
    55. 55. Vulnerability Lifecycle
    56. 56. Contact information <ul><li>NIAC standards for contact points </li></ul><ul><ul><li>Public process documents, PGP keys, and advisories </li></ul></ul><ul><ul><ul><li>http://www.company.com/security/ </li></ul></ul></ul><ul><ul><li>Email contact </li></ul></ul><ul><ul><ul><li>[email_address] </li></ul></ul></ul>
    57. 57. Secure Software Engineering: Parts <ul><li>Art or Science </li></ul><ul><li>Assurance </li></ul><ul><li>Design Principles </li></ul><ul><li>Requirements </li></ul><ul><li>Processes </li></ul><ul><li>Role of Cryptography </li></ul>
    58. 58. Role of Cryptography <ul><li>Cryptography is involved in many design issues to make attacks much more difficult </li></ul><ul><ul><li>Man-in-the-middle attacks </li></ul></ul><ul><ul><li>Sniffing </li></ul></ul><ul><ul><li>Replay attacks </li></ul></ul><ul><ul><li>Etc... </li></ul></ul><ul><li>Cryptography is not a do-it-yourself task! </li></ul><ul><ul><li>802.11b cryptography fiasco </li></ul></ul><ul><li>Cryptography doesn't solve common programming errors or design flaws </li></ul>
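As one illustration of what vetted primitives buy you, a message can be authenticated and made replay-resistant using only the standard library's HMAC support. This is a sketch of the idea, not a substitute for a reviewed protocol such as TLS; the nonce framing and replay cache here are invented for the example:

```python
import hmac
import hashlib
import secrets

KEY = secrets.token_bytes(32)   # shared secret, assumed pre-established
seen_nonces = set()             # receiver-side replay cache

def send(message: bytes):
    """Tag a message with a fresh nonce and an HMAC over nonce+message."""
    nonce = secrets.token_bytes(16)
    tag = hmac.new(KEY, nonce + message, hashlib.sha256).digest()
    return nonce, message, tag

def receive(nonce, message, tag):
    """Accept a message only if untampered and never seen before."""
    expected = hmac.new(KEY, nonce + message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None             # tampered: MAC mismatch
    if nonce in seen_nonces:
        return None             # replayed: nonce already used
    seen_nonces.add(nonce)
    return message
```

Note that every piece is a vetted primitive (`hmac`, `hashlib`, `secrets`); the WEP ("802.11b") fiasco came from composing primitives incorrectly, which is why this remains a sketch rather than a protocol.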
    59. 59. About These Slides <ul><li>You are free to copy, distribute, display, and perform the work; and to make derivative works, under the following conditions. </li></ul><ul><ul><li>You must give the original author and other contributors credit </li></ul></ul><ul><ul><li>The work will be used for personal or non-commercial educational uses only, and not for commercial activities and purposes </li></ul></ul><ul><ul><li>For any reuse or distribution, you must make clear to others the terms of use for this work </li></ul></ul><ul><ul><li>Derivative works must retain and be subject to the same conditions, and contain a note identifying the new contributor(s) and date of modification </li></ul></ul><ul><ul><li>For other uses please contact the Purdue Office of Technology Commercialization. </li></ul></ul><ul><li>Developed thanks to the support of Symantec Corporation </li></ul>
    60. 60. Pascal Meunier [email_address] <ul><li>Contributors: </li></ul><ul><li>Jared Robinson, Alan Krassowski, Jeremy Bennett, Craig Ozancin, Tim Brown, Wes Higaki, Melissa Dark, Chris Clifton, Gustavo Rodriguez-Rivera </li></ul>