
Whittaker How To Break Software Security - SoftTest Ireland

Visit SoftTest Ireland www.softtest.ie and sign up for access to free Irish Software Testing events.

Speaker notes
  • Welcome to How to Break Software Security! This course is based on the book of the same name, published in 2003. That book followed How to Break Software and precedes How to Break Web Software. That is a lot of How-To’s, which is a good thing, because that’s what this course is about: understanding security vulnerabilities and what to do about them in your own applications. Whether you are a developer, tester, integrator, manager, or decision-maker, you’ll find this material invaluable for understanding security and security vulnerabilities. Welcome to the wonderful world of breaking things!
  • We’ve been performing functional testing for decades and the process is well-entrenched. We have a spec or a test plan that tells us what the application is supposed to do. Say, for example, our test plan tells us to apply input A and that the application should generate output B. As functional testers, that’s what we do: apply A, watch for B, and when we see it, mark the test case as ‘passed.’ What we are doing here is verifying that the application did what it was supposed to do. But this is both too much and not enough for security testing. It’s too much in that security testers really don’t bother with what the app is supposed to do; we’re concerned more with what the app is not supposed to do! In other words, we apply that same input A but don’t care about the output B that is supposed to occur. Instead, we try to verify that some bad output C does not occur. That’s what you’ll learn in this course: how to anticipate insecure behaviors and test for their absence.
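  • To make the contrast concrete, here is a minimal sketch in C of the two mindsets side by side (parse_record and the guard-byte trick are my own illustrative assumptions, not from the deck): the functional check verifies that input A produced output B, while the security check verifies that a bad side effect C, here a silent write past the end of a buffer, did NOT happen.

        #include <assert.h>
        #include <string.h>

        /* parse_record() is a hypothetical function under test. */
        extern int parse_record(const char *input, char *out, size_t out_len);

        void test_parse_record(void)
        {
            /* Guard bytes after the output buffer act as a canary. */
            struct { char out[32]; unsigned char post[8]; } buf;
            memset(buf.post, 0xAA, sizeof buf.post);

            int rc = parse_record("input A", buf.out, sizeof buf.out);

            /* Functional check: did we get the expected output B? */
            assert(rc == 0 && strcmp(buf.out, "output B") == 0);

            /* Security check: did some bad thing C happen on the side?
               Here: no write may stray past the end of the buffer. */
            for (size_t i = 0; i < sizeof buf.post; i++)
                assert(buf.post[i] == 0xAA);
        }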
  • To highlight the difference, let’s examine two bugs, one functional and one security, and analyze the differences.
  • This screen snap is just for the slides; during the course we will repro the bug in Excel. The bug is in the “scenarios” feature, and the analysis is: 1. The expected functionality DOES NOT WORK; we do not see the required output. 2. The failure symptoms are pretty easy to see. This is, in essence, a typical functional bug.
  • This will also be demoed live. The bug in Macromedia Flash (which has since been fixed) doesn’t show up when the application executes this SWF file. The bug has the following properties: 1. The desired result (output) does indeed happen: the file is rendered correctly. This means that the insecure side effect (a buffer overflow) is masked by the fact that the software did what it was supposed to do. 2. Insecurity often happens invisibly; new tools and thought processes are required to find such bugs. Testers need to think about what SHOULD NOT HAPPEN when they are doing security testing.
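  • For readers of the slides alone, here is a minimal sketch (my own illustration in C, not the actual Flash code) of how an overflow can stay invisible: the copy succeeds, the output looks right, and the corruption sits silently in adjacent memory.

        #include <string.h>

        /* A fixed-size field copied without a length check: the classic
           pattern behind bugs like the one demoed. */
        struct header {
            char name[16];   /* attacker-controlled field from the file */
            char mode[8];    /* silently overwritten when name overflows */
        };

        void load_header(struct header *h, const char *field_from_file)
        {
            strcpy(h->name, field_from_file);  /* no bounds check! */
            /* Rendering still "works" and the file displays correctly,
               so a functional test passes even though adjacent memory
               (and, with a longer field, a return address) is corrupted. */
        }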
  • In order to help us think about security bugs, we offer two models for testers to keep in their heads while they are doing security testing. The first deals with the software itself and teaches us how to think about software behaviors. The second deals with the environment in which the application runs and teaches us to think about how the application interacts with other entities in its environment of use.
  • On the left-hand side of this diagram we have the specification, or intended behavior, of the application: what the application is SUPPOSED to do. When the application gets coded (the rightmost circle), we have ACTUAL behavior to compare to the EXPECTED behavior. This is the process of testing: find problems, fix problems, and make the two circles merge. But functional testing only finds bugs on the left part of the Venn diagram, behaviors that SHOULD happen but DON’T, just like the Excel bug shown earlier. To find the security bugs on the right side, we need to train ourselves to look for what “isn’t there,” to look in places we don’t look in traditional testing. We need to think about what should NOT happen.
  • If we think about the Macromedia bug for a moment, we realize that we could not see the security bug through the user interface. The UI is rarely the place where security bugs manifest (though it can be, as we will see later). Instead, we have to think more holistically about the execution environment. The UI is one aspect of the environment: it is the interface where the application receives user input that must be carefully error-checked, and it is also the place where outputs are rendered, so we have to make sure those outputs do not reveal anything useful to an attacker. The file system is the interface where data from files is read and written. Unlike the UI, this interface is normally invisible and requires special tools to observe the traffic that crosses application boundaries. Another important set of inputs crosses this boundary and must be error-checked; however, error checking here is much less common than at the UI, because developers tend to trust the content of files more than they trust the content of UI text boxes and so forth. The software interface is where data from third-party controls and applications arrives: network libraries, databases, math libraries, and so forth. This is also an invisible interface requiring special tools. The kernel interface is where applications get memory and other resources. This is where evidence of memory-based exploits will be found, and it also requires special tools to observe. One such special tool is Holodeck, and it will be demoed here.
  • Here’s where we play the All Your Base Are Belong To Us video that underscores hackers’ motivations and sheer delight in doing what they do. The lessons learned: 1. Hackers have free time on their hands; they don’t ship products! 2. Hackers have skills and they know how to use the tools. 3. Hackers are motivated to break anyone’s application.
  • Beginning in 1996, we undertook a massive project to analyze bugs. The project was partially funded by industry and government sources, and its goal was to develop a better understanding of important bugs and to describe better techniques to prevent and find defects. We began by studying functional bugs, and the result was How to Break Software by James A. Whittaker. We then turned our attention to security bugs, which resulted in How to Break Software Security by James and Herbert H. Thompson. In both cases, we studied BUGS THAT SHIPPED, because this is the set of bugs our current processes are worst at preventing and finding; after all, these are the ones that got away.
  • Transcript

    • 1. How to Break Software Security
    • 2. Functionality vs. Security
      • Functional testing: verify that the app does what it is supposed to do
        • Apply inputs, verify correct outputs
        • Functional testers ask: ‘what is the software supposed to do?’
      • Security testing: verify that the app does not do what it is not supposed to do
        • Apply inputs, verify that no bad things happen
        • Security testers ask: ‘what is the software not supposed to do?’
      • But what is the set of bad things that can cause an application to behave insecurely?
    • 3. An Example
      • Let’s examine two bugs:
        • A functional bug in which the software fails to do what it is supposed to do
        • A security bug in which the software does what it was supposed to (and a little bit extra!)
    • 4. The Functional Bug
    • 5. The Security Vulnerability
    • 6. What Have We Learned?
      • That security bugs:
        • Are much harder to spot…they often have no visible (to the human eye) behavior…we need better tools
        • Require us to think about side effects and what sensitive data might be exposed
        • Require us to “think backwards”…that is, instead of thinking what should happen, we need to think about what shouldn’t happen
    • 7. Two Models to Guide Our Thinking
      • A model of software behavior
        • Functional testing is simply not designed to find security flaws
      • A model of the environment in which software executes
        • Many security flaws are caused by environment interaction
        • Many security flaws are discovered by analyzing an application’s environment
    • 8. The Behavior Model (the “eclipse” diagram) [Venn diagram: Intended Behavior overlapping Actual Behavior; traditional bugs are intended-but-not-actual, most security bugs are actual-but-not-intended]
    • 9. The Environment Model (the “life preserver” diagram) [diagram: the Application Under Test at the center, ringed by its UI, file system, software, and kernel interfaces, all within the operating system]
    • 10. Are Hackers Really Motivated to Attack You?
      • Thousands of underground hacking tools
      • Thousands of hacker sites
      • The cold hard truth: Hackers have the advantage
        • They have to find only one bug…we have to close them all
        • They have a lot more time on their hands
      • Enter their world…
    • 11. The HtBSS Project
      • Goal: develop prescriptive techniques for finding security vulnerabilities in software
        • Not a “hacker how-to,” we are interested in the bug not the exploit
      • Process: study bugs that shipped
        • What fault caused it?
        • What failure symptoms do we look for?
        • What testing technique would have found it?
      • Outcome: four classes of vulnerabilities/techniques
        • External dependencies – banana peels in the environment
        • Unanticipated user input – magic bullets
        • Vulnerable design – bad design habits that bite
        • Vulnerable implementation – just plain bugs
    • 12. External Dependencies
      • Software does not exist in isolation
        • Think about the “life preserver” diagram
        • Software uses OS resources, runtime libraries, set up files, data files, …
        • Programming an application to handle all these things gracefully is very complicated
          • Ex: when do you check for a LoadLibrary failure? (see the sketch after this list)
            • These are rare events; check too often and you slow down the app
            • These are crucial events; check often so you don’t get bitten
      • The result: software often fails insecurely because of its dependency on its environment
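      • A minimal sketch of the LoadLibrary question above, assuming a Windows application (feature.dll and do_feature are hypothetical names). The point is that the failure path must exist and must fail closed:

            #include <windows.h>
            #include <stdio.h>

            int call_feature(void)
            {
                HMODULE lib = LoadLibraryA("feature.dll");
                if (lib == NULL) {
                    /* The rare-but-crucial event: fail closed instead of
                       continuing as if the feature were available. */
                    fprintf(stderr, "LoadLibrary failed: %lu\n", GetLastError());
                    return -1;
                }
                FARPROC fn = GetProcAddress(lib, "do_feature");
                int rc = fn ? ((int (*)(void))fn)() : -1;
                FreeLibrary(lib);
                return rc;
            }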
    • 13. Unanticipated User Input
      • Some inputs are problematic and developers tend to ignore them:
        • Operating system reserved words
        • Programming language format strings (see the sketch after this list)
        • Character set boundary values (e.g., extended ASCII)
        • Scripts, code and commands embedded in input fields
      • It is important to maintain lists of these and make sure they get applied during testing
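      • As an example of the format-string item above, this is the classic C pattern to probe for; feeding inputs like %x%x%n separates the vulnerable form from the safe one:

            #include <stdio.h>

            /* Vulnerable: user input becomes the format string, so any
               conversion specifiers it contains are interpreted. */
            void log_bad(const char *user_input)  { printf(user_input); }

            /* Safe: user input is passed as data, never as a format. */
            void log_good(const char *user_input) { printf("%s", user_input); }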
    • 14. Vulnerable Design
      • Some good design practices are actually bad for security
        • Performance and security are at odds
        • Usability and security are at odds
        • Cohesion, coupling and many other design practices can often aid an attacker
      • We need to understand how some security vulnerabilities have their roots in design practices
    • 15. Vulnerable Implementation
      • As an industry, we’ve been writing bugs since the dawn of computing and we don’t seem to be slowing down
      • What common bugs have security implications?
        • What do these bugs look like?
        • How do we find them?
    • 16. External Dependency Attacks
    • 17. Dependency Attack Vectors
      • Block access to libraries
      • Manipulate registry values
      • Force the application to use corrupt files (write protected, inaccessible, data errors...) and file names (see the sketch after this list)
      • Replace files that the application reads from, writes to, creates and executes
      • Force the application to operate in stressed memory/disk space/network availability conditions
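      • One low-tech way to exercise the corrupt/inaccessible-file vector above, sketched for a POSIX system (app.conf would be a hypothetical target path; Holodeck automates this kind of fault injection on Windows): strip the file’s permissions before the application opens it, then watch how the failure is handled.

            #include <stdio.h>
            #include <sys/stat.h>

            /* Run before launching the app under test; restore the file
               afterwards with chmod(path, 0644) or similar. */
            int deny_read(const char *path)
            {
                if (chmod(path, 0000) != 0) {  /* remove all permissions */
                    perror("chmod");
                    return -1;
                }
                printf("%s is now unreadable; launch the app and observe\n", path);
                return 0;
            }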
    • 18. The IE content advisor
    • 19. We get stopped by the parental controls…
    • 20. … So we block access to the msrating.dll library
    • 21. Now we can view that site!
    • 22. Holodeck monitors Update Expert for registry interactions
    • 23. U.E. shows a patch applied to the local machine
    • 24. U.E. reads patch info from the registry...
    • 25. We target an unapplied patch…
    • 26. We create a folder with the key “installed”…
    • 27. U.E. reads the bogus directory and shows the patch as installed…
    • 28. User Input Attack Vectors
      • Overflow input buffers
      • Examine all command line switches and input options
      • Explore escape characters, character sets and commands
    • 29. Buffer Overflow Details
    • 30. Modify file with new data
    • 31. Executable data
      6A 00               Push x00                 (parameter describing type of message box)
      68 B0 FB 11 00      Push x0011FBB0           (pointer to the message box caption text)
      68 D5 FB 11 00      Push x0011FBD5           (pointer to message box body text)
      6A 00               Push x00                 (handle to a window)
      FF 15 88 20 40 00   Call User32.MessageBoxA  (calling the Windows message box function, indirectly through a pointer)
      • Now we have an exploit!
    • 32. Open the file and you are owned!
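      • The deck does not show the vulnerable source, so here is a minimal sketch (my own illustration) of the kind of read that makes a crafted file like the one above dangerous: a length field trusted from the file itself.

            #include <stdio.h>

            void read_record(FILE *f)
            {
                unsigned int len;
                char buf[64];

                fread(&len, sizeof len, 1, f);  /* attacker controls len */
                fread(buf, 1, len, f);          /* no check against sizeof buf,
                                                   so the payload lands on the stack */
            }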
    • 33. Design Attack Vectors
        • Try common default and test account names and passwords
        • Expose unprotected test APIs
        • Connect to all ports
        • Fake the source of data
        • Create loop conditions in any application that interprets script, code, etc.
        • Use alternate routes to accomplish the same task
        • Force the system to reset values
    • 34. Common Accounts (“” denotes a blank password)
      Username       Passwords                                               Systems Affected
      Administrator  “”; Admin; admin; administrator; Administrator; root    Windows; Unix and many other platforms and applications
      Admin          “”; Admin; admin; administrator; Administrator; root    Windows; Unix and many other platforms and applications
      Demo           “”; demo; demos                                         Many
      Guest          “”; Guest; guest                                        Windows
      Root           “”; root                                                Unix
      sa             “”                                                      Windows SQL Server; others
      setup          setup                                                   Unix
      sys            sys; system; bin                                        Unix
      sysadmin       sysadmin                                                Unix
      test           “”; test; Test                                          Common to many applications
      user           user                                                    Windows; Unix
      web            “”; web                                                 Windows; Unix
    • 35. Implementation Attack Vectors
      • Get between time of check and time of use (see the sketch after this list)
      • Create files with the same name as files protected with a higher classification
      • Force all error messages
      • Look for temporary files and screen their contents for sensitive information
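      • For the time-of-check/time-of-use item above, a minimal POSIX sketch of the race and one way to close it: the check and the use must be a single atomic operation.

            #include <fcntl.h>
            #include <unistd.h>

            /* Racy: between access() (time of check) and open() (time of
               use), an attacker can swap the file for a link to something
               privileged. */
            int open_racy(const char *path)
            {
                if (access(path, R_OK) != 0)
                    return -1;               /* time of check */
                return open(path, O_RDONLY); /* time of use   */
            }

            /* Safer: skip the separate check; the kernel checks permissions
               atomically at open, and O_NOFOLLOW refuses symbolic links. */
            int open_safer(const char *path)
            {
                return open(path, O_RDONLY | O_NOFOLLOW);
            }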
    • 36. Logging in with a bogus account…
    • 37. … produces this error message
    • 38. A legit account…
    • 39. … produces this error message
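      • The two different error messages above give an attacker an oracle for confirming which usernames exist. A minimal sketch of the remediation (user_exists and password_ok are hypothetical helpers): return one generic message for every failure mode.

            #include <stdbool.h>
            #include <stdio.h>

            extern bool user_exists(const char *user);
            extern bool password_ok(const char *user, const char *pass);

            void login(const char *user, const char *pass)
            {
                /* Same message whether the account or the password is bad,
                   so failed logins reveal nothing about valid usernames. */
                if (!user_exists(user) || !password_ok(user, pass))
                    puts("Login failed: invalid username or password.");
                else
                    puts("Welcome!");
            }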
    • 40. Summary and Take-Aways
    • 41. Always Remember…
      • Security testing is different and requires us to think differently
      • There are testing techniques specifically aimed at security testing and these can and should be part of your software practice
    • 42. Take Away (1)
      • Understand your app’s behavior…think about what should NOT happen!
      [the “eclipse” diagram again: Intended Behavior vs. Actual Behavior, traditional bugs vs. most security bugs]
    • 43. Take Away (2)
      • Understand your app’s environment…think about where the action is!
      [the “life preserver” diagram again: the Application Under Test ringed by its UI, file system, software, and kernel interfaces, plus the common language runtime, within the operating system]
    • 44. Take Away (3)
      • Ask yourself: what is the nightmare scenario for this app (or this customer)!
      • Then…test every entry point for that scenario being realized
        • Its UI
        • Exposed remote functionality
        • Its communication paths
        • The files it reads
    • 45. Take Away (4)
      • Use the attacks, master the tools
        • Attacks:
          • Exploit external dependencies
          • Find unanticipated user input
          • Expose insecure design
          • Determine insecure implementation practices
        • Tools
          • The software Holodeck
          • Hex editors, debuggers, port scanners, …
      • Never cease your vigilance!
    • 46. THE END Questions?