Agnitio: It's static analysis, but not as we know it
David Rook - SecurityBSides, London

if (slide == introduction) System.out.println("I'm David Rook");
- Security Analyst, Realex Payments, Ireland
- CISSP, CISA, GCIH and many other acronyms
- Security Ninja (www.securityninja.co.uk)
- Speaker at international security conferences
- Nominated for multiple blog awards
- A mentor in the InfoSecMentors project
- Developed and released Agnitio

Agenda
- What is static analysis?
- Security code reviews: the good, the bad and the ugly
- The principles of secure development
- Agnitio: it's static analysis, but not as we know it
- A sneak preview of Agnitio v2.0

Static analysis
- What do I mean by static analysis? A review of source code without executing the application
- Can be either manual or automated through one or more tools
- Humans and/or tools analysing application source code

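As a minimal illustration (mine, not from the original deck; all names are hypothetical), this is the sort of flaw a reviewer or a rule-based tool can flag purely by reading the source, with no need to execute anything:

    // Hypothetical fragment for illustration only: the hard-coded
    // credential (CWE-798, one of the list entries later in the deck)
    // is visible from the source text alone, so a human reviewer and a
    // pattern-matching tool can both flag it without running the app.
    import java.util.Properties;

    public class MailConfig {
        static final String SMTP_USER = "admin";
        static final String SMTP_PASS = "s3cr3t!"; // flagged: hard-coded credential

        static Properties asProperties() {
            Properties p = new Properties();
            p.setProperty("mail.user", SMTP_USER);
            p.setProperty("mail.password", SMTP_PASS);
            return p;
        }
    }
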
Static analysis
- Wetware or software? Humans are needed with or without static analysis tools
- The best thing about humans is that they aren't software
- The worst thing about humans is that they are humans

Static analysis
- Wetware or software? (two slides of peer-review figures from http://www.ibm.com/developerworks/rational/library/11-proven-practices-for-peer-review/)

Static analysis
- Wetware or software? Tools can cover more code in less time than a human
- The best thing about software is that it isn't human
- The worst thing about software is that it's software

[Seven screenshot-only slides: the same demo code reviewed with FindBugs, CodePro Analytix and RIPS, showing SQL injection, path manipulation and reflected XSS findings that each tool found or missed; see the speaker notes below.]

The ugly security code reviews
- "Ugly reviews" implies you do actually review code
- An unplanned magical mystery tour at the end of the SDLC
- Unstructured, not repeatable and heavily reliant on C8H10N4O2 (caffeine)
- Too late in the SDLC, making findings very expensive to fix
- Completely manual process, no tools used during reviews
- No audit trails, no metrics... no security?
- Better than nothing?

The bad security code reviews
- "Bad reviews" might be fine for some companies
- A single planned code review in your SDLC
- Some structure, normally based on finding the OWASP Top 10
- Still too late in the SDLC, making findings very expensive to fix
- Some automation, usually basic code analysis tools
- Basic audit trails but still no metrics, so it's hard to measure anything
- Better than ugly reviews; might be fine for some companies

The good security code reviews
- "Good reviews" don't happen by accident
- Multiple reviews defined as deliverables in your SDLC
- Structured, repeatable process with management support
- Reviews are exit criteria for the development and test phases
- Ability to produce reports, metrics and measure improvements
- External validation of the review process and SDLC
- Automation used where useful, freeing up the reviewer

The principles of secure development
- What are the principles of secure development?

The principles of secure development
- We put the cart before the application security horse
- Security guys tell developers about specific vulnerabilities
- We hope they figure out how to prevent them
- Inevitably, security flaws end up in live code
- Security guys complain when data gets stolen

The principles of secure development
- What if we taught drivers in the same way?
- Instructor tells the driver about the different ways to crash
- We hope the driver figures out how not to crash
- Inevitably, the driver will crash
- People complain when they get crashed into

The principles of secure development
- Many lists of vulnerabilities
  - OWASP Top 10
  - White Hat Sec Top 10
  - SANS Top 25
  - Others??

The principles of secure development
Cross Site Scripting, Injection Flaws, Security Misconfiguration, Information Leakage, Race Condition, Broken Authentication, Session Management, Cross Site Request Forgery, Buffer Copy without Checking Size on Input, Insecure Direct Object Reference, Failure to Restrict URL Access, Insecure Cryptographic Storage, SQL Injection, Content Spoofing, Insufficient Authorisation, Insufficient Authentication, Abuse of Functionality, Predictable Resource Location, Unrestricted Upload of File with Dangerous Type, Failure to Preserve SQL Query Structure, Failure to Preserve Web Page Structure, Failure to Preserve OS Command Structure, URL Redirection to Untrusted Site, Insufficient Transport Layer Protection, Improper Limitation of a Pathname to a Restricted Directory, Improper Control of Filename for Include/Require Statement in PHP Program, Incorrect Permission Assignment for Critical Resource, Download of Code Without Integrity Check, Information Exposure Through an Error Message, Reliance on Untrusted Inputs in a Security Decision, Use of Hard-coded Credentials, Buffer Access with Incorrect Length Value, Improper Check for Unusual or Exceptional Conditions, Use of a Broken or Risky Cryptographic Algorithm, Missing Encryption of Sensitive Data, Missing Authentication for Critical Function, Integer Overflow or Wraparound, Improper Validation of Array Index, Incorrect Calculation of Buffer Size, Unvalidated Redirects and Forwards, Allocation of Resource Without Limits or Throttling, Improper Access Control

The principles of secure development
- Many lists of vulnerabilities: OWASP Top 10, White Hat Sec Top 10, SANS Top 25, others??
- These lists != secure development guidance
- 45 vulnerabilities, 41 unique names
- Training courses etc. are based on these lists

Philosophical application security
- "Give a man a fish and you feed him for a day; teach him to fish and you feed him for a lifetime."
- I want to apply this to secure development education: teach a developer about a vulnerability and he will prevent it; teach him how to develop securely and he will prevent many vulnerabilities.

What we need to do
- Put the application security horse before the cart
- Security guys tell developers how to write secure code
- The developer doesn't need to guess anymore
- Common vulnerabilities prevented in applications
- Realistic, or just a caffeine-fueled dream?

The principles of secure development
The same list of 40+ vulnerabilities shown earlier collapses into ten principles: Input Validation, Output Validation, Error Handling, Authentication, Authorisation, Session Management, Secure Communications, Secure Storage, Secure Resource Access, Auditing and Logging.

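The pay-off of the principle-first approach is that one habit removes a whole class of entries from the list. A minimal sketch (my illustration, not from the deck; names are hypothetical): teach "bind user input as data, never concatenate it into queries", and the SQL Injection and Failure to Preserve SQL Query Structure entries disappear together.

    // Minimal sketch of a principles-based fix (hypothetical names).
    // Principle applied: treat user input as data, never as code.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public final class UserLookup {
        // The user-supplied value is bound as a parameter, so the SQL
        // structure is fixed and cannot be altered by the input.
        public static boolean exists(Connection con, String username) throws SQLException {
            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT 1 FROM users WHERE username = ?")) {
                ps.setString(1, username);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next();
                }
            }
        }
    }
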
Agnitio
- What is Agnitio? A tool to help with manual static analysis
- Checklist based, with reviewer and developer guidance
- Produces audit trails and enforces integrity checks
- Single tool for security code review reports and metrics

Agnitio
- Why did I develop Agnitio? Even if your review process is good, it might not be smart
- Is your review process really repeatable and easy to audit?
- How about producing metrics, useful reports and integrity checks?
- No? That's why I developed Agnitio!

Agnitio
- Why did I develop Agnitio? My own review process was good, but it wasn't smart
- Minimum of 2 code reviews per release
- Three pieces of evidence produced per review
- One central Excel sheet for metrics and "audit" trail

Why did I develop Agnitio?
- 2 reviews, 3 deliverables x ~200 releases in 2010

Why did I develop Agnitio?
- 2 reviews, 3 deliverables x ~200 releases in 2010
- 400 security code reviews

Why did I develop Agnitio?
- Demonstration: security code reviews

Why did I develop Agnitio?
- 2 reviews, 3 deliverables x ~200 releases in 2010
- Minimum of 4 Word documents per release

Why did I develop Agnitio?
- Demonstration: security code review reports

Why did I develop Agnitio?
- 2 reviews, 3 deliverables x ~200 releases in 2010
- Notepad file per release with notes, LOC etc.

Why did I develop Agnitio?
- Demonstration: application security metrics

Why did I develop Agnitio?

Agnitio v2.0
- Automated code analysis module linked to the checklist
- Data editor for developer and checklist guidance text
- Checklist and guidance in multiple languages
- Plus lots of user-suggested changes!

Agnitio v2.0
- Agnitio v2.0 super early alpha demonstration

Agnitio v2.0

My "shoot for the moon" vision for Agnitio
"We pretty much need a Burp Pro equivalent for static analysis - awesome, powerful in the right hands, and completely affordable!"
http://www.securityninja.co.uk/application-security/can-you-implement-static-analysis-without-breaking-the-bank/comment-page-1#comment-9777

Using the principles and Agnitio
- How you can apply the principles approach: download the principles documentation from Security Ninja
- Focus secure development training on code, not exploits
- Use your language(s) in all code examples
- Use Agnitio to conduct principles-based security code reviews
- Tie all security findings back to specific principles

QUESTIONS?
www.securityninja.co.uk | @securityninja | /securityninja | /realexninja

Published on: BSidesLondon, 20th April 2011 - David Rook (@securityninja)

This demonstration-filled talk will start by discussing the problems with the security code review approaches most people follow and the reasons why I created Agnitio. This will include a look at existing manual and automated static analysis procedures and tools. The talk will move on to exploring the Principles of Secure Development and how the principles have been mapped to over 60 different checklist items in Agnitio.

For more about David: http://www.securityninja.co.uk/
For more about Agnitio: http://sourceforge.net/projects/agnitiotool/

Speaker notes
  • Even the best tools can be "noisy", requiring manual intervention to identify the real, useful flaws; a high number of false positives is a guarantee with automated static analysis tools. A human reviewer can find vulnerabilities a piece of software can't, specifically issues relating to business logic flaws and authentication and authorisation. The human can analyse findings instantly, in context and with a real opinion on the actual risk of each finding. Automated tools provide good coverage but fail to understand context and more subtle, complex bugs. The downside of manual ("human") reviews is that they rely on humans: even the best humans get tired and make mistakes, and a reviewer can only concentrate properly for a short period (relative to tools) before being of little use. In my opinion a skilled, trained human reviewer is better than a tool, but manual reviews, automated reviews and DAST all find different issues and you should try to combine them all. No silver bullet! SmartBear Software conducted the largest study ever done on code review: 2,500 code reviews, 50 programmers and 3.2 million lines of code at Cisco Systems.
  • Tools don't get tired and can review a lot of code in one go: they can be left to run for hours or days and cover almost all of the code. They can help find issues earlier in the SDLC if ingrained in the development phase, which means a lower fix cost but not necessarily a lower total cost; if a tool produces a high number of false positives, finding the real issues might negate some or all of the saving gained from introducing it. Tools are expensive to buy and should never be used out of the box: they need to be configured to meet your own requirements, which is not a five-minute job - tuning can take months. As mentioned on the previous slide, humans can find vulnerabilities that tools can't. Tools can often give people a false sense of security: no issues found by the tool != no issues in the code, as we will see in the next few slides. Lots of factors are involved in getting adoption and buy-in: false positive and negative rates, integration into the process and into IDEs, making static analysis happen automatically, how you review the issues found, and figuring out which bugs matter.
  • SQL Injection - FindBugs found it
  • Same SQL Injection but not found by CodePro Analytix
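The deck showed these findings as screenshots. As a hypothetical reconstruction (illustrative names, not the original demo source), the pattern FindBugs catches and CodePro Analytix missed is the classic concatenated query:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.servlet.http.HttpServletRequest;

    class AccountSearch {
        // Untrusted request data is concatenated straight into the SQL
        // string; FindBugs reports the nonconstant string passed to
        // executeQuery as a potential SQL injection.
        ResultSet find(HttpServletRequest request, Connection con) throws SQLException {
            String owner = request.getParameter("owner"); // attacker-controlled
            Statement st = con.createStatement();
            return st.executeQuery(
                "SELECT * FROM accounts WHERE owner = '" + owner + "'");
        }
    }
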
  • Path Manipulation flaw not found by FindBugs
  • Same Path Manipulation flaw found by CodePro Analytix
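Again as a hypothetical reconstruction of the screenshot (illustrative names), the path manipulation flaw boils down to a user-supplied value reaching the filesystem unchecked:

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    class ReportDownload {
        // A value like "../../etc/passwd" escapes the intended directory
        // because the user-supplied name is used in the path unvalidated
        // (path manipulation / directory traversal).
        InputStream open(String requestedName) throws IOException {
            File report = new File("/var/reports/" + requestedName);
            return new FileInputStream(report);
        }
    }
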
  • Reflected XSS not found by FindBugs or CodePro Analytix
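A hypothetical reconstruction of the reflected XSS that both Java tools missed (illustrative servlet, not the demo source): the request parameter goes back into the HTML response unencoded.

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class GreetingServlet extends HttpServlet {
        // The "name" parameter is echoed without output encoding, so a
        // payload such as <script>...</script> executes in the victim's
        // browser: a textbook reflected XSS.
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            String name = req.getParameter("name"); // attacker-controlled
            resp.setContentType("text/html");
            resp.getWriter().println("<p>Hello " + name + "</p>");
        }
    }
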
  • Reflected XSS when the username value is used in the "echo": found by RIPS. Storing the username and password in the clear: not found by RIPS.
  • Reflected XSS when the username value is used in the "echo": found by RIPS. Hashing the stored password without a salt: not found by RIPS.
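The demo application here was PHP (hence RIPS), but the storage anti-pattern RIPS did not flag is language-independent; a sketch of the same mistake in Java, with hypothetical names:

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    class PasswordStore {
        // A fast, unsalted hash: identical passwords yield identical
        // digests, so precomputed (rainbow) tables and cross-user
        // comparison both work. A salted, slow KDF is the usual fix.
        static byte[] weakDigest(String password) throws NoSuchAlgorithmException {
            MessageDigest md = MessageDigest.getInstance("SHA-1");
            return md.digest(password.getBytes(StandardCharsets.UTF_8));
        }
    }
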
  • Security code reviews: a magical mystery tour or a real process-driven task? Ringo Starr recalled: "Paul had a great piece of paper - just a blank piece of white paper with a circle on it. The plan was: 'We start here - and we've got to do something here...' We filled it in as we went along."
  • Open source, hosted on SourceForge. Over 4,000 downloads from 80+ countries since November 2010. Agnitio is Latin for recognition, knowledge.
  • Even if your process is good it might not be smart
  • Agnitio: it's static analysis, but not as we know it
