Phil Huggins
Private Security Conference, Winter 2008

“Complexity is the worst enemy of security” – Marcus Ranum
This is a work in progress.
- Is it secure?
- What are the risks?
- Are the risks important?
- Whose fault are the risks?
- Why didn't our external pen test / app test / vuln scan find all these risks?
- Can I save money on my security investment?
- Why is security always the source of our problems?
- Can you tell us how to fix it?
What are the negative outcomes we want to avoid? Pure business focus at this point. How can we rank them in importance? For the example system we identified six key negative outcomes:
- Loss of credit card data
- Loss of personal data
- Compromise of the internal network
- Loss of regulatory required data
- Defacement of the website
- Attack on a user of the system
- Where are the possible sources of the negative outcomes?
- How capable are those sources?
- How do the threat sources get to the outcomes via the identified system components?
Attack trees model these paths from threat source to outcome.
Lots of work: a tree must be built manually for each outcome. Some commercial tools are available; Graphviz and Dot can be used to draw the trees.
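As a minimal sketch of the Graphviz approach, the following Python emits a Dot digraph for one outcome's tree. The outcome and attack steps are invented examples, not the presentation's example system.

```python
# Sketch: emit a Graphviz Dot file for one attack tree.
# The outcome and attack steps below are hypothetical examples.

edges = [
    ("Loss of credit card data", "Compromise web application"),
    ("Loss of credit card data", "Steal database backup"),
    ("Compromise web application", "SQL injection"),
    ("Compromise web application", "Stolen admin credentials"),
    ("Steal database backup", "Access backup file share"),
]

def to_dot(edges):
    """Render (parent, child) pairs as a Dot digraph, outcome at the top."""
    lines = ["digraph attack_tree {", '  rankdir="TB";']
    for parent, child in edges:
        lines.append(f'  "{parent}" -> "{child}";')
    lines.append("}")
    return "\n".join(lines)

print(to_dot(edges))
```

Piping the output through `dot -Tpng -o tree.png` produces the drawn tree.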
The attack trees identify potential risks, NOT vulnerabilities; no testing happens at this point. Mapping the trees to existing security controls identifies security design gaps.
Limitations: still very opinion-based (hard to compare results across practitioners), manually intensive, and not pretty for customers. What it identifies:
- Security design gaps
- Likely vulnerable (complex) components
- Trust relationships between components
An approach to identifying complexity and interdependencies: the component Design Structure Matrix (DSM), a technique used for system architecture analysis. A DSM is a matrix of components against components; see www.dsmweb.org.
Just focus on which component connects to which other components:
- The sum of each row is the component's fan-out complexity.
- The sum of each column is the component's fan-in complexity.
- The row sum plus the column sum for each component is its total component complexity.
- The sum of the total component complexities is a measure of system complexity.
This allows you to rank components by connection complexity.
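The DSM sums above can be computed directly from the matrix. The four components and their connections below are a made-up example: `dsm[i][j] = 1` means component i connects to component j.

```python
# Sketch of the DSM fan-in / fan-out sums; the matrix is illustrative.

components = ["Web server", "App server", "Database", "Message queue"]
dsm = [
    [0, 1, 0, 1],  # Web server
    [0, 0, 1, 1],  # App server
    [0, 0, 0, 0],  # Database
    [0, 1, 1, 0],  # Message queue
]

fan_out = [sum(row) for row in dsm]               # row sums
fan_in = [sum(col) for col in zip(*dsm)]          # column sums
total = [o + i for o, i in zip(fan_out, fan_in)]  # per-component complexity
system_complexity = sum(total)

# Rank components by total connection complexity, highest first
ranked = sorted(zip(components, total), key=lambda x: -x[1])
for name, score in ranked:
    print(f"{name}: {score}")
print("System complexity:", system_complexity)
```

Note that system complexity computed this way is twice the number of connections, since each connection contributes one fan-out and one fan-in.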
Previous work: Howard at Microsoft; Manadhata at Carnegie Mellon. Manadhata correlated the severity of publicly reported vulnerabilities in FTP servers with:
- Method privilege
- Method access rights
- Channel protocol
- Channel access rights
- Data item type
- Data item access rights
Measuring connection complexity:
- Number and type of protocols
- Number and type of API calls
- Number and type of messages
- Number and type of functions
Measuring connection trust:
- Authenticated? (Y/N)
- Integrity checking? (Y/N)
Measuring connection privilege:
- Number of levels of authorisation
- Privilege level of the protocol endpoint
- Privilege level of the message endpoint
- Persistence of message data
Measuring connection privacy:
- Encrypted? (Y/N)
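One way to capture these measures per connection is a simple record with ordinal scoring functions. The field names, score definitions, and example values below are illustrative assumptions, not part of the original method.

```python
# Sketch: one record per connection capturing the measures above.
# All field names and score definitions are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Connection:
    # Complexity
    n_protocols: int
    n_api_calls: int
    n_messages: int
    n_functions: int
    # Trust
    authenticated: bool
    integrity_checked: bool
    # Privilege
    authz_levels: int
    endpoint_privilege: int   # e.g. 0 = unprivileged .. 2 = root/admin
    persistent_messages: bool
    # Privacy
    encrypted: bool

    def complexity(self) -> int:
        return self.n_protocols + self.n_api_calls + self.n_messages + self.n_functions

    def trust(self) -> int:
        # Higher = less trustworthy (arbitrary ordinal scale)
        return int(not self.authenticated) + int(not self.integrity_checked)

    def privacy(self) -> int:
        return int(not self.encrypted)

conn = Connection(n_protocols=1, n_api_calls=12, n_messages=4, n_functions=8,
                  authenticated=True, integrity_checked=False,
                  authz_levels=2, endpoint_privilege=1, persistent_messages=True,
                  encrypted=False)
print(conn.complexity(), conn.trust(), conn.privacy())  # 25 1 1
```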
Next steps:
- Assign some arbitrary ordinal numbers to the attack surface measures.
- Implement a clustering tool to map trust / complexity across systems.
- Pretty graphics.
Anyone got any systems they want to try this out on?
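A sketch of the proposed clustering step, assuming each component has already been given (complexity, trust) ordinal scores: a minimal pure-Python k-means groups components into low- and high-scoring clusters. The scores and the seeding strategy are invented for illustration.

```python
# Sketch: k-means over per-component (complexity, trust) ordinal scores.
# The score values below are made up for illustration.

def kmeans(points, k=2, iters=10):
    """Minimal k-means; the first k points seed the centroids."""
    centroids = points[:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: (p[0] - centroids[i][0]) ** 2
                                        + (p[1] - centroids[i][1]) ** 2)
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return clusters

# (complexity, trust) ordinals for four hypothetical components
scores = [(2, 1), (3, 1), (9, 4), (10, 5)]
low, high = kmeans(scores)
print("Low-scoring cluster:", low)
print("High-scoring cluster:", high)
```

With these example scores the two low-complexity, low-trust-score components separate cleanly from the two high-scoring ones; real systems would need a deliberate choice of k and seeding.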