
Security Management Metrics


  1. Security Management Metrics. Vicente Aceituno, 2008. FIST Conference, Madrid, March 2008. Sponsored by:
  2. About me
     - Vice president of the ISSA Spain chapter (www.issa-spain.org).
     - Vice president of the FIST Conferences association (www.fistconference.org).
     - Author of a number of articles (Google: vaceituno wikipedia).
     - Director of the ISM3 Consortium. The consortium promotes ISM3, an ISMS standard. ISM3 is the main source for this presentation (www.ism3.com).
  3. The world without Metrics
  4. Management vs Engineering
     - Security Engineering: design and build systems that can be used securely.
     - Security Management: employ people and systems (which may be well or badly engineered) safely.
  5. Targets vs Outcomes
     - Activity and targets are weakly linked.
     - Targets: more security / less risk; trust.
     - Activity: keep systems updated; assign user accounts; inform users of their rights.
  6. Definition
     - Metrics are quantitative measurements that can be interpreted in the context of a series of previous or equivalent measurements.
     - Metrics make management possible:
       1. Measurement - some call this "metrics" too.
       2. Interpretation - some call this an "indicator".
       3. Investigation - when appropriate, logs are key here; distinguish common causes from special causes.
       4. Rationalization.
       5. Informed decision.
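The definition above says a measurement only becomes a metric when it is interpreted against previous measurements. A minimal sketch of that measurement-then-interpretation step, using hypothetical weekly counts and an illustrative 3-sigma rule (the threshold is an assumption, not part of the presentation):

```python
from statistics import mean, stdev

def interpret(history, latest, z_threshold=3.0):
    """Interpret the latest measurement against previous ones.

    Returns "normal" or "investigate" depending on how far the new
    value sits from the historical mean, in standard deviations.
    """
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return "normal" if latest == mu else "investigate"
    z = abs(latest - mu) / sigma
    return "investigate" if z > z_threshold else "normal"

# Weekly counts of user accounts assigned (hypothetical data).
history = [42, 39, 45, 41, 44, 40, 43]
print(interpret(history, 44))   # within the normal band
print(interpret(history, 120))  # far outside the band: investigate
```

An "investigate" result is where step 3 begins: the logs are consulted to decide whether the deviation has a common cause or a special cause.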
  7. Qualitative vs Quantitative Measurement
     - William Thomson (Lord Kelvin): "I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science, whatever the matter may be."
     - Meaning: "What can't be measured, can't be managed."
  8. Interpretation
     - It doesn't make sense to set thresholds beforehand. You have to learn what is normal to find out what is abnormal.
     - Thresholds can be fuzzy: expect false positives and false negatives.
     - Example: 1000 students tested for HIV; 10 actually have it.

                                 Have HIV    Don't have HIV
       Test positive for HIV     9           99
       Test negative for HIV     1           891
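The table shows why a raw "positive" needs interpretation: with a rare condition, even a fairly accurate test produces mostly false positives. Working the slide's numbers through:

```python
# Confusion-matrix figures from the slide's HIV example:
# 1000 students tested, 10 actually infected.
tp, fp = 9, 99    # tested positive: 9 truly infected, 99 not
fn, tn = 1, 891   # tested negative: 1 truly infected, 891 not

false_positive_rate = fp / (fp + tn)        # 99/990 = 0.1
false_negative_rate = fn / (fn + tp)        # 1/10   = 0.1
# Probability that a student who tests positive actually has HIV:
positive_predictive_value = tp / (tp + fp)  # 9/108 ~ 0.083

print(round(false_positive_rate, 3))        # 0.1
print(round(positive_predictive_value, 3))  # 0.083
```

So a 10% false positive rate yields a test where fewer than 1 in 10 positives is real, which is exactly the trap a security metric threshold can fall into when alarms are much more common than incidents.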
  9. Interpretation
     - Is it successful?
     - Is it normal?
     - How does it compare against peers?
  10. Interpretation
     - Are outcomes better fit to their purpose?
     - Are outcomes getting closer to or further from the target?
     - Are we getting fewer false positives and false negatives?
     - Are we using resources more efficiently?
  11. Rationalization
     - Is the correction/change working?
     - Is it cost-effective?
     - Can we meet our targets with the resources we have?
     - Are we getting the same outputs with fewer resources?
  12. Decisions
  13. Good Metrics are SMARTIED (S.M.A.R.T. plus I.E.D.)
     - Specific: the metric is relevant to the process being measured.
     - Measurable: measuring the metric is feasible at reasonable cost.
     - Actionable: it is possible to act on the process to improve the metric.
     - Relevant: improvements in the metric meaningfully enhance the contribution of the process to the goals of the management system.
     - Timely: the metric is measured quickly enough to be used effectively.
     - +Interpretable: interpretation is feasible (there is comparable data) at reasonable cost (false positive and false negative rates are low enough).
     - +Enquirable: investigation is feasible at reasonable cost.
     - +Dynamic: the metric's values change over time.
  14. Fashion vs Results
     - Real time vs continuous improvement: management is far more than incident response.
     - Risk assessment as a metric: only as useful as the investigation results.
     - Certification / audit: compliant / not compliant is NOT a metric.
  15. What are good Metrics?
     - Activity: the number of outcomes produced in a time period.
     - Scope: the proportion of the environment or system that is protected by the process.
     - Update: the time since the last update or refresh of process outcomes. (Are outcomes recent enough to be valid?)
     - Availability: the time since the process last performed as expected upon demand (uptime), the frequency and duration of interruptions, and the time interval between interruptions.
     - Efficiency / ROSI: ratio of outcomes to the cost of the investment in the process. (Are we getting the same outcomes with fewer resources? Are we getting more/better outcomes with the same resources?)
  16. What are good Metrics?
     - Efficacy / Benchmark: ratio of outcomes produced to the theoretical maximum. Measuring the efficacy of a process implies comparison against a baseline. (Are outputs better fit to their purpose? Compare against industry/peers to show relative position.)
     - Load: ratio of available resources actually in use to produce the outcomes, e.g. CPU load, repository capacity, bandwidth, licenses, and overtime hours per employee.
     - Accuracy: rate of false positives and false negatives.
  17. Examples
     - Activity: number of successful access attempts.
     - Scope: % of resources protected with access control.
     - Update: time elapsed since the last successful access attempt.
     - Availability: % of time access control is available.
     - Efficiency / ROSI: successful access attempts per euro.
     - Efficacy / Benchmark: malicious access attempts failed vs malicious access attempts successful; legitimate access attempts failed vs legitimate access attempts successful.
     - Load: mean and peak % of GB, Mb/s, CPU, and licenses in use.
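The access-control examples above can be computed mechanically once the raw counts are available. A sketch with entirely hypothetical figures (field names, counts, and costs are illustrative assumptions, not from the presentation):

```python
# Hypothetical access-control figures for one reporting period.
events = {
    "legit_ok": 9500,        # legitimate access attempts that succeeded
    "legit_failed": 120,     # legitimate attempts wrongly denied
    "malicious_ok": 3,       # malicious attempts that got through
    "malicious_failed": 210, # malicious attempts that were blocked
}
protected_resources, total_resources = 180, 200
uptime_hours, period_hours = 718, 720
cost_eur = 5000.0

activity = events["legit_ok"]                  # outcomes per period
scope = protected_resources / total_resources  # fraction under access control
availability = uptime_hours / period_hours
efficiency = activity / cost_eur               # successful accesses per euro
# Efficacy: share of malicious attempts actually stopped.
efficacy = events["malicious_failed"] / (
    events["malicious_failed"] + events["malicious_ok"]
)

print(f"scope={scope:.0%} availability={availability:.2%} efficacy={efficacy:.1%}")
```

Each value on its own is just a measurement; it becomes a metric when compared against the previous periods, as the Definition slide requires.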
  18. Metrics and Capability
     - Undefined: the process might be used, but it is not defined.
     - Defined: the process is documented and used.
     - Managed: the process is Defined, and the results of the process are used to fix and improve the process.
     - Controlled: the process is Managed, and milestones and resource needs are accurately predicted.
     - Optimized: the process is Controlled, and improvement leads to a saving in resources.
  19. Capability: Undefined
     - Measurement: none.
     - Interpretation: none.
  20. Capability: Defined
     - Measurement: none.
     - Interpretation: none.
     - Investigation: when appropriate, logs are key here; common causes (changes in the environment, results of management decisions) and special causes (incidents).
     - Rationalization of the use of time, budget, people, and other resources: not possible.
     - Informed decision: not possible.
  21. Capability: Managed
     - Measurement: Scope, Activity, Availability.
     - Interpretation: Is it normal? Is it successful? Trends? Benchmarking: how does it compare? Efficacy.
     - Investigation (common cause, special cause): find faults before they produce incidents.
     - Rationalization: possible.
     - Informed decision: possible.
  22. Capability: Controlled
     - Measurement: Load, Update.
     - Interpretation: Can we meet our targets in time with the resources we have? What resources and time are necessary to meet our targets?
     - Investigation: find bottlenecks.
     - Rationalization: possible.
     - Informed decision, planning: possible.
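At the Controlled level, the Load metric feeds exactly this kind of question: can we meet our targets with the resources we have, and for how long? A minimal capacity-projection sketch, with hypothetical license figures and an assumed linear growth rate:

```python
def periods_until_full(capacity, in_use, growth_per_period):
    """Number of whole periods before usage exceeds capacity,
    assuming constant linear growth (an illustrative assumption)."""
    periods = 0
    while in_use <= capacity:
        in_use += growth_per_period
        periods += 1
    return periods

# Hypothetical figures: software licenses as the constrained resource.
licenses_capacity = 500
licenses_in_use = 410
new_users_per_month = 15

load = licenses_in_use / licenses_capacity  # current Load metric: 82%
months_left = periods_until_full(licenses_capacity, licenses_in_use,
                                 new_users_per_month)
print(f"load={load:.0%}, licenses exhausted in ~{months_left} months")
```

The projected exhaustion date is what turns a Load measurement into an input for planning, rather than just a status figure.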
  23. Capability: Optimized
     - Measurement: Efficiency (ROSI).
     - Interpretation, Investigation, Rationalization.
     - Informed decision, planning, tradeoffs (point of diminishing returns): possible.
  24. Metric Specification
     - Name of the metric;
     - Description of what is measured;
     - How the metric is measured;
     - How often the measurement is taken;
     - How the thresholds are calculated;
     - Range of values considered normal for the metric;
     - Best possible value of the metric;
     - Units of measurement.
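The specification items above map naturally onto a record type, one field per item. A sketch using Python's `dataclass`; the example values (the metric name aside, which echoes the charts later in the deck) are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class MetricSpec:
    """One field per item of the Metric Specification slide."""
    name: str
    description: str
    how_measured: str
    frequency: str          # how often the measurement is taken
    threshold_rule: str     # how the thresholds are calculated
    normal_range: tuple     # (low, high) considered normal
    best_value: float
    units: str

access_rights = MetricSpec(
    name="Access Rights Granted",
    description="User access rights granted by the access-control process",
    how_measured="Count of grant events in the identity-management log",
    frequency="weekly",
    threshold_rule="mean of previous 12 weeks +/- 3 standard deviations",
    normal_range=(600.0, 1400.0),
    best_value=1000.0,
    units="rights/week",
)
print(access_rights.name, access_rights.units)
```

Keeping the threshold rule in the specification, rather than hard-coding fixed limits, matches the earlier point that thresholds must be learned from what is normal.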
  25. Metrics Representation
  26. Metrics Representation
  27. Metrics Representation
     [Charts: "Access Rights Granted" per week, Weeks 10-49, plotted at several vertical scales.]
  28. Metrics Representation
     [Chart: "Access Rights Granted" per week, Weeks 10-49.]
  29. Using Metrics
     [Chart: cumulative recommendations per responsible owner (sum of days), January through December, one series per person: Mr Blue, Mr Pink, Mr Yellow, Mr Purple, Mr Soft Blue, Mr Red, Mr Green, Mr Orange.]
  30. Using security management metrics
     - Key Goal Indicators
     - Key Performance Indicators
     - Service Level Agreements / Underpinning Contracts
     - Balanced Scorecard (Customer, Internal, Stakeholder, Innovation: Goals and Measures)
  31. Information Security that makes Business Sense (inovement.es/oism3)
     - Web: www.inovement.es
     - Video blog: youtube.com/user/vaceituno
     - Blog: ism3.com
     - Twitter: twitter.com/vaceituno
     - Presentations: slideshare.net/vaceituno/presentations
     - Articles: slideshare.net/vaceituno/documents
  32. THANK YOU. With the sponsorship of: www.fistconference.org
