Security Metrics

Transcript

  • 1. Security Management Metrics. FIST Conference, Madrid, March 2008. Sponsored by: (sponsor). Vicente Aceituno, 2008.
  • 2. About me
    Vice president of the ISSA Spain chapter. www.issa-spain.org
    Vice president of the FIST Conferences association. www.fistconference.org
    Author of a number of articles. Google: vaceituno wikipedia
    Director of the ISM3 Consortium. The consortium promotes ISM3, an ISMS standard. ISM3 is the main source for this presentation. www.ism3.com
  • 3. Management vs Engineering
    Security Engineering: design and build systems that can be used securely.
    Security Management: employ people and systems (which can be well or badly engineered) safely.
  • 4. Targets vs Outcomes
    Activity and targets are weakly linked.
    Targets: +Security / −Risk / Trust
    Activity: keep systems updated; assign user accounts; inform users of their rights.
  • 5. Definition
    Metrics are quantitative measurements that can be interpreted in the context of a series of previous or equivalent measurements. Metrics make management possible:
    1. Measurement (some call this "metrics" too)
    2. Interpretation (some call this "indicator")
    3. Investigation (when appropriate, logs are key here): common cause or special cause
    4. Rationalization
    5. Informed decision
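
As a hedged illustration of steps 1 and 2 (the data, names and 3-sigma rule below are invented, not from the presentation): a measurement only becomes a metric when it is read against the series of previous measurements, which also feeds step 3 when a value falls outside what is normal.

```python
# Minimal sketch: interpret the latest measurement against its own history
# rather than a preset threshold. Data and the 3-sigma rule are illustrative.
from statistics import mean, stdev

def interpret(history: list[float], latest: float) -> str:
    """Classify the latest value against what has been normal so far."""
    mu, sigma = mean(history), stdev(history)
    # Within 3 sigma of the historical mean: common-cause variation.
    # Outside it: a possible special cause, worth investigating (step 3).
    if abs(latest - mu) <= 3 * sigma:
        return "normal (common cause)"
    return "abnormal (special cause: investigate)"

weekly_new_accounts = [120.0, 132.0, 115.0, 128.0, 124.0, 119.0, 130.0]
print(interpret(weekly_new_accounts, 210.0))  # abnormal (special cause: ...)
```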
  • 6. Qualitative vs Quantitative Measurement
    William Thomson (Lord Kelvin): "I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science, whatever the matter may be."
    Meaning: "What can't be measured, can't be managed."
  • 7. Interpretation
    It doesn't make sense to set thresholds beforehand. You have to learn what is normal to find out what is abnormal. Thresholds can be fuzzy: expect false positives and false negatives.
    Example: 1,000 students tested for HIV; 10 have it.
                               Have HIV    Don't have HIV
    Test positive for HIV         9              99
    Test negative for HIV         1             891
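
Working the slide's numbers through (a short illustrative computation, figures taken directly from the table above): a positive result is far more likely to be a false positive than a true one, which is why alarms must be interpreted rather than taken at face value.

```python
# Predictive values from the confusion matrix above (1,000 students, 10 with HIV).
true_pos, false_pos = 9, 99     # test positive: have HIV / don't have HIV
false_neg, true_neg = 1, 891    # test negative: have HIV / don't have HIV

ppv = true_pos / (true_pos + false_pos)    # P(have HIV | positive test)
npv = true_neg / (true_neg + false_neg)    # P(no HIV | negative test)
print(f"positive predictive value: {ppv:.1%}")  # 8.3%: most alarms are false
print(f"negative predictive value: {npv:.1%}")  # 99.9%
```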
  • 8. Interpretation
    Is it successful? Is it normal? How does it compare against peers?
  • 9. Interpretation
    Are outcomes better fit to their purpose? Are outcomes getting closer to or further from the target? Are we getting fewer false positives and false negatives? Are we using resources more efficiently?
  • 10. Rationalization
    Is the correction/change working? Is it cost-effective? Can we meet our targets with the resources we have? Are we getting the same outputs with fewer resources?
  • 11. Decisions
  • 12. Good Metrics are SMARTIED
    S.M.A.R.T.:
    Specific: the metric is relevant to the process being measured.
    Measurable: measuring the metric is feasible at reasonable cost.
    Actionable: it is possible to act on the process to improve the metric.
    Relevant: improvements in the metric meaningfully enhance the contribution of the process towards the goals of the management system.
    Timely: the metric can be measured fast enough to be used effectively.
    +Interpretable: interpretation is feasible (there is comparable data) at reasonable cost (false positive and false negative rates are low enough).
    +Enquirable: investigation is feasible at reasonable cost.
    +Dynamic: the metric's values change over time.
  • 13. Fashion vs Results
    Real time vs continuous improvement: management is far more than incident response.
    Risk assessment as a metric: only as useful as the investigation results.
    Certification / audit: compliant / not compliant is NOT a metric.
  • 14. What are good Metrics?
    Activity: the number of outcomes produced in a time period.
    Scope: the proportion of the environment or system that is protected by the process.
    Update: the time since the last update or refresh of process outcomes. (Are outcomes recent enough to be valid?)
    Availability: the time since a process has performed as expected upon demand (uptime), the frequency and duration of interruptions, and the time interval between interruptions.
    Efficiency / ROSI: ratio of outcomes to the cost of the investment in the process. (Are we getting the same outcomes with fewer resources? Are we getting more or better outcomes with the same resources?)
  • 15. What are good Metrics?
    Efficacy / Benchmark: ratio of outcomes produced to the theoretical maximum. Measuring the efficacy of a process implies comparison against a baseline. (Are outputs better fit to their purpose? Compare against industry/peers to show relative position.)
    Load: ratio of available resources in actual use to produce the outcomes, like CPU load, repository capacity, bandwidth, licenses and overtime hours per employee.
    Accuracy: rate of false positives and false negatives.
  • 16. Examples
    Activity: number of successful access attempts.
    Scope: % of resources protected with Access Control.
    Update: time elapsed since the last successful access attempt.
    Availability: % of time Access Control is available.
    Efficiency / ROSI: successful access attempts per euro.
    Efficacy / Benchmark: failed vs successful malicious access attempts; failed vs successful legitimate access attempts.
    Load: mean and peak % of GB, Mb/s, CPU and licenses in use.
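
A sketch of how a few of these example metrics might be computed from an access-control log. The log layout, timestamps and counts are assumptions for illustration, not part of ISM3; real tooling would pull these from a log repository or SIEM.

```python
# Hypothetical access-control log: (timestamp, legitimate?, succeeded?).
from datetime import datetime

access_log = [
    (datetime(2008, 3, 1, 9, 0), True, True),
    (datetime(2008, 3, 1, 9, 5), True, False),
    (datetime(2008, 3, 1, 9, 7), False, False),
    (datetime(2008, 3, 1, 9, 9), False, True),
    (datetime(2008, 3, 1, 9, 30), True, True),
]
now = datetime(2008, 3, 1, 10, 0)

# Activity: outcomes produced in the period (successful access attempts).
activity = sum(1 for _, _, ok in access_log if ok)

# Update: time elapsed since the last successful access attempt.
update = now - max(t for t, _, ok in access_log if ok)

# Efficacy: malicious attempts failed, benchmarked against the theoretical
# maximum (every malicious attempt failing).
malicious = [ok for _, legit, ok in access_log if not legit]
efficacy = malicious.count(False) / len(malicious)

print(f"activity={activity}, update={update}, efficacy={efficacy:.0%}")
# activity=3, update=0:30:00, efficacy=50%
```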
  • 17. Metrics and Capability
    Undefined: the process might be used, but it is not defined.
    Defined: the process is documented and used.
    Managed: the process is Defined, and the results of the process are used to fix and improve the process.
    Controlled: the process is Managed, and milestones and resource needs are accurately predicted.
    Optimized: the process is Controlled, and improvement leads to a saving in resources.
  • 18. Capability: Undefined
    Measurement: none. Interpretation: none.
  • 19. Capability: Defined
    Measurement: none. Interpretation: none.
    Investigation (when appropriate, logs are key here): common cause (changes in the environment, results of management decisions) or special cause (incidents).
    Rationalization of the use of time, budget, people and other resources: not possible.
    Informed decision: not possible.
  • 20. Capability: Managed
    Measurement: Scope, Activity, Availability.
    Interpretation: Is it normal? Is it successful? Trends? Benchmarking: how does it compare? Efficacy.
    Investigation (common cause, special cause): find faults before they produce incidents.
    Rationalization: possible.
    Informed decision: possible.
  • 21. Capability: Controlled
    Measurement: Load, Update.
    Interpretation: Can we meet our targets in time with the resources we have? What resources and time are necessary to meet our targets?
    Investigation: find bottlenecks.
    Rationalization: possible.
    Informed decision, planning: possible.
  • 22. Capability: Optimized
    Measurement: Efficiency (ROSI).
    Interpretation, investigation, rationalization: possible.
    Informed decision, planning, tradeoffs (point of diminishing returns): possible.
  • 23. Metric Specification
    Name of the metric;
    Description of what is measured;
    How the metric is measured;
    How often the measurement is taken;
    How the thresholds are calculated;
    Range of values considered normal for the metric;
    Best possible value of the metric;
    Units of measurement.
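
One way to make this specification concrete is a small record type with one field per item on the slide. A sketch follows; the field names and the sample values are assumptions for illustration, not defined by the presentation or ISM3.

```python
from dataclasses import dataclass

@dataclass
class MetricSpec:
    name: str                          # name of the metric
    description: str                   # what is measured
    method: str                        # how the metric is measured
    frequency: str                     # how often the measurement is taken
    threshold_rule: str                # how the thresholds are calculated
    normal_range: tuple[float, float]  # range of values considered normal
    best_value: float                  # best possible value of the metric
    unit: str                          # units of measurement

# Illustrative instance for the Availability example from slide 16.
availability = MetricSpec(
    name="Access Control availability",
    description="Share of time the access control service answers on demand",
    method="Synthetic login probe against the service",
    frequency="Every 5 minutes",
    threshold_rule="Learned from 12 weeks of history (mean minus 3 sigma)",
    normal_range=(99.0, 100.0),
    best_value=100.0,
    unit="% uptime",
)
print(availability.name, availability.normal_range)
```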
  • 24. Metrics Representation
  • 25. Metrics Representation
  • 26. Metrics Representation
    [Charts: "Access Rights Granted" per week, Weeks 1 to 46, plotted at several scales.]
  • 27. Using Metrics
    [Chart: "Cumulative recommendations per person responsible (sum of days)", by month (January to December), one series per person: Mr Blue, Mr Pink, Mr Yellow, Mr Purple, Mr Soft Blue, Mr Red, Mr Green, Mr Orange.]
  • 28. Using security management metrics
    Key Goal Indicators
    Key Performance Indicators
    Service Level Agreements / Underpinning Contracts
    Balanced Scorecard (Customer, Internal, Stakeholder, Innovation: goals and measures)
  • 29. Creative Commons Attribution-NoDerivs 2.0
    You are free to copy, distribute, display, and perform this work, under the following conditions:
    Attribution. You must give the original author credit.
    No Derivative Works. You may not alter, transform, or build upon this work.
    For any reuse or distribution, you must make clear to others the license terms of this work. Any of these conditions can be waived if you get permission from the author. Your fair use and other rights are in no way affected by the above.
    This work is licensed under the Creative Commons Attribution-NoDerivs License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nd/2.0/ or send a letter to Creative Commons, 559 Nathan Abbott Way, Stanford, California 94305, USA.
  • 30. THANK YOU. With the sponsorship of: (sponsor). www.fistconference.org
