
MUSES WP5 Final Conclusions



Presented by S2 Grupo and UGR in Brussels during the MUSES Final Review.


  1. Project No. 318508, FP7-ICT-2011-8. Final Review, November 11th 2015. WP5: Self-adaptive Event Correlation
  2. MUSES research areas: corporate security; monitoring & context observation; human-computer interaction; self-adaptive event correlation; usability for mobile devices; legal aspects; risk, trust & privacy
  3. Index: • Objective of WP5 • Work done • Main results / advance over SotA • Deliverables
  4. Objectives of WP5: • Collect monitoring activity • Identify policy-violation patterns in real time • Coordinate device-policy transmission • Automated rule extraction
  5. Work Done: T5.1 Definition of requirements, integration and validation mechanisms; T5.2 Design and development of a domain-specific event correlation engine; T5.3 Extraction and adaptation of rules; T5.4 Evolution of the correlation engine into a self-adaptive rule-learning engine
  6. Work Done (T5.3, Extraction and adaptation of rules): - Use trial data to infer new rules - Pre-selection phase to choose algorithms - Rule-based and decision-tree algorithms - Qualitative analysis of the results - Knowledge Compiler compares candidates with the existing rules and proposes candidate rules
  7. Work Done (T5.4, Evolution of the correlation engine into a self-adaptive rule-learning engine): - Identification of new types of policies - Integration with the selection of the algorithms providing the best results (Data Miner) - Integration of the Knowledge Compiler: new rules proposed as drafts - MUSES Server Risk Management (security-oriented GUI)
  8. Self-adaptive Event Correlation
  9. EP: Device policies in real time. 1. Corporate security policies 2. DRL event-correlation rules 3. Identify policy violations with context 4. Analyse risk 5. Compose device policy (flow: policies → DRL rules → adapt with context → risk analysis → device policy)
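The five-step flow above can be sketched as a small pipeline. This is an illustrative assumption, not the MUSES code: all function and field names here are hypothetical, the real engine evaluates DRL rules in a correlation engine rather than Python dictionaries.

```python
# Hypothetical sketch of the real-time flow: match an access event against
# corporate policies, score the risk using context, compose a device policy.

def detect_violation(event, policies):
    """Step 3: return the first policy whose condition the event matches, or None."""
    for policy in policies:
        if all(event.get(k) == v for k, v in policy["condition"].items()):
            return policy
    return None

def analyse_risk(event, policy):
    """Step 4 (toy scoring): unsecure WiFi and confidential assets raise the risk."""
    score = policy["base_risk"]
    if event.get("wifi") == "unsecure":
        score += 2
    if event.get("asset_confidentiality") == "confidential":
        score += 3
    return score

def compose_device_policy(event, policies, threshold=4):
    """Steps 3-5: identify violation, analyse risk, compose the device policy."""
    policy = detect_violation(event, policies)
    if policy is None:
        return "ALLOW"
    return "DENY" if analyse_risk(event, policy) >= threshold else "ALLOW"

policies = [{"condition": {"wifi": "unsecure"}, "base_risk": 1}]
event = {"wifi": "unsecure", "asset_confidentiality": "confidential"}
print(compose_device_policy(event, policies))  # prints "DENY": risk 6 >= threshold 4
```

The point of the sketch is the shape of the loop: the decision is recomputed per event, so the same request can yield different device policies as the context changes.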
  10. EP: Policies: asset on unsecure WiFi; asset on secure WiFi; non-sensitive assets; blacklists; required applications; email policy; virus; password protection; screen lock; accessibility; access control list; anti-virus; assets in corporate folders; Bluetooth
  11. EP: Policies: trusted antivirus installed; rooted devices; use of required applications; application categories by pattern in the package name (p2p, torrent); email and virus
  12. More policies… • Encryption: encryption enforcement • Use of pendrives: external storage • Cross-contamination: sessions in different devices • User typing a password: check connection properties • MUSES Aware App (Emas): notes are not allowed on unsecure WiFi • Security incident: device wipe
  13. EP: Identification of security violations
  14. Identification of security violations
  15. EP: Advance beyond SotA. • Traditional PDP systems: access request → PDP (policy compliance against an XACML policy) → Permit/Deny. • MUSES Continuous Real-Time Event Processor: access request plus additional context → expressive PDP + risk analysis (policy compliance, risk analysis, adapted to the current context, XACML policy) → device policy
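The contrast on this slide can be made concrete with a minimal sketch, under assumed names (neither function is the MUSES or XACML API): a traditional PDP returns only Permit/Deny, while an expressive PDP also adapts its answer to the current context and returns a full device policy.

```python
# Illustrative contrast between a binary PDP and a context-adapted one.

def traditional_pdp(request, allowed_actions):
    """Classic XACML-style PDP: binary Permit/Deny from policy compliance alone."""
    return "Permit" if request["action"] in allowed_actions else "Deny"

def expressive_pdp(request, context, allowed_actions):
    """Expressive PDP + risk analysis: compose a context-adapted device policy."""
    if request["action"] not in allowed_actions:
        return {"decision": "Deny", "restrictions": []}
    restrictions = []
    if context.get("wifi") == "unsecure":
        restrictions.append("disable_note_taking")  # cf. the MUSES Aware App policy
    if not context.get("screen_lock"):
        restrictions.append("enforce_screen_lock")
    return {"decision": "Permit", "restrictions": restrictions}

print(traditional_pdp({"action": "open_asset"}, {"open_asset"}))
print(expressive_pdp({"action": "open_asset"},
                     {"wifi": "unsecure", "screen_lock": False},
                     {"open_asset"}))
```

The second call permits the access but ships restrictions back to the device, which is the "device policy" arrow in the slide's diagram.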
  16. KRS Timeline: January '15: Trials 1 start; development of the KRS during the trials, first version by the end of the trials. April '15: offline processing and KRS refinement. May '15 (Stockholm): database improvement. June '15: KRS working during Trials 2, online processing. October '15: offline processing and enhancements; SRM development.
  17. KRS – Data Miner: automated machine learning for rule generation. A dataset of defined patterns with decisions feeds classification (JRip, REPTree, PART, J48) and association (Apriori) algorithms; the obtained set of classification/association rules becomes the proposed new rules (drafts), which are merged with the existing set of security rules into an improved set of security rules.
  18. What is considered interesting (attributes)? From the MUSES DB: device information (10): OS, owner, model, trust value, among others; app information (3): name, vendor, and MUSES awareness; asset information (4): name, location, confidentiality, and value; other information (6): decision, detection time, silent mode, event type, among others; connection-sensor information (4); mail-sensor information (4); user information (4): role, trust, account enabled, and username; password lexical properties (4).
  19. Trials 1 vs. Trials 2, in terms of data: number of gathered events: 215,552 vs. 136,340; number of covered attributes: 30/40 vs. 37/40.
  20. Applying Clustering Techniques: groups identified by the k-means algorithm: 63% of the patterns, labelled GRANTED by the algorithm; 22% of the patterns, labelled STRONGDENY.
  21. Analysing Clusters: BYOD or COPE? (chart: breakdown by user roles)
  22. Analysing Clusters: BYOD or COPE? (chart: breakdown by device OS)
  23. KRS – New rules. Rule candidates: 14 correct rules; validated rules: 3. Example: opening a public asset on unsecure WiFi: ALLOW in general, DENY on BYOD devices.
  24. KRS – New rules: opening a public asset on unsecure WiFi: ALLOW in general, but DENY on BYOD devices.
  25. KRS – New rules: opening a public asset on unsecure WiFi: ALLOW, also on COPE devices.
  26. KRS – New rules: opening a confidential asset on unsecure WiFi: DENY, but ALLOW for the Security role.
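The validated rules on the slides above can be encoded as a small lookup. This is an illustrative sketch only: MUSES expresses these as DRL rules in the correlation engine, not as Python, and the function name and argument names are invented here.

```python
# The refined rules from the slides: device ownership (BYOD/COPE) and the
# user role refine the base decision for unsecure WiFi.

def decide(asset, wifi, ownership=None, role=None):
    """Apply the validated KRS rules to a single access request."""
    if asset == "confidential" and wifi == "unsecure":
        # Denied on unsecure WiFi, unless the user holds the Security role.
        return "ALLOW" if role == "security" else "DENY"
    if asset == "public" and wifi == "unsecure":
        # Allowed in general and on COPE devices, denied on BYOD devices.
        return "DENY" if ownership == "BYOD" else "ALLOW"
    return "ALLOW"

print(decide("public", "unsecure", ownership="BYOD"))       # DENY
print(decide("public", "unsecure", ownership="COPE"))       # ALLOW
print(decide("confidential", "unsecure", role="security"))  # ALLOW
```

Note how each refinement adds a condition (ownership, role) to an existing rule rather than replacing it, which is the "rule structure is not static" claim made on slide 28.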
  27. State of the Art: • Rule adjustment: Pros: applied in real environments (IDS); Cons: limited (the rule structure is static). • Rule refinement: Pros: takes the variability of the input data into account; the rule structure may be changed; Cons: lack of automation in real environments.
  28. KRS: Advance beyond SotA: • Improved automation (rule refinement) • The rule structure is not static • Rule refinement applied to security policies • Global security-rule optimisation, not addressed before • Classification and feature-extraction methods applied to real data from employees
  29. Deliverables: D5.1 Requirements and conceptual model of the self-adaptive event correlation system (M12); D5.2 First prototype of the self-adaptive event correlation system (M18); D5.3 Second prototype of the self-adaptive event correlation system (M28); D5.3e Second prototype of the self-adaptive event correlation system (extended version).
