Data-Driven Threat Intelligence: Metrics on Indicator Dissemination and Sharing


For the past 18 months, Niddel has been collecting threat intelligence indicator data from multiple sources in order to make sense of the ecosystem and try to find a measure of efficiency or quality in these feeds. This initiative culminated in the creation of Combine and TIQ-test, two open source projects from MLSec Project. These projects have been improved over the last year and are able to gather and compare data from multiple Threat Intelligence sources on the Internet.

We take this analysis a step further and extract insights from more than 12 months of collected threat intel data to verify the overlap and uniqueness of those sources. If we were able to find enough overlap, a strategy could be put together to acquire an optimal set of feeds; but as Niddel demonstrated in the 2015 Verizon DBIR, that is not the case.

We also gathered aggregated usage information from intelligence sharing communities in order to determine whether the added interest and "push" towards sharing is really being followed by companies, and whether its adoption is putting us on the right track to close these gaps.

Join us in a data-driven analysis of over a year of collected Threat Intelligence indicators and their sharing communities!


  1. Data-Driven Threat Intelligence: Metrics on Indicator Dissemination and Sharing (#ddti)
     Alex Pinto, Chief Data Scientist, MLSec Project (@alexcpsec / @MLSecProject)
     Alexandre Sieira, CTO, Niddel (@AlexandreSieira / @NiddelCorp)
  2. Agenda (HT to @RCISCwendy)
     • Cyber War… Threat Intel – What is it good for?
     • Combine and TIQ-test
     • Measuring indicators
     • Threat Intelligence Sharing
     • Future research direction (i.e. will work for data)
  3. Presentation Metrics!!
     • 50-ish slides
     • 3 key takeaways
     • 2 heartfelt and genuine defenses of Threat Intelligence providers
     • 1 prediction on “The Future of Threat Intelligence Sharing”
  4. What is TI good for? (1) Attribution
  5. What is TI good for anyway? TY to @bfist for his work on
  6. What is TI good for? (2) Cyber Maps!! TY to @hrbrmstr for his work on
  7. What is TI good for anyway?
     • (3) How about actual defense?
     • Strategic and tactical: planning
     • Technical indicators: DFIR and monitoring
  8. Affirming the Consequent Fallacy
     1. If A, then B.
     2. B.
     3. Therefore, A.
     1. Evil malware talks to
     2. I see traffic to
     3. ZOMG, APT!!!
  9. But this is a Data-Driven talk!
  10. Combine and TIQ-Test
      • Combine
        • Gathers TI data (ip/host) from the Internet and local files
        • Normalizes the data and enriches it (AS / Geo / pDNS)
        • Can export to CSV, “tiq-test format” and CRITs
        • Coming Soon™: CybOX / STIX / SILK / ArcSight CEF
      • TIQ-Test
        • Runs statistical summaries and tests on TI feeds
        • Generates charts based on the tests and summaries
        • Written in R (because you should learn a stat language)
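The normalization step described above can be sketched in a few lines. This is an illustrative Python sketch, not the actual Combine code (which is a separate project with its own schema); the field names and the two hypothetical feeds below are assumptions that loosely mirror the "tiq-test format" CSV layout mentioned in the slide.

```python
import csv
import io
from datetime import date

# Hypothetical raw entries as two different feeds might publish them.
RAW_FEEDS = {
    "feed_a": ["198.51.100.7", "203.0.113.9"],          # bare IPs
    "feed_b": ["malware.example.net", "198.51.100.7"],  # hosts and IPs mixed
}

def looks_like_ipv4(s):
    """Cheap structural check: four dot-separated octets in 0..255."""
    parts = s.split(".")
    return len(parts) == 4 and all(p.isdigit() and 0 <= int(p) <= 255 for p in parts)

def normalize(feeds, day):
    """Flatten heterogeneous feeds into uniform rows of
    (entity, type, direction, source, date)."""
    rows = []
    for source, entries in feeds.items():
        for entity in entries:
            kind = "IPv4" if looks_like_ipv4(entity) else "FQDN"
            rows.append({"entity": entity, "type": kind,
                         "direction": "inbound", "source": source,
                         "date": day.isoformat()})
    return rows

rows = normalize(RAW_FEEDS, date(2015, 3, 1))

# Export the normalized rows as CSV (one of Combine's export targets).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["entity", "type", "direction", "source", "date"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Once every feed is reduced to the same row shape, the statistical tests later in the deck become simple set and count operations over a single table.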
  11. • …-test-Summer2015
  12. Using TIQ-TEST – Feeds Selected
      • Dataset was separated into “inbound” and “outbound”
      TY to @kafeine and John Bambenek for access to their feeds
  13. Using TIQ-TEST – Data Prep
      • Extract the “raw” information from indicator feeds
      • Both IP addresses and hostnames were extracted
  14. Using TIQ-TEST – Data Prep
      • Convert the hostname data to IP addresses:
        • Active IP addresses for the respective date (“A” query)
        • Passive DNS from Farsight Security (DNSDB)
      • For each IP record (including the ones from hostnames):
        • Add asnumber and asname (from MaxMind ASN DB)
        • Add country (from MaxMind GeoLite DB)
        • Add rhost (again from DNSDB) – most popular “PTR”
  15. Using TIQ-TEST – Data Prep Done
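The enrichment steps on slide 14 can be sketched as follows. This is an illustrative Python sketch under stated assumptions: the lookup tables are hypothetical stand-ins for the MaxMind ASN/GeoLite databases and Farsight DNSDB, and all names and values are made up for the example.

```python
# Stand-in lookup tables; in the real pipeline these come from the
# MaxMind ASN / GeoLite databases and Farsight DNSDB (passive DNS).
ASN_DB = {"198.51.100.7": (64500, "EXAMPLE-AS"), "203.0.113.9": (64501, "DOC-AS")}
GEO_DB = {"198.51.100.7": "US", "203.0.113.9": "BR"}
PDNS_A = {"malware.example.net": ["203.0.113.9"]}   # hostname -> A records
PDNS_PTR = {"203.0.113.9": "host9.example.org"}     # most popular PTR per IP

def enrich(indicator):
    """Resolve a hostname indicator to IPs, then attach
    asnumber / asname / country / rhost to every IP record."""
    ips = [indicator] if indicator in ASN_DB else PDNS_A.get(indicator, [])
    out = []
    for ip in ips:
        asnum, asname = ASN_DB.get(ip, (None, None))
        out.append({"entity": indicator, "ip": ip,
                    "asnumber": asnum, "asname": asname,
                    "country": GEO_DB.get(ip), "rhost": PDNS_PTR.get(ip)})
    return out

print(enrich("malware.example.net"))
print(enrich("198.51.100.7"))
```

The point of the enrichment is that ASN and country become the categorical variables the later population tests compare against.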
  16. Novelty Test – Measuring added and dropped indicators
  17. Novelty Test – Inbound
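The novelty test boils down to a day-over-day set difference. A minimal Python sketch, assuming hypothetical daily snapshots of a single feed (the actual TIQ-test implementation is in R):

```python
# Daily snapshots of a hypothetical feed: date -> set of indicators listed.
snapshots = {
    "2015-03-01": {"a", "b", "c"},
    "2015-03-02": {"b", "c", "d", "e"},
    "2015-03-03": {"e"},
}

def novelty(snapshots):
    """For each consecutive pair of days, count indicators that were
    added (new today) and dropped (gone since yesterday)."""
    days = sorted(snapshots)
    result = {}
    for prev, cur in zip(days, days[1:]):
        added = snapshots[cur] - snapshots[prev]
        dropped = snapshots[prev] - snapshots[cur]
        result[cur] = (len(added), len(dropped))
    return result

print(novelty(snapshots))
```

A feed that never adds anything is stale; one that churns its entire contents daily is hard to operationalize, so both counts are worth charting.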
  18. Aging Test – Is anyone cleaning this mess up eventually?
  19. INBOUND
  20. OUTBOUND
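The aging test asks how long an indicator lingers on a feed once listed. A minimal sketch, again with hypothetical snapshot data: count the number of days each indicator appears, and a long tail of very old entries suggests nobody is expiring stale indicators.

```python
from collections import defaultdict

# presence[day] = set of indicators listed on that day (hypothetical data)
presence = {
    1: {"a", "b"},
    2: {"a", "b"},
    3: {"a"},
    4: {"a"},
}

def ages(presence):
    """Per indicator, count how many days it stayed on the feed."""
    counts = defaultdict(int)
    for day in presence:
        for ind in presence[day]:
            counts[ind] += 1
    return dict(counts)

print(sorted(ages(presence).items()))  # -> [('a', 4), ('b', 2)]
```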
  21. Population Test
      • Let us use the ASN and GeoIP databases that we used to enrich our data as a reference of the “true” population.
      • But, but, human beings are unpredictable! We will never be able to forecast this!
  22. Is your sampling pool as random as you think?
  23. Can we get a better look?
      • Statistical inference-based comparison models (hypothesis testing)
      • Exact binomial tests (when we have the “true” population)
      • Chi-squared proportion tests (similar to independence tests)
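The exact binomial test above can be done with nothing but the binomial tail sum. TIQ-test does this in R; the sketch below is an illustrative stdlib-Python equivalent, and the 40-of-200 feed counts and the 10% reference proportion are hypothetical numbers, not the talk's data.

```python
from math import comb

def binom_sf(k, n, p):
    """One-sided exact binomial test: P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical: a feed places 40 of its 200 indicators in country X, but
# country X holds only 10% of the reference IP population from the GeoIP DB.
# A tiny p-value means the feed's country mix is NOT a random sample of
# the population -- i.e. the feed has a real geographic bias.
p_value = binom_sf(40, 200, 0.10)
print(p_value)
```

With 40 observed against an expectation of 20, the p-value is far below any usual significance level, so the null hypothesis of "this feed samples the population uniformly" would be rejected.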
  24. Overlap Test – More data can be better, but make sure it is not the same data
  25. Overlap Test – Inbound
  26. Overlap Test – Outbound
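The overlap test compares every pair of feeds by set intersection. A minimal sketch with hypothetical feeds; here the overlap of a pair is normalized by the smaller feed (one reasonable convention; not necessarily the exact metric TIQ-test charts):

```python
from itertools import combinations

# Hypothetical indicator sets per feed.
feeds = {
    "feed_a": {"1.2.3.4", "5.6.7.8", "9.9.9.9"},
    "feed_b": {"5.6.7.8", "9.9.9.9"},
    "feed_c": {"8.8.4.4"},
}

def overlap_matrix(feeds):
    """For each pair of feeds, the share of the smaller feed's
    indicators that also appear in the other feed."""
    result = {}
    for a, b in combinations(sorted(feeds), 2):
        inter = len(feeds[a] & feeds[b])
        result[(a, b)] = inter / min(len(feeds[a]), len(feeds[b]))
    return result

for pair, frac in sorted(overlap_matrix(feeds).items()):
    print(pair, f"{frac:.0%}")
```

If most pairs score near zero, as the talk found, buying "one of each" feed does not buy redundancy, and dropping a feed genuinely loses coverage.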
  27. Uniqueness Test
  28. Uniqueness Test
      • “Domain-based indicators are unique to one list between 96.16% and 97.37%”
      • “IP-based indicators are unique to one list between 82.46% and 95.24% of the time”
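The uniqueness metric quoted above is simply the fraction of all distinct indicators that appear on exactly one list. A minimal sketch with hypothetical feeds:

```python
from collections import Counter

# Hypothetical feeds; "z" and "w" are the only shared indicators.
feeds = {
    "feed_a": {"x", "y", "z"},
    "feed_b": {"z", "w"},
    "feed_c": {"w", "v"},
}

def uniqueness(feeds):
    """Fraction of distinct indicators that appear on exactly one feed."""
    counts = Counter(i for members in feeds.values() for i in members)
    unique = sum(1 for c in counts.values() if c == 1)
    return unique / len(counts)

print(f"{uniqueness(feeds):.1%}")  # -> 60.0%
```

Values in the 80-97% range, as measured in the talk, mean almost every feed is mostly disjoint from every other feed.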
  29. I hate quoting myself, but…
  30. Key Takeaway #1: MORE Threat Intelligence Indicator Feeds != BETTER Threat Intelligence Program
  31. Intermission
  32. Key Takeaway #2
  33. "These are the problems Threat Intelligence Sharing is here to solve!” Right?
  34. Herd Immunity, is it? Source:
  35. Herd Immunity… …would imply that other people in your sharing community being immune to malware A meant your likelihood of infection from it was negligible regardless of controls you applied.
  36. Threat Intelligence Sharing
      • How many indicators are being shared?
      • How many members actually share and how many just leech?
      • Can we measure that? What a super-deeee-duper idea!
  37. Threat Intelligence Sharing
      We would like to thank the kind contribution of data from the fine folks at Facebook Threat Exchange and Threat Connect…
      …and also the sharing communities that chose to remain anonymous. You know who you are, and we ❤ you too.
  38. Threat Intelligence Sharing – Data
      From a period of 2015-03-01 to 2015-05-31:
      • Number of indicators shared
        • Per day
        • Per member
      Not sharing this data – privacy concerns for the members and communities
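The per-day and per-member tallies above reduce to a simple aggregation over a sharing log. A minimal sketch with an entirely hypothetical event log (the talk's real community data is withheld for privacy, so these members and counts are invented):

```python
from collections import defaultdict

# Hypothetical sharing-community log: (date, member, indicator) events.
events = [
    ("2015-03-01", "acme", "ioc1"),
    ("2015-03-01", "acme", "ioc2"),
    ("2015-03-02", "globex", "ioc3"),
]
members = {"acme", "globex", "initech"}  # initech never shared anything

per_day = defaultdict(int)
per_member = defaultdict(int)
for day, member, _ in events:
    per_day[day] += 1
    per_member[member] += 1

# Members with zero contributions are the "leechers" the slide asks about.
contributors = {m for m in members if per_member[m] > 0}
leechers = members - contributors
print(dict(per_day), sorted(leechers))
```

Even this trivial telemetry answers the slide's questions: how much is flowing per day, and what fraction of the community only consumes.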
  39. Update frequency chart
  40. OVERLAP SLIDE
  41. OVERLAP SLIDE
  43. MATURITY?
  44. “Reddit of Threat Intelligence”?
  45. 'How can sharing make me better understand what are attacks that “are targeted” and what are “commodity”?'
  46. Key Takeaway #3 (Also Prediction #1): TELEMETRY > CONTENT
  47. More Takeaways (I lied)
      • Analyze your data. Extract more value from it!
      • If you ABSOLUTELY HAVE TO buy Threat Intelligence or data, evaluate it first.
      • Try the sample data, replicate the experiments:
        • …-test-Summer2015
        • …-test-Summer2015
      • Share data with us. I’ll make sure it gets proper exercise!
  48. Thanks!
      • Q&A?
      • Feedback!
      ”The measure of intelligence is the ability to change." – Albert Einstein
      Alex Pinto – @alexcpsec / @MLSecProject
      Alexandre Sieira – @AlexandreSieira / @NiddelCorp