
Adversarial Analytics - 2013 Strata & Hadoop World Talk


This is a talk I gave at the Strata Conference and Hadoop World in New York City on October 28, 2013. It describes predictive modeling in the context of modeling an adversary's behavior.


1. Adversarial Analytics 101. Strata Conference & Hadoop World, October 28, 2013. Robert L. Grossman, Open Data Group
2. Examples of Adversarial Analytics
• Compete for resources: low-latency trading; auctions
• Steal someone else's resources: credit card fraud rings; insurance fraud rings
• Hide your activity: click spam; exfiltration of information
3. Conventional vs. Adversarial Analytics
• Type of game: gain/gain (conventional) vs. gain/loss (adversarial)
• Subject: individual vs. group/ring/organization
• Behavior: drifts over time vs. sudden changes
• Transparency: behavior usually open vs. behavior usually hidden
• Automation: manual (an individual) vs. sometimes another model
4. 1. Points of Compromise. Source: Wall Street Journal, May 4, 2007
5. Case Study: Points of Compromise
• TJX compromise: wireless point-of-sale devices compromised
• Personal information on 451,000 individuals taken
• Information used months later for fraudulent purchases
• WSJ reports that 45.7 million credit card accounts were at risk
• "TJX's breach-related bill could surpass $1 billion over five years"
Source: Wall Street Journal, May 4, 2007
6. Learning Tradecraft
7. The adversary is often a group, criminal ring, etc.
8. It's Not an Individual… Source: Justice Department, New York Times.
9. Gonzalez Tradecraft
1. Exploit vulnerabilities in wireless networks outside of retail stores.
2. Exploit vulnerabilities in databases of retail organizations.
3. Gain unauthorized access to networks that process and store credit card transactions.
4. Exfiltrate 40 million track 2 records (data on a credit card's magnetic stripe).
10. Gonzalez Tradecraft (cont'd)
5. Sell track 2 data in Eastern Europe, the USA, and other places.
6. Create counterfeit ATM and debit cards using track 2 data.
7. Conceal and launder the illegal proceeds through anonymous web currencies in the US and Russia.
Source: US District Court, District of Massachusetts, Indictment of Albert Gonzalez, August 5, 2008.
11. Data Is Large
• TJX compromise data is too large to fit into memory
• Data is difficult to fit into a database
• Millions of possible points of compromise
• Data must be kept for months to years
Source: Wall Street Journal, May 4, 2007
12. 2. Introduction to Adversarial Analytics
13. Conventional vs. Adversarial Analytics: Examples
Conventional: a predictive model to select online ads; a predictive model to classify the sentiment of a customer service interaction.
Adversarial: a commercial competitor trying to maximize trading gains; a criminal gang attacking a system and hiding its behavior.
14. What's Different About the Models?
• Data: labeled (conventional) vs. unlabeled (adversarial)
• Data size: small to large vs. small to very large
• Model: classification/regression vs. clustering, change detection
• Frequency of update: monthly, quarterly, yearly vs. hourly, daily, weekly, …
15. Types of Adversaries
A threat pyramid matching adversary spend to the home team's response:
• $1,000 adversaries use products; the home team uses existing analytic techniques.
• $100,000 adversaries build custom analytic models; the home team creates new analytic techniques.
• $10,000,000 adversaries change the game; the home team uses specialized teams and approaches.
Source: based in part on the threat hierarchy in the Defense Science Board (DSB) Report on Resilient Military Systems and the Cyber Threat, January 2013.
16. What is Different About the Adversary?
• Entity: a person browsing (conventional) vs. a gang attacking a system (adversarial)
• What is modeled: events, entities vs. events, entities, rings
• Behavior: component of natural behavior vs. waiting for opportunities
• Evolution: drift vs. active change in behavior
• Obfuscation: ignore vs. hide
17. Updating Models
When do you update the model? Conventional analytics: when there is a major change in behavior. Adversarial analytics: frequently, to gain an advantage.
Process for updating models: in conventional analytics, automation and analytic infrastructure are helpful; in adversarial analytics, they are essential.
18. 3. More Examples
19. Click Fraud
• Is a click on an ad legitimate?
• Is it done by a computer or a human?
• If done by a human, is it an adversary?
20. Types of Adversaries (pyramid repeated from slide 15)
21. Low Latency Trading
(Diagram: a low-latency trader, running back-tested algorithms, exchanges trades and market data with a broker/dealer and an ECN; adversarial traders compete in the same market.)
• 2+ billion bids, asks, and executes per day
• More than 90% are machine generated
• Most are cancelled
• Decisions are made in milliseconds
22. Connecting Chicago & NYC
Technology / vendor / round-trip time:
• Microwave: Windy Apple, 9.0 ms
• Dark fiber, shorter path: Spread Networks, 13.1 ms
• Standard fiber, ISP: various, 14.5+ ms
Source: June 27, 2012; Wired Magazine, August 3, 2012
23. Types of Adversaries (pyramid repeated from slide 15)
24. Example: Conficker. Source: Conficker Working Group: Lessons Learned, June 2010 (published January 2011)
25. (Diagram: infected PCs communicating with a command and control center.)
26. Source: Conficker Working Group: Lessons Learned, June 2010 (published January 2011)
27. Source: Conficker Working Group: Lessons Learned, June 2010 (published January 2011)
28. Types of Adversaries (pyramid repeated from slide 15)
29. 4. Building Models for Adversarial Analytics
30. Building Models to Identify Fraudulent Transactions
Building the model:
1. Get a dataset of labeled data (i.e., fraud is labeled).
2. Build a candidate predictive model that scores each transaction with a likelihood of fraud.
3. Compare the lift of the candidate model to the current champion model on a hold-out dataset.
4. Deploy the new model if its performance is better.
Scoring transactions:
1. Get the next event (transaction).
2. Update one or more associated feature vectors (account-based).
3. Use the model to process the updated feature vectors and compute scores.
4. Post-process the scores using rules to compute alerts.
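The four scoring steps above can be sketched as a small event loop. This is a minimal illustration only: the feature updates, the toy model, and the alert threshold are invented stand-ins, not the production logic described in the talk.

```python
# Sketch of the scoring loop: get event, update account features,
# score with a model, post-process scores into alerts.
# The model and features here are hypothetical placeholders.

def update_features(features, event):
    """Update the account's running feature vector with the new event."""
    f = dict(features)
    f["txn_count"] = f.get("txn_count", 0) + 1
    f["total_amount"] = f.get("total_amount", 0.0) + event["amount"]
    f["avg_amount"] = f["total_amount"] / f["txn_count"]
    return f

def score(features):
    """Toy stand-in model: fraud likelihood rises with average amount."""
    return min(1.0, features["avg_amount"] / 10_000.0)

def process(events, threshold=0.5):
    """Score each event and apply a post-processing rule to emit alerts."""
    store, alerts = {}, []
    for event in events:                       # 1. get next event
        acct = event["account"]
        store[acct] = update_features(store.get(acct, {}), event)  # 2. update
        s = score(store[acct])                 # 3. compute score
        if s > threshold:                      # 4. rule-based alerting
            alerts.append((acct, s))
    return alerts

print(process([
    {"account": "A", "amount": 120.0},
    {"account": "B", "amount": 9_500.0},
]))
```

In a real deployment the champion/challenger comparison from the left-hand column would happen offline, and only the winning model would be plugged into this loop.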
31. Building Models to Identify Fraud Rings
(Diagram: drivers linked to repair shops and clinics.)
• Look for suspicious patterns and relationships among claims.
• Look for hidden relationships through common phone numbers, nearby addresses, etc.
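One common way to surface the hidden relationships mentioned above is to link claimants that share an identifier (a phone number, an address) and then extract the connected components as candidate rings. The sketch below uses union-find; the claim data is invented for illustration.

```python
# Sketch: group claimants into candidate rings via shared identifiers,
# using union-find over a claimant -> {identifiers} mapping.
# The sample records are hypothetical.

def find_rings(records):
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]      # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    # Invert the mapping: identifier -> claimants sharing it.
    by_attr = {}
    for entity, attrs in records.items():
        for attr in attrs:
            by_attr.setdefault(attr, []).append(entity)
    for entities in by_attr.values():
        for other in entities[1:]:
            union(entities[0], other)

    # Collect connected components; singletons are not rings.
    rings = {}
    for entity in records:
        rings.setdefault(find(entity), set()).add(entity)
    return [ring for ring in rings.values() if len(ring) > 1]

claims = {
    "driver1": {"phone:555-0101", "shop:AcmeRepair"},
    "driver2": {"phone:555-0101"},                 # shares a phone with driver1
    "driver3": {"shop:AcmeRepair", "clinic:Elm"},  # shares a shop with driver1
    "driver4": {"clinic:Oak"},                     # unconnected
}
print(find_rings(claims))
```

In practice the "nearby addresses" case requires fuzzy matching (geocoding, edit distance) before the linking step; exact-match linking is only the simplest version.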
32. Method 1: Look for Testing Behavior
• Adversaries often test an approach first.
• Build models to detect the testing.
33. Method 2: Identify Common Behavior
• Common clustering algorithms (e.g., k-means) require specifying the number of clusters.
• For many applications, we keep the number of clusters k relatively small.
• With microclustering, we produce many clusters, even if some of them are quite similar.
• Microclusters can sometimes be labeled.
34. Microcluster-Based Scoring
• The closer a feature vector is to a good cluster, the more likely it is to be good.
• The closer it is to a bad cluster, the more likely it is to be bad.
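The distance intuition above can be turned into a score directly. The sketch below assumes the microcluster centers have already been computed (e.g. by k-means with a large k) and labeled good or bad; the normalization into [0, 1] is one simple choice among many, not the specific scoring rule from the talk.

```python
# Sketch of microcluster-based scoring: score a feature vector by its
# distance to the nearest good and nearest bad cluster center.
# Cluster centers and labels are assumed given; the data is invented.
import math

def nearest(centers, x):
    """Return the center closest to x in Euclidean distance."""
    return min(centers, key=lambda c: math.dist(c, x))

def score(good_centers, bad_centers, x):
    """Score in [0, 1]: 0 on a good center, 1 on a bad center."""
    dg = math.dist(nearest(good_centers, x), x)
    db = math.dist(nearest(bad_centers, x), x)
    return dg / (dg + db)

good = [(0.0, 0.0), (1.0, 1.0)]
bad = [(10.0, 10.0)]
print(score(good, bad, (0.5, 0.5)))   # near a good cluster: low score
print(score(good, bad, (9.0, 9.0)))   # near the bad cluster: high score
```

Because the clusters are "micro" (many, possibly similar), the nearest-center lookup stays informative even when good and bad behavior overlap in coarse clusterings.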
35. Method 3: Scaling Baseline Algorithms
Examples: drive-by exploits; points of compromise. Source: Wall Street Journal
36. What are the Common Elements?
• Time stamps
• Sites (e.g., web sites, vendors, computers, network devices)
• Entities (e.g., visitors, users, flows)
• Log files fill disks; many, many disks
• Behavior occurs at all scales
• Want to identify phenomena at all scales
• Need to group "similar behavior"
37. Examples of Site-Entity Data
• Drive-by exploits: sites are web sites; entities are computers.
• Compromised user accounts: sites are compromised computers; entities are user accounts.
• Compromised payment systems: sites are merchants; entities are cardholders.
Source: Collin Bennett, Robert L. Grossman, David Locke, Jonathan Seidman and Steve Vejcik, MalStone: Towards a Benchmark for Analytics on Large Data Clouds, The 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD 2010), ACM, 2010.
38. (Diagram: sites and entities transacting over time, with an exposure window followed by a monitor window spanning days d[k-1] and d[k].)
39. The Mark Model
• Some sites are marked (the percentage marked is a parameter; which sites are marked is a draw from a distribution).
• Some entities become marked after visiting a marked site (a draw from a distribution).
• There is a delay between the visit and when the entity becomes marked (a draw from a distribution).
• A background process marks some entities independent of any visit (this adds noise to the problem).
40. Subsequent Proportion of Marks
• Fix a site s[j].
• Let A[j] be the entities that transact during the exposure window and, if an entity is marked, whose visit occurs before the mark.
• Let B[j] be all entities in A[j] that become marked sometime during the monitor window.
• The subsequent proportion of marks is r[j] = | B[j] | / | A[j] |.
• Easily computed with MapReduce.
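The statistic r[j] can be sketched in a MapReduce style: the "map" phase keys each visit by site, and the "reduce" phase computes |B[j]| / |A[j]| per site. The sketch below collapses the exposure and monitor windows into a simple before/after-visit check; the field names and sample data are invented for illustration.

```python
# Sketch of computing r[j] = |B[j]| / |A[j]| per site.
# visits: iterable of (site, entity, visit_time) records.
# marks:  {entity: time_marked} (a simplification: windows are ignored,
#         only the visit-before-mark ordering is checked).
from collections import defaultdict

def site_mark_ratios(visits, marks):
    A = defaultdict(set)   # entities that transacted at the site
    B = defaultdict(set)   # those subsequently marked
    for site, entity, t_visit in visits:          # "map": key by site
        t_mark = marks.get(entity)
        if t_mark is not None and t_mark <= t_visit:
            continue                              # marked before visit: excluded
        A[site].add(entity)
        if t_mark is not None:
            B[site].add(entity)
    # "reduce": per-site ratio
    return {site: len(B[site]) / len(A[site]) for site in A}

visits = [("s1", "e1", 1), ("s1", "e2", 1), ("s2", "e3", 1)]
marks = {"e1": 5}                                 # e1 marked after visiting s1
print(site_mark_ratios(visits, marks))
```

Sites with unusually high r[j] are candidate points of compromise: entities that visited them went "bad" at a rate the background marking process does not explain.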
41. Computing Site-Entity Statistics with MapReduce
MalStone B benchmark results:
• Hadoop MapReduce v0.18.3: 799 min
• Hadoop Python Streams v0.18.3: 142 min
• Sector UDFs v1.19: 44 min
Nodes: 20; records: 10 billion; dataset size: 1 TB.
Source: Collin Bennett, Robert L. Grossman, David Locke, Jonathan Seidman and Steve Vejcik, MalStone: Towards a Benchmark for Analytics on Large Data Clouds, KDD 2010, ACM, 2010.
42. 5. Summary
43. Some Rules
1. Understand your adversary's tradecraft; he or she will go after the easiest or weakest link.
2. Your adversary will be creating new opportunities; you must create new models and analytic approaches.
3. Risk is usually viewed as a cost of doing business, but every once in a while the cost may be 100x to 1,000x greater.
4. The quicker you can update your strategy and models, the greater your advantage. Use automation (e.g., PMML-based scoring engines) to deploy models more quickly.
44. Questions?
45. About Robert L. Grossman
Robert Grossman (@bobgrossman) is the Founder and a Partner of Open Data Group, which specializes in building predictive models over big data. He is also a Senior Fellow at the Computation Institute and the Institute for Genomics and Systems Biology at the University of Chicago, and a Professor in the Biological Sciences Division. He has led the development of new open source software tools for analyzing big data, cloud computing, and high performance networking. Prior to starting Open Data Group, he founded Magnify, Inc., which provides data mining solutions to the insurance industry. Grossman was Magnify's CEO until 2001 and its Chairman until it was sold to ChoicePoint in 2005. He blogs occasionally about big data, data science, and data engineering.
46. For More Information