TAROT2013 Testing School - Leonardo Mariani presentation

TAROT 2013 9th International Summer School on Training And Research On Testing, Volterra, Italy, 9-13 July, 2013

These slides summarize Leonardo Mariani's presentation about "Automated Failure Analysis in Absence of Specification"



1. Automated Failure Analysis in Absence of Specification. Leonardo Mariani, University of Milano-Bicocca, mariani@disco.unimib.it
2. Analysis of Software Behaviors. Analysis of Software Failures, (semi-)automatically, when no specification is available.
3. Automated Debugging. Fault localization: search for the fault location (e.g., search for faulty program statements in the program source code). Failure analysis (aka anomaly detection): search for failure causes (e.g., search for erroneous events in the execution space).
4. Example: a known issue in Tomcat 6.0.0 (to 6.0.9).
5. Fault Localization (output obtained with Tarantula):
public class ChipsListener implements ServletContextListener {
    public ChipsListener() { }
    public void contextInitialized(ServletContextEvent evt) {
        ServletContext context = evt.getServletContext();
        JspApplicationContext jspContext = JspFactory.getDefaultFactory().getJspApplicationContext(context);
        jspContext.addELResolver(new ChipsELResolver());
    }
}
- even if related to the bug, the bug is not there!
- this is not the best ranked piece of code
- why is this fragment of code relevant?
[Jones et al. Visualization of Test Information to Assist Fault Localization. ICSE, 2002.]
6. Failure Analysis (output obtained with BCT). Tomcat failed because:
- javax.servlet.jsp.JspFactory.getDefaultFactory() returned null
- then org.apache.tomcat.util.modeler.Registry.unregisterComponent(javax.management.ObjectName) invoked org.apache.catalina.session.ManagerBase.postDeregister()
- and org.apache.tomcat.util.modeler.Registry.unregisterComponent(javax.management.ObjectName) invoked org.apache.catalina.loader.WebappLoader.postDeregister()
[Mariani, Pastore, Pezzè. Dynamic Analysis for Diagnosing Integration Faults. TSE, 2011.]
7. How to identify the events responsible for a failure when no spec is available?
8. Specification Mining: learn specifications from actual executions.
9. Specification mining for failure analysis: learn the regular behavior (specification)… to detect anomalous events.
10. Behavioral Anomalies
11. [Diagram: the Application produces Traces, from which a Model is mined; a failing Trace and the Model feed the Analysis.] What models can we mine? What traces do we need? How can we analyze a failure?
12. What models can we mine?
13. Specification Mining: generate a model that represents the actual behavior from samples. Actual behavior: -10<X<10. Samples: X=-2, X=-9, X=0, X=5, X=7, … Mined behavior: -10<X<10.
14. Model generation is imprecise. Actual behavior: -10<X<10. Over-generalization: mined specification X > -100. Over-restriction: mined specification -5 < X < 5. Over-generalization and over-restriction: mined specification -100 < X < 5.
15. Specification Mining: Models Used as Specifications. Real specification: -10<X<10. Mined specification: -100<X<5. X = 100: correctly rejected behavior. X = 1: correctly accepted behavior. X = 7: erroneously rejected behavior. X = -50: erroneously accepted behavior.
16. Models: full ordering of events [FSA figure]; data values (x > 0); ordering of events + data values; partial ordering of events (open => close).
17. MODELS THAT REPRESENT THE FULL ORDERING OF EVENTS
18. Mining of Finite State Models. Trace-based mining: state-based merging; behavior-based merging. State-based mining. [Example run onLoad, add, add, empty over the concrete states Total=0/Elem=0, Total=3/Elem=1, Total=5/Elem=2, Total=0/Elem=0.]
19. kTail (state-based merging): TRACES → PTA → FSA. [Figure: example traces over events a, b, c, the corresponding PTA, and the resulting FSA.] [Biermann and Feldman. On the synthesis of finite state machines from samples of their behavior. IEEE ToC, 1972.]
20. Build the PTA. [Figure: the same traces and the prefix tree acceptor built from them.]
21. 2-Futures, with k=2: 2-future(2) = {aa, ab, bc}; 2-future(5) = {aa, bc}; 2-future(11) = {}; 2-future(8) = {c}; …
22. 2-future(8) = {c}; 2-future(12) = {c}: equal futures, so states 8 and 12 are merged.
23. 2-future(11) = {}; 2-future(13) = {}: merged.
24. 2-future(2) = {aa, ab, bc}; 2-future(3) = {aa, ab, bc}: merged.
25. …
26. The Parameter K: a small K leads to over-generalization, a big K to over-restriction.
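The merging step on the previous slides can be sketched in a few lines of Python. This is a simplified, hypothetical variant of kTail (the names and the trace encoding are illustrative, not from the slides): build the PTA, compute each state's set of outgoing event sequences of length up to k (its "k-future"), and group states with identical k-futures for merging.

```python
from collections import defaultdict

def ktail(traces, k=2):
    """Simplified kTail sketch: PTA construction plus k-future grouping."""
    # Build the PTA: each state is identified by the event prefix leading to it.
    states = {()}
    transitions = {}
    for trace in traces:
        prefix = ()
        for event in trace:
            nxt = prefix + (event,)
            states.add(nxt)
            transitions[(prefix, event)] = nxt
            prefix = nxt

    def k_future(state):
        # All event sequences of length 1..k enabled from this state.
        futures, frontier = set(), {state: ()}
        for _ in range(k):
            step = {}
            for s, seq in frontier.items():
                for (src, ev), dst in transitions.items():
                    if src == s:
                        step[dst] = seq + (ev,)
                        futures.add(seq + (ev,))
            frontier = step
        return frozenset(futures)

    # States sharing the same k-future belong to the same merged state.
    classes = defaultdict(list)
    for s in states:
        classes[k_future(s)].append(s)
    return classes
```

With k=2 and the traces a b c and a b c a b c, the initial state and the state reached after the first a b c get the same 2-future and end up in the same class, introducing the loop that generalizes the model.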
27. kBehavior (behavior-based merging): incremental. First trace: login, home, checkMsg, logout. [Mariani, Pastore, Pezzè. Dynamic Analysis for Diagnosing Integration Faults. TSE, 2011.]
28. kBehavior: a second trace arrives: login, home, checkMsg, timeout.
29. kBehavior: (animation step) the model is extended to accept the new trace.
30. kBehavior: a third trace: login, home, checkMsg, watchVideo, home, checkMsg, logout. K = min length of matched behavior.
31. kBehavior: (animation step) the model is extended, reusing the matched sub-behavior.
32. kBehavior: a fourth trace introduces read and reply events between checkMsg and logout.
33. kBehavior: (animation step) the resulting model.
34. The Parameter K. K determines the degree of generalization. Empirically, behavior-based merging generates models that are more general than state-based merging [Lo et al., JSS, 2012].
35. State-Based Inference of FSM Models. An abstraction function (e.g., <0, ==0, >0) maps the concrete states observed along a trace (Total=0/Elem=0, Total=3/Elem=1, Total=5/Elem=2, Total=0/Elem=0 for the run onLoad, add, add, empty) onto abstract states (<init>; Total==0, Elem==0; Total>0, Elem>0). [Dallmeier, Lindig, Wasylkowski, Zeller: Mining Object Behavior with ADABU. WODA 2006] [Marchetto, Tonella, Ricca: State-Based Testing of Ajax Web Applications. ICST 2008] [Mariani, Marchetto, Nguyen, Tonella. Revolution: Automatic evolution of mined specifications. ISSRE. 2012]
36. The Abstraction Function. The quality of the final model is influenced by: the completeness of the state information that is traced (e.g., tracing numElements and numDistinctElements vs numElements alone); the kind of abstraction implemented by the abstraction function (e.g., <0, =0, >0 vs <-1, =-1, =0, =1, >1).
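A minimal sketch of such an abstraction function and of state-based inference, assuming a concrete state made of the slides' Total and Elem variables (the trace encoding and function names are illustrative):

```python
def abstract_state(total, elem):
    """Hypothetical abstraction function: map each concrete value of the
    (Total, Elem) state onto one of the classes <0, ==0, >0."""
    sign = lambda v: "<0" if v < 0 else ("==0" if v == 0 else ">0")
    return (sign(total), sign(elem))

def mine_state_machine(trace):
    """Sketch of state-based FSM inference: abstract each observed concrete
    state and record events as transitions between abstract states."""
    transitions = set()
    for (state, event, next_state) in trace:
        transitions.add((abstract_state(*state),
                         event,
                         abstract_state(*next_state)))
    return transitions

# The slides' example run: onLoad, add, add, empty.
trace = [((0, 0), "onLoad", (0, 0)),
         ((0, 0), "add",    (3, 1)),
         ((3, 1), "add",    (5, 2)),
         ((5, 2), "empty",  (0, 0))]
```

Mining this trace yields four abstract transitions, with both add events collapsing onto the Total>0, Elem>0 abstract state.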
37. MODELS THAT REPRESENT THE VALUES OF VARIABLES
38. Program for eLectronic Commerce. Executions produce a trace with variable values. [Table: sample pairs of unitCost and totalCost, e.g., unitCost=3, totalCost=4.]
39. Daikon in a nutshell. Traces + template expressions (_ + _ = _, _ < _, _ = _, _ > 0, 1 > _) yield candidate expressions (unitCost = totalCost, unitCost < totalCost, unitCost <= totalCost, unitCost + totalCost > unitCost, …). Preserve expressions with perfect confidence: unitCost <= totalCost, unitCost + totalCost > unitCost, totalCost > 0. Remove redundant properties, leaving: unitCost <= totalCost, totalCost > 0. [Ernst, Cockrell, Griswold, Notkin. Dynamically Discovering Likely Program Invariants to Support Program Evolution. IEEE TSE 2001]
40. The Set of Template Expressions (_ + _ = _, _ < _, _ = _, _ > 0, 1 > _). Expressiveness depends on the template expressions. More template expressions => more candidate expressions => higher computational cost. An approach to deal with polynomial and array expressions has recently been defined [Nguyen et al. ICSE 2012].
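The pipeline of slide 39 can be sketched as follows. This is not Daikon itself, only a toy reimplementation of the idea over two hypothetical templates (_ <= _ and _ > 0); the sample values are illustrative:

```python
import itertools

def mine_invariants(samples, variables):
    """Daikon-style sketch: instantiate every candidate expression over the
    traced variables, then keep only the candidates that hold on ALL
    samples ("perfect confidence")."""
    candidates = []
    # Template _ <= _ instantiated over ordered variable pairs.
    for a, b in itertools.permutations(variables, 2):
        candidates.append((f"{a} <= {b}", lambda s, a=a, b=b: s[a] <= s[b]))
    # Template _ > 0 instantiated over single variables.
    for a in variables:
        candidates.append((f"{a} > 0", lambda s, a=a: s[a] > 0))
    return [text for text, holds in candidates
            if all(holds(s) for s in samples)]

# Illustrative samples consistent with the slides' invariants.
samples = [{"unitCost": 3, "totalCost": 4},
           {"unitCost": 1, "totalCost": 7},
           {"unitCost": 8, "totalCost": 14}]
```

On these samples the surviving candidates are unitCost <= totalCost, unitCost > 0, and totalCost > 0; adding more templates grows the candidate set, which is exactly the cost trade-off the slide points out.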
41. MODELS THAT REPRESENT PARTIAL ORDERING OF EVENTS
42. Mine Temporal Rules. Traces: start, open, close, stop; start, load, stop; start, open, close, stop; begin, …, end. Instantiate template rules (<pre> => <post>) and filter by CONFIDENCE AND SUPPORT THRESHOLDS to obtain temporal rules. [Lo, Khoo, Liu. Mining temporal rules for software maintenance. JSME, 2008] [Yang, Evans, Bhardwaj, Bhat, Das. Perracotta: mining temporal API Rules from Imperfect Traces. ICSE. 2006]
43. Mine Temporal Rules. CONFIDENCE OF A RULE = (# traces in which the rule holds) / (# traces in which the pre applies). Example: start => open has 67% confidence.
44. Mine Temporal Rules. SUPPORT OF A RULE = (# traces in which the rule holds) / (# traces). Example: start => open has 50% support. [Figure: a rule start…stop, open…close with Conf = 100%, Supp = 20%.]
45. Template Rules. Expressiveness depends on the template rules. Confidence and support allow tuning the technique with respect to imperfect traces.
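The confidence and support computations of slides 43 and 44 can be written down directly. This sketch counts per trace, reading a rule as "pre is eventually followed by post" as the slides do (the function name is mine):

```python
def confidence_and_support(traces, pre, post):
    """Confidence = holds / applies; support = holds / total traces."""
    # Traces in which the pre-condition event occurs at all.
    applies = [t for t in traces if pre in t]
    # Traces in which post occurs after the first occurrence of pre.
    holds = [t for t in applies if post in t[t.index(pre) + 1:]]
    confidence = len(holds) / len(applies) if applies else 0.0
    support = len(holds) / len(traces) if traces else 0.0
    return confidence, support

# The four example traces from the slides.
traces = [["start", "open", "close", "stop"],
          ["start", "load", "stop"],
          ["start", "open", "close", "stop"],
          ["begin", "end"]]
```

On these traces start => open gets confidence 2/3 ≈ 67% and support 2/4 = 50%, matching the numbers on the slides.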
46. Steering FSA Models with Temporal Rules. kTail with k=2: the over-generalization problem. Locally, a merge seems to be a good decision; globally, it generates anomalous behaviors.
47. Idea: mine global properties, exploit them when taking decisions locally. From the traces, mine temporal rules (GLOBAL PROPERTIES, e.g., openFile => closeFile, connDB => closeConn) and build the PTA; then apply kTail (e.g., with k=2) BUT prevent state merges that violate the temporal rules (LOCAL DECISIONS). [Lo, Mariani, Pezzè. Automatic Steering of Behavioral Model Inference. ESEC/FSE 2009]
48. EXTENDED MODELS
49. FSA with annotations to represent constraints on parameter values.
50. Traces With Parameter Values: addItem (qt=1, unitCost=1, totalCost=1); addItem (qt=2, unitCost=3, totalCost=6); buy; …
51. GKTail: merge similar traces into a PTA, then derive guards (e.g., Daikon generalizes the m2 observations X=0, Y=0; X=0, Y=15; … into X=0, 0≤Y≤20), producing an EFSM. [Figure: PTA whose transitions m1, m2, m3 are annotated with parameter values such as m1 x=1; m2 x=0, y=0, x=y; m3 z='UK'.] [Lorenzoli, Mariani, Pezzè. Automatic Generation of Software Behavioral Models, ICSE, 2008]
52. GKTail: (animation step) the same figure after guard derivation.
53. Mining Specifications: Different Models for Different Aspects. Different models can capture different types of anomalous behaviors.
54. Specification Mining Tools: Synoptic (http://code.google.com/p/synoptic/); Perracotta (http://www.cs.virginia.edu/perracotta/); Adabu (http://www.st.cs.uni-saarland.de/models/adabu.php3); KLFA (http://www.lta.disco.unimib.it/tools/klfa/).
55. (When) Are Mined Models Precise Enough?
56. Empirical Studies: complexity, i.e., the length of traces, noise, and number of different events in the traces that a technique can handle. [Figure: mining simple FSA, mining extended FSA, mining temporal rules, and mining constraints positioned along a complexity axis.] [Lo, Mariani, Santoro. Learning extended FSA from Software: An Empirical Assessment. JSS, 2012] [Yang, Evans, Bhardwaj, Bhat, Das. Perracotta: mining temporal API Rules from Imperfect Traces. ICSE. 2006] [Nguyen, Marchetto, Tonella. Automated Oracles: An Empirical Study on Cost and Effectiveness. ESEC/FSE, 2013]
57. Empirical Studies: sensitivity, from capturing small differences to capturing only major ones. FSA are good to analytically capture the behavior of small units (e.g., components); temporal rules and constraints are good to capture some behaviors in relatively big applications.
58. Quality of Models vs Number of Traces (component/API/method level). Transition coverage is enough for mining good FSAs [Lo, JSS, 2012]. Additional tests can be generated to improve models [Dallmeier, TSE, 2012].
59. Quality of Models vs Number of Traces (application level). Good FSAs are hard to mine. For other models, several traces are necessary in particularly complex cases [Nguyen, ESEC/FSE, 2013].
60. Take Home About Specification Mining. Think of your research area: if you need models and specifications… and you do not have any, but you have a way of executing your software, specification mining could be an option!
61. Failure Analysis
62. [Diagram: the Application produces Traces, from which a Model is mined; a failing Trace and the Model feed the Analysis.]
63. Failure Analysis Based on Specification Mining. Analysis of (field and regression) failures: BCT [Mariani et al. Dynamic Analysis for Diagnosing Integration Faults. TSE, 2011.]. Analysis of regression failures: Radar [Pastore et al. Dynamic Analysis of Upgrades in C/C++ Software. ISSRE, 2012.]. Producing descriptive reports: AVA [Babenko et al. AVA: automated interpretation of dynamically detected anomalies. ISSTA, 2009.].
64. BCT: a technique for automated identification of functional faults. 1. Capturing behavioral data: monitoring component executions; capturing run-time information. 2. Distilling behavioral models: I/O models (e.g., c.getTotalCost >= c.getCost, c.getQuantity > 0 for addItem(c)); interaction models (e.g., newCart(), addItem(c), getCart() on the Cart component). 3. Failure analysis: regression failures; field failure analysis. [Figure: a system in which unexpected interactions and unexpected values reveal a failure.]
65. Capturing I/O Data. purchase(cart) invokes checkCredentials(usr, taskType) and makeOrder(cart, usr). For checkCredentials(User, TaskType): execution 1: usr.name = "Leonardo", usr.address.streetName = "viale Certosa", …, taskType.type = 2, …; execution 2: usr.name = "Carlo", usr.address.streetName = "viale Manzoni", …, taskType.type = 1, …
66. Capturing Interaction Data. purchase() invokes checkCredentials(usr, taskType) and makeOrder(cart, usr). Execution 1: Auth.checkCredentials(User, TaskType), Shop.makeOrder(Cart, User), …; execution 2: Auth.checkCredentials(User, TaskType), Shop.makeOrder(Cart, User), …; execution 3: Auth.checkCredentials(User, TaskType), Logger.logPermissionDenied(User, TaskType), …
67. Distilling Behavioural Models. I/O data → Daikon → I/O model (e.g., x != null). Interaction data → kBehavior → interaction model (method1 … method4). Each component gets its own I/O and interaction models.
68. Run-Time Verification and Failure Analysis. [Figure: unexpected interactions and unexpected values detected on the running system reveal a failure.]
69. Filtering. Re-execute tests and remove anomalies detected in both passing and failing tests. Regression testing: Country==US is violated by passing regression tests because the new version of the application is available outside the US; violations of this property can be ignored. Field failures: date==20/3/2013 is a spurious property violated by passing regression tests; violations of this property can be ignored. The remaining anomalies are re-arranged according to likely cause-effect relations.
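The filtering step amounts to a set difference between the anomalies of failing and passing runs. A minimal sketch, assuming each anomaly is identified by the violated property (the representation and function name are assumptions):

```python
def filter_anomalies(failing_anomalies, passing_anomalies):
    """Anomalies (violated properties) detected in both passing and failing
    runs are treated as spurious and removed, keeping only the
    failure-specific ones."""
    spurious = set(failing_anomalies) & set(passing_anomalies)
    return [a for a in failing_anomalies if a not in spurious]
```

For instance, if a passing regression test also violates Country == US, that violation is discarded and only the failure-specific anomalies survive.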
70. Run-Time Verification and Failure Analysis: Relating Anomalies. start → initialize → getValue: "I initialize the next component"; "I need a proper value for initialization"; "I do not know the value! I return null"; "null is not a proper value! I return an exception"; "we have an exception! Let's try to terminate safely".
71. Run-Time Verification and Failure Analysis: Relating Failures. start, initialize, getValue, undo, log, closeConnection: log the event and close the connection.
72. Run-Time Verification and Failure Analysis: Relating Failures. One anomaly is the cause of many others! return null value → throw exception → call undo → early close of the connection.
73. Capturing Clusters: dynamic call tree for the Tomcat case study; initial anomaly graph.
74. Capturing Clusters (animation step of the same figure).
75. Capturing Clusters (animation step of the same figure).
76. Output Obtained with BCT for the Tomcat Failure:
ON EXIT from javax.servlet.jsp.JspFactory.getDefaultFactory()
MODEL VIOLATED: returnValue != null = false
FROM org.apache.tomcat.util.modeler.Registry.unregisterComponent(javax.management.ObjectName)
UNEXPECTED CALL TO org.apache.catalina.session.ManagerBase.postDeregister()
FROM org.apache.tomcat.util.modeler.Registry.unregisterComponent(javax.management.ObjectName)
UNEXPECTED CALL TO org.apache.catalina.loader.WebappLoader.postDeregister()
77. Eclipse 3.3 Anomaly Graph. [Figure]
78. Capturing Clusters: dynamic call tree for the Tomcat case study; initial anomaly graph (animation step).
79. Eclipse 3.3 Anomaly Graph. [Figure]
80. Stopping Criterion. Edges with weights greater than a threshold are removed. cohesion(graph) = avg(cohesion(CC) over the connected components); cohesion(CC) = avg(edge weights); a smaller value == better cohesion.
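The cohesion measure on this slide can be written down directly. In this sketch each connected component is reduced to its list of edge weights (how the graph itself is represented is an assumption):

```python
def cohesion(components):
    """cohesion(CC) = average edge weight of the connected component;
    cohesion(graph) = average over the CCs (smaller == better)."""
    cc_cohesion = [sum(weights) / len(weights)
                   for weights in components if weights]
    return sum(cc_cohesion) / len(cc_cohesion)
```

For example, a graph with one component whose edges weigh 1 and 3 and another with a single edge of weight 2 has cohesion avg(2, 2) = 2.0; raising the edge-removal threshold changes the components and hence this value, which drives the stopping criterion.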
81. Resulting Graph. The components are inspected from the biggest to the smallest; the first two graphs are enough to explain the problem!
82. Improvements. Make the analysis specific to the type of considered faults: Radar, failure analysis of regression problems. Produce outputs that better explain the reason of the failure: AVA, automatic analysis of anomalies.
83. Radar in a Nutshell. [Figure: the test suite is run on versions V1 and V2; the traces over program lines are compared against the property chainItems.size > 0 and the variable availableQty.] The test failed because initItems() has not been invoked and chainItems.size == 0.
84. AVA: PRODUCING DESCRIPTIVE OUTPUTS
85. Anomaly Detection with FSA. Model: File.open, File.write, File.close, sortFile, File.delete. Observed trace: File.open, File.write, …, sortFile, File.delete.
86. Anomaly Detection with FSA (animation step: the trace is checked against the model).
87. Anomaly Detection (animation step).
88. Anomaly Detection. Should the path of the file be the problem? Should the sorting be the problem? Should the content of the file be the problem? Maybe the file has not been closed! …
89. (animation step: the FSA and the anomalous trace)
90. Anomaly interpretation: missing event File.close. The file has not been closed!
91. AVA = compare actual and expected behaviors (represented with an FSA) to produce informative outputs.
92. Automata Violations Analysis (AVA). Step 1: identify model violations from the FSA and the trace (branches, tails, final states).
93. Automata Violations Analysis (AVA). Step 2: identify basic interpretations (deletions, insertions, replacements, terminations).
94. Automata Violations Analysis (AVA). Step 3: identify composite interpretations (anticipations, postponements, swaps).
95. Identify Model Violations with FSA Extensions: fsa' = kBehavior(fsa, t); extensions = diff(fsa, fsa'). [Figure: the FSA extended with an ε/File.close branch to accept the trace.]
96. Identify Basic Interpretations. [Figure: the extended FSA and the expected sequences derived from it.]
97. Identify Basic Interpretations. Compare the observed sequence with the expected sequences using alignment algorithms.
98. Expected: File.open, File.write, File.close, sortFile, File.delete. Observed: File.open, File.write, -, sortFile, File.delete. "The application failed because File.close has not been executed" is BETTER THAN "sortFile is anomalous".
99. Different interpretations can be discovered with different alignment strategies. SIMPLE INTERPRETATIONS: deletion (EV1 EV2 EV3 EV4 EV5), insertion (EV1 EV2 EV3 EV EV4 EV5), replacement (EV1 EV2 EV3 EV EV5), termination (EV1 EV2 EV3). COMPOSITE INTERPRETATIONS: anticipation (EV1 EV5 EV2 EV3 EV4), postponement (EV1 EV3 EV4 EV5 EV2), swap (EV1 EV5 EV3 EV4 EV2).
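AVA's basic-interpretation step can be sketched with a generic alignment; here Python's difflib stands in for the alignment algorithms named on the slides (the function name and output format are illustrative, not AVA's actual API):

```python
from difflib import SequenceMatcher

def interpret(expected, observed):
    """Align the observed event sequence with an expected one and report
    missing, unexpected, and replaced events as basic interpretations."""
    findings = []
    for op, e1, e2, o1, o2 in SequenceMatcher(
            a=expected, b=observed, autojunk=False).get_opcodes():
        if op == "delete":          # event(s) in the model, not in the trace
            findings.append(("missing", expected[e1:e2]))
        elif op == "insert":        # event(s) in the trace, not in the model
            findings.append(("unexpected", observed[o1:o2]))
        elif op == "replace":       # one event observed in place of another
            findings.append(("replaced", expected[e1:e2], observed[o1:o2]))
    return findings

expected = ["File.open", "File.write", "File.close", "sortFile", "File.delete"]
observed = ["File.open", "File.write", "sortFile", "File.delete"]
```

On the running example the only finding is the missing File.close event, i.e., the deletion interpretation shown on slide 98; composite interpretations (anticipations, postponements, swaps) would need alignment strategies beyond this simple one.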
100. Tools: BCT (http://www.lta.disco.unimib.it/tools/bct/); Radar (http://www.lta.disco.unimib.it/tools/radar/); AVA (http://www.lta.disco.unimib.it/tools/ava/).
101. CONCLUDING REMARKS
102. Specification Mining can be used to enable several analyses in the frequent case no specification is available.
103. One interesting domain is failure analysis.
104. Specification mining has been applied in several other contexts. Other failure analysis approaches: [Hangal, Lam. Tracking down software bugs using automatic anomaly detection. ICSE 2002], [Yilmaz, Paradkar, Williams. Time will tell. ICSE 2010]. Regression testing: Behavioral Regression Testing [Jin, Orso, Xie. Automated Behavioral Regression Testing. ICST 2010]; Compatibility Testing [Mariani, Papagiannakis, Pezzè. Compatibility and regression testing of COTS-component-based software. ICSE 2007]. Combined with static analysis: [Pradel, Gross. Leveraging test generation and specification mining for automated bug detection without false positives. ICSE 2012], [Dallmeier, Zeller, Meyer. Generating fixes from object behavior anomalies. ASE 2009].
105. But be careful with "positive" behavioral anomalies.
106. References
• Babenko, Mariani, Pastore. AVA: Automated Interpretation of Dynamically Detected Anomalies. ISSTA, 2009.
• Biermann, Feldman. On the Synthesis of Finite State Machines from Samples of Their Behavior. IEEE ToC, 1972.
• Dallmeier, Knopp, Mallon, Fraser, Hack, Zeller. Automatically Generating Test Cases for Specification Mining. TSE, 2012.
• Dallmeier, Lindig, Wasylkowski, Zeller. Mining Object Behavior with ADABU. WODA, 2006.
• Dallmeier, Zeller, Meyer. Generating Fixes from Object Behavior Anomalies. ASE, 2009.
• Ernst, Cockrell, Griswold, Notkin. Dynamically Discovering Likely Program Invariants to Support Program Evolution. IEEE TSE, 2001.
• Gabel, Su. Testing Mined Specifications. ESEC/FSE, 2012.
• Hangal, Lam. Tracking Down Software Bugs Using Automatic Anomaly Detection. ICSE, 2002.
• Jin, Orso, Xie. Automated Behavioral Regression Testing. ICST, 2010.
• Jones, Harrold, Stasko. Visualization of Test Information to Assist Fault Localization. ICSE, 2002.
• Lo, Khoo, Liu. Mining Temporal Rules for Software Maintenance. JSME, 2008.
• Lo, Mariani, Pezzè. Automatic Steering of Behavioral Model Inference. ESEC/FSE, 2009.
• Lo, Mariani, Santoro. Learning Extended FSA from Software: An Empirical Assessment. JSS, 2012.
• Lorenzoli, Mariani, Pezzè. Automatic Generation of Software Behavioral Models. ICSE, 2008.
107. References (continued)
• Marchetto, Tonella, Ricca. State-Based Testing of Ajax Web Applications. ICST, 2008.
• Mariani, Marchetto, Nguyen, Tonella. Revolution: Automatic Evolution of Mined Specifications. ISSRE, 2012.
• Mariani, Papagiannakis, Pezzè. Compatibility and Regression Testing of COTS-Component-Based Software. ICSE, 2007.
• Mariani, Pastore. Automatic Identification of Failure Causes in System Logs. ISSRE, 2008.
• Mariani, Pastore, Pezzè. Dynamic Analysis for Diagnosing Integration Faults. TSE, 2011.
• Nguyen, Kapur, Weimer, Forrest. Using Dynamic Analysis to Discover Polynomial and Array Invariants. ICSE, 2012.
• Nguyen, Marchetto, Tonella. Automated Oracles: An Empirical Study on Cost and Effectiveness. ESEC/FSE, 2013.
• Pastore, Mariani, Goffi, Oriol, Wahler. Dynamic Analysis of Upgrades in C/C++ Software. ISSRE, 2012.
• Pradel, Gross. Leveraging Test Generation and Specification Mining for Automated Bug Detection without False Positives. ICSE, 2012.
• Raz, Koopman, Shaw. Semantic Anomaly Detection in Online Data Sources. ICSE, 2002.
• Yang, Evans, Bhardwaj, Bhat, Das. Perracotta: Mining Temporal API Rules from Imperfect Traces. ICSE, 2006.
• Yilmaz, Paradkar, Williams. Time Will Tell. ICSE, 2010.
108. Questions?
109. ICSE DOCTORAL SYMPOSIUM. S.C. Cheung and L. Mariani. Submission deadline: Nov 22, 2013. Notification: Feb 17, 2014. Camera ready: Mar 14, 2014. Event date: Jun 3, 2014.
