
Master project - Competitive Co-evolutionary Code-Smells Detection

Presented by Mohamed Boussaa on June 25, 2013. Project conducted in the Search-Based Software Engineering Lab at Missouri University of Science and Technology, USA (SBSE@MST). This work was published at SSBSE 2013 (https://www.researchgate.net/publication/305215765_Competitive_Coevolutionary_Code-Smells_Detection).



  1. 1. Missouri University of Science and Technology (MS&T), USA, and the Higher Institute of Applied Sciences and Technology of Sousse (ISSATSo), Tunisia. Graduation Project. Candidate: Mohamed BOUSSAA. Advisors: Dr Marouane KESSENTINI (MS&T, USA), Mrs Soukeina BENCHIKHA (ISSATSo, Tunisia). Jury members: Mr Imed BOUDRIGA (ISSATSo, Tunisia), Dr Riadh MTIBAA (ISSATSo, Tunisia). Academic year 2012/2013. Date: June 25, 2013.
  2. 2. Outline: Context; Related Work and Problem Statement; Competitive Co-evolutionary Approach for Code-Smells Detection; Evaluation; Conclusion
  3. 3. • Environment: Missouri University of Science and Technology, USA (MS&T) • Laboratory: Software Engineering Laboratory (SEL) • Objectives: Ø Design, build and evaluate new automated software engineering approaches based on the use of search-based techniques Ø Solve real-world software engineering problems
  4. 4. • Software is complex • It changes frequently: Ø Adding new functionalities Ø Identifying and correcting bugs Ø Adapting to environment changes. National Institute of Standards and Technology (NIST): • Software defects cost nearly $60 billion annually • 80% of development costs involve identifying and correcting defects. Improving software quality facilitates software evolution.
  5. 5. q Software code-smells • Design situations that adversely affect the development and maintenance of a software system • Also called anomalies, anti-patterns, bad smells... • Unavoidable bad practices • Can introduce bugs. Examples: Spaghetti code, Functional decomposition, and the Blob.
  6. 6. q The Blob example • One large class monopolizes the behavior of a system • Other classes encapsulate data • Procedural-style design. Symptoms: ü Controller class ü Abnormally large ü No parents and no children ü Uses data classes (small classes)
  7. 7. q The Blob example: Refactoring (Moving Methods, Add Attributes, ...)
  8. 8. q Statement • Problem: code-smells detection requires specific and contextual knowledge that – requires considerable expert knowledge and interpretation – needs a huge number of examples to provide good detection results – is not always fully available – is difficult to express, structure and implement • Solution: use artificial code-smells examples to compensate for this lack of knowledge
  9. 9. Outline: Context; Related Work and Problem Statement; Competitive Co-evolutionary Approach for Code-Smells Detection; Evaluation; Conclusion
  10. 10. q Related work • Symptom-based approaches (Moha et al., '07): Ø Declarative rule specification (Rules → Definition → Symptoms) – Metric combinations need substantial calibration effort – No consensual symptom-based definition of code-smells • Manual approaches (Brown et al., '98): Ø Define code-smells manually – Require huge human (expert) effort – Time-consuming and error-prone
  11. 11. q Related work • Machine-learning approaches (Kessentini et al., '10): Ø Inspired by the Artificial Immune System (AIS) Ø Generate detectors to estimate the risk that classes deviate from "normality". Advantage: no experts' knowledge and interpretation needed. Limitation: high level of false positives.
  12. 12. q Related work • Search-based approaches (Kessentini et al., '11): Ø Generate rules automatically Ø Use code-smell examples (available in defect repositories). Advantage: no experts' knowledge and interpretation needed. Limitation: the detection rules depend on the quality of the examples.
  13. 13. q Problem statement • No consensus on deciding whether a particular design fragment is a code-smell • The same symptom can be associated with many defect types • Symptom evaluation is difficult to automate (threshold value definition, large list of quality metrics, ...) • Prioritizing the list of detected code-smells by type
  14. 14. q Problem statement • Difficulty of finding code-smells examples: Ø Code-smells examples are manually inspected, identified and documented Ø Code-smells are not usually documented by developers Ø A difficult, time-consuming and manual process. Proposal: Competitive Co-evolutionary Approach for Code-Smells Detection (CCEA).
  15. 15. Outline: Context; Related Work and Problem Statement; Competitive Co-evolutionary Approach for Code-Smells Detection; Evaluation; Conclusion
  16. 16. q Co-evolution (Darwin, '29) q Competition • Solutions belong to different species • Subpopulations of individuals co-evolve
  17. 17. q Overall process: two evolutionary algorithms (Evolutionary_Algorithm 1 and Evolutionary_Algorithm 2) each cycle through Population, Selection, Crossover, Mutation and Next Generation; the elite solutions of each population are exchanged for evaluation, and the final output is the best solution (a set of detection rules).
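To make the loop concrete, here is a minimal Python sketch of the competitive co-evolution process described above. It is an illustration only, not the project's implementation: the function names (evolve, evaluate_rules, evaluate_detectors, elites) and the elite-exchange details are assumptions based on the diagram.

```python
def coevolve(rules_pop, detectors_pop, evaluate_rules, evaluate_detectors,
             evolve, n_generations=1000):
    """Sketch of competitive co-evolution between two populations.

    rules_pop         : candidate detection-rule solutions (GP individuals)
    detectors_pop     : candidate artificial code-smell solutions (GA individuals)
    evaluate_rules    : fitness of a rules solution, given elite detectors
    evaluate_detectors: fitness of a detectors solution, given elite rules
    evolve            : applies selection, crossover and mutation to a population
    """
    for _ in range(n_generations):
        # Each population is evaluated against the elite solutions of the other one.
        rule_elites = elites(rules_pop)
        detector_elites = elites(detectors_pop)
        rule_scores = [evaluate_rules(r, detector_elites) for r in rules_pop]
        detector_scores = [evaluate_detectors(d, rule_elites) for d in detectors_pop]

        # Each population then produces its own next generation.
        rules_pop = evolve(rules_pop, rule_scores)
        detectors_pop = evolve(detectors_pop, detector_scores)

    # The final output is the best set of detection rules.
    return max(rules_pop, key=lambda r: evaluate_rules(r, elites(detectors_pop)))


def elites(population, k=5):
    # Placeholder: a real implementation would return the k fittest individuals.
    return population[:k]
```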
  18. 18. q Approach overview: the inputs are code-smells examples, quality metrics and well-designed code examples; Genetic Programming generates the detection rules while a Genetic Algorithm generates the detectors (artificial code-smells examples); the resulting detection rules are then applied to a new system A to evaluate.
  19. 19. q CCEA Inputs: code-smells examples (approach overview diagram)
  20. 20. • Code-smells examples: ü A class that has at least one design defect (Blob, Spaghetti code, Functional decomposition) ü Available in companies' defect repositories ü Manually inspected and documented
  21. 21. q CCEA Inputs: quality metrics (approach overview diagram)
  22. 22. • Quality metrics: ü Measure a property of the software code. Examples: Number of Methods (NOM), Number of Private Fields (NOP), Number of Accessors (NOA), Lack of Cohesion in Methods (LCOM), Number of Children (NOC), Weighted Methods per Class (WMC), Coupling Between Objects (CBO).
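As a small illustration (not taken from the slides), a class under analysis can be summarized by a record of these metric values; the field names follow the metrics listed above, and the example numbers are made up.

```python
from dataclasses import dataclass

@dataclass
class ClassMetrics:
    """Quality-metric profile of one class."""
    nom: int   # Number of Methods
    nop: int   # Number of Private Fields
    noa: int   # Number of Accessors
    lcom: int  # Lack of Cohesion in Methods
    noc: int   # Number of Children
    wmc: int   # Weighted Methods per Class
    cbo: int   # Coupling Between Objects

# Hypothetical values for one suspiciously large class of the analysed system.
blob_candidate = ClassMetrics(nom=120, nop=60, noa=15, lcom=95, noc=0, wmc=310, cbo=40)
```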
  23. 23. q CCEA Inputs: well-designed code examples (approach overview diagram)
  24. 24. • Well-designed examples: ü Examples of reference code ü Source code without design defects ü Contains few known code-smells. Example: JHotDraw, a well-designed reference code.
  25. 25. q CCEA Outputs: detection rules (approach overview diagram)
  26. 26. • Detection rules. Defect types: 1: Blob, 2: Spaghetti code, 3: Functional decomposition. Example rules: 1: If (LOCCLASS ≥ 1500) AND (LOCMETHOD ≥ 129) OR (NMD > 100) Then Blob; 2: If (LOCMETHOD ≥ 151) Then Spaghetti code; 3: If (NPRIVFIELD ≥ 4) AND (NMD = 16) Then Functional decomposition.
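The example rules above are simple metric predicates. The following Python sketch (illustrative only; the function and the input dictionary are assumptions) shows how such rules could be applied to the metric values of a single class.

```python
from typing import Optional

def classify(m: dict) -> Optional[str]:
    """Apply the three example detection rules to one class.

    `m` maps metric names (LOCCLASS, LOCMETHOD, NMD, NPRIVFIELD) to the values
    measured on the class under analysis.
    """
    if (m["LOCCLASS"] >= 1500 and m["LOCMETHOD"] >= 129) or m["NMD"] > 100:
        return "Blob"
    if m["LOCMETHOD"] >= 151:
        return "Spaghetti code"
    if m["NPRIVFIELD"] >= 4 and m["NMD"] == 16:
        return "Functional decomposition"
    return None  # none of the three rules fires

# A very large class with long methods is flagged as a Blob.
print(classify({"LOCCLASS": 2100, "LOCMETHOD": 140, "NMD": 35, "NPRIVFIELD": 2}))
```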
  27. 27. q CCEA Outputs: artificial code-smells examples (approach overview diagram)
  28. 28. • Artificial code-smells examples: ü Represent a code-smell ü Generated automatically by the second population ü Generated based on the notion of deviance from well-designed code (diagram: detectors d1, d2, d3 and reference-code classes s1 to s5)
  29. 29. q CCEA Adaptation (approach overview diagram)
  30. 30. q CCEA Adaptation: solution representation, evaluation and evolutionary operators
  31. 31. q CCEA: Solution representation • Genetic Programming (detection rules): a tree whose leaf nodes (terminals) are metrics with their thresholds and whose internal nodes (functions) are logic operators (AND, OR) • Genetic Algorithm (detectors): a vector representing a set of detectors, where each detector is composed of 7 metrics
  32. 32. • Genetic Programming: solution representation. Example: 1: If (LOCCLASS ≥ 1500) AND (LOCMETHOD ≥ 129) OR (NMD > 100) Then Blob; 2: If (LOCMETHOD ≥ 151) Then Spaghetti code; 3: If (NPRIVFIELD ≥ 4) AND (NMD = 16) Then Functional decomposition.
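A sketch of the tree encoding (leaf = metric with threshold, internal node = AND/OR), with an evaluation function; the class names and structure are illustrative assumptions, not the project's actual data structures.

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Leaf:
    """Terminal node: a metric compared against a threshold, e.g. LOCCLASS >= 1500."""
    metric: str
    op: str          # one of ">", ">=", "=="
    threshold: float

@dataclass
class Node:
    """Internal node: a logical combination of two sub-trees."""
    operator: str    # "AND" or "OR"
    left: "Union[Leaf, Node]"
    right: "Union[Leaf, Node]"

def evaluate(tree, metrics: dict) -> bool:
    """Evaluate a rule tree against the metric values of one class."""
    if isinstance(tree, Leaf):
        v = metrics[tree.metric]
        return {">": v > tree.threshold, ">=": v >= tree.threshold,
                "==": v == tree.threshold}[tree.op]
    left, right = evaluate(tree.left, metrics), evaluate(tree.right, metrics)
    return (left and right) if tree.operator == "AND" else (left or right)

# The Blob rule from the example: (LOCCLASS >= 1500 AND LOCMETHOD >= 129) OR NMD > 100
blob_rule = Node("OR",
                 Node("AND", Leaf("LOCCLASS", ">=", 1500), Leaf("LOCMETHOD", ">=", 129)),
                 Leaf("NMD", ">", 100))
print(evaluate(blob_rule, {"LOCCLASS": 2100, "LOCMETHOD": 140, "NMD": 35}))  # True
```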
  33. 33. q CCEA: Solution representation (continued): the Genetic Algorithm encodes a solution as a vector of detectors, each composed of 7 metrics (example below).
  34. 34. • Genetic Algorithm: solution representation. A solution (vector) is represented by 2 artificial code-smells examples (detectors), each giving values for NOM | NOA | WMC | LCOM | CBO | NOP | NOC: [11, 50, 88, 35, 2, 55, 37] and [17, 21, 60, 24, 8, 11, 65].
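A minimal sketch of this vector encoding (illustrative; the value ranges per metric are an assumption, the slides only hint at ranges such as [1..100] and [1..1000]):

```python
import random

METRICS = ["NOM", "NOA", "WMC", "LCOM", "CBO", "NOP", "NOC"]
RANGES = {"NOM": (1, 100), "NOA": (1, 100), "WMC": (1, 1000), "LCOM": (1, 100),
          "CBO": (1, 100), "NOP": (1, 100), "NOC": (1, 100)}  # assumed bounds

def random_detector():
    """One artificial code-smell example: a vector of 7 random metric values."""
    return [random.randint(*RANGES[m]) for m in METRICS]

def random_solution(max_detectors=150):
    """A GA solution: a random number of detectors (up to the limit used later)."""
    return [random_detector() for _ in range(random.randint(1, max_detectors))]

# The two detectors shown on the slide, written in this encoding:
example_solution = [[11, 50, 88, 35, 2, 55, 37], [17, 21, 60, 24, 8, 11, 65]]
```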
  35. 35. q CCEA Adaptation: evaluation (overview diagram)
  36. 36. q CCEA: Solutions evaluation • Evaluating the artificial code-smells examples (fitness function): o Maximize the dissimilarity score between the generated code-smells and the reference code o Maximize the number of generated code-smell examples not covered by the solutions of the first population • Evaluating the detection-rules solutions (fitness function): o Maximize the coverage of the base of code-smell examples o Maximize the number of covered "artificial" code-smells generated by the second population
  37. 37. • Genetic Programming: fitness function
f_1(S) = \frac{ \dfrac{\sum_{i=1}^{d} c_i}{d} + \dfrac{ \dfrac{\sum_{i=1}^{p} a_i(S)}{t} + \dfrac{\sum_{i=1}^{p} a_i(S)}{p} }{2} }{2}
where a_i(S) = 1 if the i-th detected code-smell exists in the base of examples and 0 otherwise; c_i = 1 if the i-th artificial code-smell is detected as a code-smell and 0 otherwise; p is the number of detected classes; t is the number of defects in the base of examples; d is the number of artificial code-smells examples.
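A Python sketch of this fitness function as reconstructed above; the argument names and the membership test are illustrative assumptions, and in practice the check against the base of examples would also take the code-smell type into account.

```python
def f1(detected_classes, base_of_examples, artificial_smells, rules_detect):
    """Fitness of a detection-rules solution S (sketch of the formula above).

    detected_classes  : the p classes flagged by the rules on the example systems
    base_of_examples  : the t known code-smells used as examples
    artificial_smells : the d artificial code-smells produced by the second population
    rules_detect      : predicate, True if the rules flag a given artificial smell
    """
    p, t, d = len(detected_classes), len(base_of_examples), len(artificial_smells)
    a = sum(1 for cls in detected_classes if cls in base_of_examples)  # sum of a_i(S)
    c = sum(1 for s in artificial_smells if rules_detect(s))           # sum of c_i
    return (c / d + (a / t + a / p) / 2) / 2

# Hypothetical usage: 2 of 4 artificial smells covered, and 2 of the 3 flagged
# classes appear in a base of 6 examples.
print(f1(["A", "B", "C"], ["A", "B", "D", "E", "F", "G"],
         ["s1", "s2", "s3", "s4"], lambda s: s in {"s1", "s2"}))  # 0.5
```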
  38. 38. q CCEA: Solutions evaluation (overview)
  39. 39. • Genetic Programming: evaluation example. Generated code-smells examples: Detector 1, Detector 2, Detector 3, Detector 4. Code-smells in the base of examples (each marked as Blob, FD or SC): Student, Person, University, Course, Classroom, Administration. Detection results (each marked as Blob, FD or SC): Person, Classroom, Professor, Detector 1, Detector 2.
f_1(S) = \frac{ \frac{2}{4} + \frac{ \frac{1}{6} + \frac{1}{3} }{2} }{2} = 0.375
  40. 40. q CCEA: Solutions evaluation (overview)
  41. 41. • Genetic Algorithm: fitness function
f_2 = \frac{ \sum_{i=1}^{n} \sum_{k=1}^{7} \left| M_k(d_i) - M_k(c_j) \right| }{n} + r
where M_k(d_i) is the metric value of the artificial code-smell d_i; M_k(c_j) is the metric value of the well-designed code; n is the number of generated code-smell examples; r is the number of generated code-smell examples not covered by the solutions of the first population.
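A matching sketch of this fitness function; the Manhattan distance over the seven metrics follows the reconstructed formula, while the argument names are assumptions.

```python
def f2(generated_examples, reference_classes, uncovered_count):
    """Fitness of a GA solution (a set of artificial code-smell examples).

    generated_examples : the n detectors of the solution, each a vector of 7 metrics
    reference_classes  : metric vectors of the well-designed code example(s)
    uncovered_count    : r, the number of generated examples not covered by the
                         detection rules of the first population
    """
    n = len(generated_examples)
    distance = sum(abs(dk - ck)
                   for d in generated_examples
                   for c in reference_classes
                   for dk, ck in zip(d, c))
    return distance / n + uncovered_count

# The evaluation example from the next slide: (11 + 22) / 2 + 1 = 17.5
print(f2([[2, 0, 2, 1, 4, 6, 2], [0, 0, 1, 4, 4, 3, 0]],
         [[10, 2, 2, 0, 4, 6, 2]], 1))
```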
  42. 42. q CCEA: Solutions evaluation (overview)
  43. 43. • Genetic Algorithm: evaluation example. Generated code-smells examples (NOM, NOA, WMC, LCOM, CBO, NOP, NOC): [2, 0, 2, 1, 4, 6, 2] and [0, 0, 1, 4, 4, 3, 0]. Well-designed code example (NOM, NOA, WMC, LCOM, CBO, NOP, NOC): [10, 2, 2, 0, 4, 6, 2].
f_2 = \frac{ (8 + 2 + 1) + (10 + 2 + 1 + 4 + 3 + 2) }{2} + 1 = 17.5 (with r = 1)
  44. 44. q CCEA Adaptation: evolutionary operators (overview diagram)
  45. 45. q CCEA: Evolutionary operators (Selection). Population size / 2.
  46. 46. q CCEA: Evolutionary operators (Crossover)
  47. 47. q CCEA: Evolutionary operators (Mutation)
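For the vector-encoded detectors of the second population, standard GA operators apply. The sketch below (assumed, not the authors' exact operators) shows a one-point crossover between two detector vectors and a mutation that re-draws one metric value; for the GP rule trees, the corresponding operators would swap or regenerate sub-trees.

```python
import random

def crossover(parent_a, parent_b):
    """One-point crossover between two detector vectors of equal length."""
    cut = random.randint(1, len(parent_a) - 1)
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

def mutate(detector, low=1, high=100):
    """Replace one randomly chosen metric value with a new random value."""
    child = list(detector)
    child[random.randrange(len(child))] = random.randint(low, high)
    return child

# Example on the two detectors used earlier:
a, b = [11, 50, 88, 35, 2, 55, 37], [17, 21, 60, 24, 8, 11, 65]
print(crossover(a, b))
print(mutate(a))
```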
  48. 48. q CCEA: Project design • Sequence diagram
  49. 49. Sequence diagram (summary): start execution; specify a termination criterion (number of iterations); specify the code-smells examples, the quality metrics and the well-designed code examples; initialize the two populations; loop while the current iteration is below the number of iterations: evaluate both populations, apply the genetic operators to both populations, and produce the two next generations; finally, return the best set of detection rules and validate the results.
  50. 50. Outline: Context; Related Work and Problem Statement; Competitive Co-evolutionary Approach for Code-Smells Detection; Evaluation; Conclusion
  51. 51. q Research questions. Q1: To what extent can the proposed approach detect code-smells efficiently (in terms of correctness and completeness)? Q2: To what extent does the competitive co-evolution approach perform better than the considered single-population approaches?
  52. 52. q Implementation details (metrics collected with a metrics plugin):
System | Number of classes | Number of code-smells
ArgoUML v0.26 | 1358 | 138
Xerces v2.7 | 991 | 82
Ant-Apache v1.5 | 1024 | 103
Azureus v2.3.0.6 | 1449 | 108
  53. 53. q Comparison: • Two existing single-population approaches: Ø Genetic Programming (GP) (Kessentini et al., '11) Ø Artificial Immune System (AIS) (Kessentini et al., '10) • Two measures: precision and recall
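For reference, the two measures are the usual ones (stated here for completeness, not quoted from the slides):

```latex
\mathrm{precision} =
  \frac{|\,\text{detected code-smells} \cap \text{expected code-smells}\,|}
       {|\,\text{detected code-smells}\,|}
\qquad
\mathrm{recall} =
  \frac{|\,\text{detected code-smells} \cap \text{expected code-smells}\,|}
       {|\,\text{expected code-smells}\,|}
```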
  54. 54. q Experimental settings • For the first population: Ø 4-fold cross-validation (across the 4 open-source systems) • For the second population: Ø JHotDraw as the example of reference code • For the genetic algorithms: Ø Population size fixed to 100 Ø Number of generations set to 1000 Ø Maximum of 13 metrics per rule Ø A set of 7 metrics is considered Ø Up to 150 "artificial" code-smells are generated per solution
  55. 55. q Results and performance analyses • Precision
System | CCEA | GP (p-value) | AIS (p-value)
Azureus v2.3.0.6 | 71 | 62 (<0.01) | 65 (<0.01)
ArgoUML v0.26 | 91 | 81 (<0.01) | 77 (<0.01)
Xerces v2.7 | 93 | 84 (<0.01) | 83 (<0.01)
Ant-Apache v1.5 | 93 | 86 (<0.01) | 86 (<0.01)
  56. 56. q Results and performance analyses • Recall
System | CCEA | GP (p-value) | AIS (p-value)
Azureus v2.3.0.6 | 75 | 62 (<0.01) | 66 (<0.01)
ArgoUML v0.26 | 84 | 79 (<0.01) | 88 (<0.01)
Xerces v2.7 | 88 | 83 (<0.01) | 86 (<0.01)
Ant-Apache v1.5 | 92 | 80 (<0.01) | 84 (<0.01)
  57. 57. q Results and performance analyses • The impact of the number of systems in the base of examples on the detection results (Xerces)
  58. 58. q Results and performance analyses • Execution time
Algorithm | Execution time
CCEA | 1 h 22 min
GP | 1 h 13 min
AIS | 1 h 4 min
  59. 59. Outline: Context; Related Work and Problem Statement; Competitive Co-evolutionary Approach for Code-Smells Detection; Evaluation; Conclusion
  60. 60. • A novel approach to the problem of code-smells detection: Ø The first attempt to use competitive co-evolution to detect code-smells Ø Combines different detection techniques using competitive co-evolutionary algorithms • Validation: Ø A set of open-source Java systems Ø Comparison with existing single-population approaches Ø Promising results • Future work: Ø Validation on larger systems Ø Use of other code-smell types and quality metrics Ø Combining more than two single-population approaches
  61. 61. • Publications. This work is accepted for publication at the 5th Symposium on Search-Based Software Engineering (SSBSE 2013). Reference: Boussaa, M., Kessentini, W., Kessentini, M., Bechikh, S., Ben Chikha, S.: Competitive Co-evolutionary Code-Smells Detection. In: Proceedings of the 5th International Symposium on Search-Based Software Engineering (SSBSE '13), St. Petersburg, Russia (2013).
  62. 62. Thank you for your attention 62
  63. 63. q Steps of a Genetic Algorithm. Evolution cycle: Initialization (generation of an initial population), Evaluation (evaluation of the quality of the solutions), Selection (selection of the best solutions), Crossover, Mutation.
  64. 64. q Use of genetic algorithms • Genetic Programming: Ø Find the best combination of quality metrics that covers the base of examples and the artificial code-smell examples Ø Find the best threshold value for each metric. This cannot be done exhaustively because of the huge number of possible metric combinations: the rule-generation process is a combinatorial optimization problem. Since a deterministic search is not practical, the use of a heuristic search is warranted; to explore the search space, we use a global heuristic search by means of genetic programming.
  65. 65. q Use of genetic algorithms • Genetic Algorithm: Ø A random number of artificial code-smells in each solution Ø Random metric values are assigned to each detector Ø Goal: find the best set of artificial code-smells examples that deviates from the reference code.
  66. 66. q Problem statement (examples) • Log classes (no consensus for code-smells detection): a "Log" class responsible for maintaining a log of events in a program, used by a large number of classes, is a common and acceptable practice. However, from a strict code-smell definition, it can be considered a class with abnormally large coupling. • Blob classes (symptom evaluation requires an expert): Blob detection involves information such as class size. Although we can measure the size of a class, an appropriate threshold value is not trivial to define; a class considered large in a given program or community of users could be considered average in another.
  67. 67. q 4-fold cross-validation: with the four example systems (Example 1 to Example 4), detection rules are generated from three of them and the result is compared with the expected code-smells of the remaining one.
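A minimal sketch of this protocol over the four subject systems (function names assumed):

```python
SYSTEMS = ["ArgoUML v0.26", "Xerces v2.7", "Ant-Apache v1.5", "Azureus v2.3.0.6"]

def four_fold(train_and_detect, score):
    """Leave-one-system-out evaluation across the four subject systems.

    train_and_detect(training_systems, held_out) returns the code-smells detected
    on `held_out` by rules derived from `training_systems`; score(detected, held_out)
    compares them with the expected (documented) code-smells of `held_out`.
    """
    results = {}
    for held_out in SYSTEMS:
        training = [s for s in SYSTEMS if s != held_out]
        results[held_out] = score(train_and_detect(training, held_out), held_out)
    return results
```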
  68. 68. Use-case diagram (summary): specifying the code-smells examples, the quality metrics and the well-designed code examples; generating an initial population of detection rules and an initial population of code-smells; evaluating each solution of detection rules and each solution of artificial code-smells; applying genetic techniques; producing the next generation of detection rules and the next generation of code-smells; reaching the termination criterion; finding the best set of detection rules and the best set of artificial code-smells; validation of the results.
