Decision Support Analysis for Software Effort Estimation by Analogy

Jingzhou Li and Guenther Ruhe

  • Slide 11: "4 Heuristics" as a header for the lower table. Introduce a simplified formula explaining how the coefficients were calculated. Be prepared for the following questions: (1) Are there alternatives to using RSA to determine the importance of the attributes? (2) What is the overall effort of the method(s)? (3) What does the name AQUA mean? (4) When do you recommend applying the method (and when better not)? (5) Does the learning need to be done after each new prediction (data point)?
  • Transcript of "Decision Support Analysis for Software Effort Estimation by Analogy"

    1. Decision Support Analysis for Software Effort Estimation by Analogy. Jingzhou Li, Guenther Ruhe, University of Calgary, Canada. PROMISE'07, May 20, 2007.
    2. Outline
       Technology (evaluation): Which technology is suitable for which situations? What is the empirical evidence supporting the decision?
       Software effort estimation by analogy (EBA): What are the optional methods for EBA? What are the basic decision-making problems? What is the empirical evidence to support the decision-making?
       Decision making; empirical studies; empirical study (an example); decision-centric process model of EBA.
    3. 1. Estimation by analogy: an introduction
       A new object and historical data are fed to EBA, which produces an effort estimate. Three steps:
       1. Search for analogs (similar objects)
       2. Determine the closest analogs
       3. Predict by analogy adaptation
       Open questions: How many analogs should we use? What adaptation strategy should we use? What if there are missing values? What similarity measures should we use?
       Research questions: 1. What are the basic tasks a user must accomplish in order to apply or customize EBA? 2. What are the basic decision-making problems and their solution alternatives for applying or customizing EBA?
       (The slide shows two tables: historical objects r_1, ..., r_n with attribute values v_ij over attributes a_1, ..., a_m and known efforts e_1, ..., e_n, and a new object s_g with attribute values v_g1, ..., v_gm and unknown effort "?".)
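The three steps above can be sketched in code. This is a minimal illustration only: the Euclidean distance, the inverse-distance similarity, the fixed k, and all field names are assumptions for the sketch, not the specific choices made in AQUA+.

```python
import math

def similarity(obj, candidate):
    # Step 1 ingredient: a simple distance-based similarity over shared
    # numeric attributes (one of several possible measures; see D6).
    dist = math.sqrt(sum((obj[a] - candidate[a]) ** 2 for a in obj))
    return 1.0 / (1.0 + dist)

def estimate_by_analogy(new_obj, history, k=3):
    # Step 2: determine the k closest analogs by similarity.
    scored = sorted(history, key=lambda h: similarity(new_obj, h["attrs"]),
                    reverse=True)
    analogs = scored[:k]
    # Step 3: analogy adaptation via a similarity-weighted mean of efforts.
    weights = [similarity(new_obj, h["attrs"]) for h in analogs]
    return sum(w * h["effort"] for w, h in zip(weights, analogs)) / sum(weights)

# Toy historical data (illustrative values, not from the study's data sets).
history = [
    {"attrs": {"size": 10, "team": 2}, "effort": 100},
    {"attrs": {"size": 20, "team": 4}, "effort": 210},
    {"attrs": {"size": 12, "team": 2}, "effort": 120},
]
print(round(estimate_by_analogy({"size": 11, "team": 2}, history, k=2)))  # 110
```

The open questions on the slide map directly onto the sketch's hard-coded choices: k, the similarity function, and the weighted-mean adaptation are exactly the decisions the process model makes explicit.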
    4. 2. Decision-centric process model of EBA
       Raw historical data is turned into processed historical data, from which effort estimates are derived for the objects under estimation. The decision points along the way:
       D1. Impact analysis of missing values
       D2. Dealing with missing values
       D3. Object selection
       D4. Discretization of attributes
       D5. Attribute weighting & selection
       D6. Determining similarity measures
       D7. Retrieving analogs
       D8. Determining closest analogs
       D9. Analogy adaptation
       D10. Choosing evaluation criteria
       D11. Comparing EBA methods in general
    5. 3. Decision problems of EBA and solution alternatives
       (where Si.j represents the j-th solution alternative of decision problem Di)
       D1 Impact analysis of missing values: preliminary knowledge
       D2 Dealing with missing values: deletion and imputation techniques; NULL value
       D3 Object selection: hill climbing, simulated annealing, forward and backward sequential selection algorithms
       D4 Discretization of continuous attributes: for RSA-based attribute weighting; based on interval, frequency, or both; other techniques used in machine learning
       D5 Attribute weighting and selection: S5.1 brute-force attribute selection; S5.2 WRAPPER attribute selection; S5.3 Rough Sets based attribute selection; S5.4 attribute weighting using regression; S5.5 attribute weighting using a genetic algorithm; S5.6-S5.9 attribute weighting using Rough Sets (heuristics H1 to H4)
       D6 Determining similarity measures: distance-based; local-global similarity principle
       D7 Retrieving analogs: using similarity measures or rule-based heuristics
       D8 Determining closest analogs: fixed number of analogs without considering the similarity measure; through a learning process
       D9 Analogy adaptation strategy: mean, weighted mean, linear extrapolation
       D10 Choosing evaluation criteria: some conventional criteria, e.g. MMRE, Pred
       D11 EBA comparison methods in general: accuracy-based methods
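The conventional criteria named under D10 have standard definitions and can be sketched directly (the sample actuals and estimates below are made-up numbers for illustration):

```python
def mmre(actuals, estimates):
    # Mean Magnitude of Relative Error: average of |actual - estimate| / actual.
    return sum(abs(a - e) / a for a, e in zip(actuals, estimates)) / len(actuals)

def pred(actuals, estimates, level=0.25):
    # Pred(l): fraction of estimates whose relative error is at most l.
    hits = sum(1 for a, e in zip(actuals, estimates) if abs(a - e) / a <= level)
    return hits / len(actuals)

actuals = [100, 200, 400]
estimates = [110, 150, 390]
# MRE values per project: 0.10, 0.25, 0.025
print(mmre(actuals, estimates))   # 0.125
print(pred(actuals, estimates))   # 1.0 (all within 25%)
```

Lower MMRE and higher Pred(0.25) are better; the choice between them (D10) matters because MMRE is sensitive to a few large relative errors while Pred is not.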
    6. 3. Decision problems of EBA and solution alternatives
       General form of EBA: EBA = F(D1, D2, ..., D11), where the domain of Di is {Si.j}, the solution alternatives of Di, and F is an amalgamation function.
       Customization of EBA: a specific EBA is obtained for a given data set DB by using a (set of) specific solution alternatives Si.j of Di, aggregated through the function F: EBA(DB) = F(D1, D2, ..., D11, DB).
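One hedged way to picture EBA = F(D1, ..., D11) and its customization is to treat each decision Di as a slot and each alternative Si.j as a pluggable value; the slot names, default choices, and the "sparse" data-set type below are a small illustrative subset, not the full model:

```python
# Default solution alternatives for a few decision slots (illustrative).
defaults = {
    "D2": "null_value",     # dealing with missing values
    "D6": "euclidean",      # determining similarity measures
    "D9": "weighted_mean",  # analogy adaptation strategy
}

def customize(data_set_type, overrides):
    # EBA(DB) = F(D1, ..., D11, DB): the specific method is the default
    # choices amalgamated with data-set-specific alternatives.
    return {**defaults, **overrides.get(data_set_type, {})}

# A data-set type with many missing values might swap in imputation for D2.
overrides = {"sparse": {"D2": "imputation"}}
print(customize("sparse", overrides))  # D2 replaced, D6 and D9 kept
```

The point of the formalism survives the simplification: fixing a data set type fixes a tuple of alternatives, and that tuple is the customized method.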
    7. 3. Decision problems of EBA and solution alternatives: customization of EBA
       EBA = F(D1, D2, ..., D11). Data sets are classified according to their characteristics into types 1, 2, ..., k, and each type receives its own customization (which Si.j for which Di?).
       Research question 3: How can empirical studies be used to support the decision-making regarding the customization of EBA?
    8. 4. Decision support in an example EBA method: AQUA+
       Pre-Phase (D2, D6): pre-process the raw historical data (missing values, attribute types, ...) into the data set for AQUA+. S2.3: NULL value; S6.5: local-global similarity, weighted mean of local similarity measures.
       Phase 0, attribute weighting and selection (D4, D5): S4.2: equal-frequency and equal-width discretization; S5.6-S5.9: RSA-based attribute weighting, heuristics H1-H4 (yields attributes & weights).
       Phase 1, learning (D8): S8.2: learning process, yielding a learned accuracy distribution.
       Phase 2, predicting (D7, D9): S7.1: similarity measure; S9.1: adaptation using weighted mean; produces effort estimates for the objects under estimation.
       General form of AQUA+: AQUA+ = F(D2(S2.3), D4(S4.2), D5(S5.6), D6(S6.5), D7(S7.1), D8(S8.2), D9(S9.1))
       For a specific type of data set DB: AQUA+(DB) = ? e.g. S5.6-S5.9: H1-H4?
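The two discretization strategies named in S4.2 can be sketched as follows; the bin count and the toy data are illustrative, and this is not AQUA+'s exact discretizer:

```python
def equal_width_bins(values, k):
    # Equal-width: split the value range [min, max] into k intervals of
    # equal length and map each value to its interval index 0..k-1.
    lo, hi = min(values), max(values)
    width = (hi - lo) / k
    return [min(int((v - lo) / width), k - 1) for v in values]

def equal_frequency_bins(values, k):
    # Equal-frequency: rank the values and cut the ranking into k groups
    # of (roughly) equal size, so each bin holds about the same count.
    ranked = sorted(range(len(values)), key=lambda i: values[i])
    size = len(values) / k
    bins = [0] * len(values)
    for rank, i in enumerate(ranked):
        bins[i] = min(int(rank / size), k - 1)
    return bins

data = [1, 2, 3, 4, 100]
print(equal_width_bins(data, 2))      # [0, 0, 0, 0, 1]: outlier dominates the range
print(equal_frequency_bins(data, 2))  # [0, 0, 0, 1, 1]: bins balanced by count
```

The contrast on the toy data shows why a slide-level decision point exists here: with skewed effort-driver attributes, equal-width bins can collapse most objects into one bin, while equal-frequency bins keep the discretized attribute informative.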
    9. 4. Decision support in an example EBA method: comparative study
       Data sets used in the comparative study:
       Data sets | #Objects | #Attributes | %Missing values | %Non-quantitative attributes | Source
       USP05-RQ | 121 | 14 | 2.54 | 71 | Jingzhou et al., 2005
       USP05-FT | 76 | 14 | 6.8 | 71 | Jingzhou et al., 2005
       ISBSG04-2 | 158 | 24 | 27.24 | 63 | ISBSG, 2004
       Kem87 | 15 | 5 | 0 | 40 | Kemerer et al., 1987
       Mends03 | 34 | 6 | 0 | 0 | Mendes et al., 2003
    10. 4. Decision support in an example EBA method: comparative study
        Comparison of the four attribute weighting heuristics (AccuH[i] per data set and heuristic; "−" marks an empty cell):
        Data sets | H0 | H1 | H2 | H3 | H4
        Kem87 | -0.09 | 0.15 | − | -0.05 | -0.05
        USP05-RQ | -0.79 | 0.03 | − | 0.62 | 0.15
        USP05-FT | 0.22 | 0.42 | -1.53 | 0.52 | 0.37
        ISBSG04-2 | 0.16 | 1.81 | -2.62 | 0.30 | 0.35
        Mends03 | 1.42 | 1.42 | -0.47 | -0.48 | -0.47
        Tentative conclusions:
        1. H1 and H3 performed the best, hence RSA-based attribute weighting is recommended for use by AQUA+.
        2. H1 performed better than H0 for all data sets, hence is recommended for use in AQUA+.
    11. 4. Decision support in an example EBA method: applying the knowledge obtained from the comparative study
        Data sets are grouped into classes, each with a recommended heuristic (H3 is suitable for one class; H1 is suitable for the two others). A new data set is assigned to a class to answer: which heuristic should be used?
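The idea on slide 11, matching a new data set to the class it most resembles and reusing that class's recommended heuristic, might be sketched like this. The class profiles, the two characteristics used, and the mapping of heuristics to profiles are illustrative assumptions, not values from the study:

```python
# Hypothetical class profiles: fraction of missing values and fraction of
# non-quantitative attributes, each tagged with its recommended heuristic.
classes = {
    "H1": {"missing": 0.27, "non_quant": 0.63},  # e.g. an ISBSG04-2-like class
    "H3": {"missing": 0.0,  "non_quant": 0.0},   # e.g. a Mends03-like class
}

def recommend(profile):
    # Assign the new data set to the nearest class profile (squared
    # Euclidean distance) and return that class's heuristic.
    def dist(c):
        return sum((profile[k] - c[k]) ** 2 for k in profile)
    return min(classes, key=lambda h: dist(classes[h]))

print(recommend({"missing": 0.25, "non_quant": 0.6}))   # "H1"
print(recommend({"missing": 0.0, "non_quant": 0.05}))   # "H3"
```

Any classifier would do here; the sketch only shows the shape of the decision support step, which turns the comparative-study knowledge into a recommendation for an unseen data set.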
    12. 5. Decision support and empirical studies
        Empirical studies feed a knowledge base with knowledge about, e.g., which alternatives are suitable for which types of data set. Given a new data set, the DSS for EBA classifies it, applies the knowledge, and customizes EBA for its application.
    13. 6. Summary and future work
        Decision-centric process model; decision problems and solution alternatives; decision support; the example EBA method AQUA+; empirical studies; knowledge base; DSS for EBA.
    14. Major references
        1. G. Ruhe, "Software Engineering Decision Support—A New Paradigm for Learning Software Organizations", Advances in Learning Software Organization, Lecture Notes in Computer Science, Vol. 2640, Springer, 2003, pp. 104-115.
        2. V.R. Basili, G. Caldiera, and H.D. Rombach, "Experience Factory", Encyclopedia of Software Engineering (Ed. J. Marciniak), Vol. 1, 2001, pp. 511-519.
        3. G. Ruhe, "Software Engineering Decision Support and Empirical Investigations - A Proposed Marriage", The Future of Empirical Studies in Software Engineering (A. Jedlitschka, M. Ciolkowski, Eds.), Workshop Series on Empirical Studies in Software Engineering, Vol. 2, 2003, pp. 25-34.
        4. M. Shepperd, C. Schofield, "Estimating Software Project Effort Using Analogies", IEEE Transactions on Software Engineering, Vol. 23, 1997, pp. 736-743.
        5. J.Z. Li, G. Ruhe, A. Al-Emran, and M.M. Richter, "A Flexible Method for Effort Estimation by Analogy", Empirical Software Engineering, Vol. 12, No. 1, 2007, pp. 65-106.
        6. J.Z. Li, G. Ruhe, "Software Effort Estimation by Analogy Using Attribute Weighting Based on Rough Sets", International Journal of Software Engineering and Knowledge Engineering, to appear.
        7. J.Z. Li, A. Ahmed, G. Ruhe, "Impact Analysis of Missing Values on the Prediction Accuracy of Analogy-based Software Estimation Method AQUA", ESEM'07, Madrid, Spain, September 2007.
    15. Thank you! Comments and questions?
    16. A preliminary DSS framework for EBA
        Infrastructure: an interface; machine learning and reasoning tools; general data analysis tools; knowledge representation and acquisition; a virtual DB (database, documents, web contents and hypermedia, other forms of contents); a virtual KB (model base, rule base, domain knowledge); ...
        Decision-centric EBA process: dealing with missing values; attribute weighting and selection; discretization of attributes; object selection; determining similarity measures; retrieving and determining analogs; analogy adaptation strategy; general EBA comparison methods; ...
        Input: objects under estimation; output: effort estimates.