Ph.D. Presentation

Application of Computational Intelligence to Energy Systems, University of Rome "Roma Tre", Department of Informatics and Automation (DIA)


    1. 1. Application of Computational Intelligence to Energy Systems. Matteo De Felice, Scuola Dottorale di Ingegneria, Sezione di Informatica e Automazione, XXIII Ciclo
    2. 2. Outline: What is Computational Intelligence (CI)? What are the possible applications of CI to energy systems?
    3. 3. Quick glance: NN (neural networks), EC (evolutionary computation), FS (fuzzy systems), SI (swarm intelligence), AIS (artificial immune systems)
    4. 4. Quick glance [diagram: the same paradigms under the Soft Computing umbrella]
    5. 5. Quick glance [diagram: the paradigms grouped as Computational Intelligence]
    6. 6. Quick glance: and AI? [diagram relating Computational Intelligence to AI]
    7. 7. Quick glance: my focus [diagram highlighting the focus among NN, EC, FS, SI, AIS]
    8. 8. CI and scientific literature [chart: publications per year, 1994-2010, for Evolutionary Computation, Swarm Intelligence and Artificial Neural Networks]. Data from Thomson Reuters ISI, Computer Science & Technology (January 2010). Two CI journals in the CS top 10 (IF 2009).
    9. 9. Is CI gaining interest? Problems are more and more complex; more computational power is available.
    10. 10. but... Lack of a well-established theory; algorithm fragmentation; tendency toward unsystematic approaches and comparisons. PSO APSO CPSO DPSO EPSO FPSO GPSO HPSO IPSO LPSO MPSO NPSO OPSO PPSO QPSO RPSO SPSO TPSO UPSO VPSO WPSO GA AGA BGA CGA DGA EGA FGA HGA IGA KGA LGA MGA OGA PGA QGA RGA SGA VGA ...
    11. 11. Main applications: 1) Modeling & Forecasting, 2) Optimization
    12. 12. Main applications: 1) Modeling & Forecasting (Neural Networks & Fuzzy Logic), 2) Optimization (Evolutionary Computation)
    13. 13. Modeling & Forecasting
    14. 14. Modeling with NNs: ||F(x) − f(x)|| < ε, ∀x [plot: y = sin(x) over [0, 6.5]]
    15. 15. Modeling with NNs: ||F(x) − f(x)|| < ε, ∀x [plot: y = sin(x) and the NN approximation NN(x)]
    16. 16. Modeling with NNs [block diagram: input u(k) and disturbances feed the system with output y(k); a neural network is trained in parallel against an error measure (MSE)]. Empirical rules for NN typology.
    17. 17. Regression with NN: we use a NN to do non-linear regression.
    18. 18. Time Series Forecasting: we can forecast future data using known past data.
    19. 19. Time Series Forecasting: we can forecast future data using known past data, and other useful (!) information as well.
    20. 20. NN approaches. Direct method: the input at time t feeds a network that outputs y(t+1), y(t+2), ..., y(t+N) at once. Iterative method: the network outputs the value at t+1, which is fed back through a delay as the next input.
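The two multi-step strategies above can be sketched in a few lines; the linear `model` below is a hypothetical stand-in for a trained neural network mapping the value at time t to the value at t+1.

```python
def model(x):
    # toy one-step predictor, standing in for a trained NN
    return 0.5 * x + 1.0

def iterative_forecast(x_t, horizon):
    """Iterative method: feed each prediction back as the next input."""
    preds, x = [], x_t
    for _ in range(horizon):
        x = model(x)
        preds.append(x)
    return preds

def direct_forecast(models, x_t):
    """Direct method: one dedicated model per horizon step."""
    return [m(x_t) for m in models]

print(iterative_forecast(10.0, 3))                           # → [6.0, 4.0, 3.0]
print(direct_forecast([model, lambda x: 0.25 * x + 1.5], 10.0))  # → [6.0, 4.0]
```

The direct method needs N trained models but avoids accumulating one-step errors; the iterative method reuses a single model at the cost of feeding its own errors back in.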
    22. 22. Short-Term Load Forecasting [plot: hourly load data, kW over ~2000 hours]. Goal: up to 24-hours-ahead load prediction.
    23. 23. NN model [plot: daily load profile with lagged inputs y(k−1), y(k), y(k+1)]
    24. 24. NN model: how to choose the best lags?
    25. 25. Data Analysis: 1. ACF (autocorrelation function), 2. Distribution, 3. Multivariate analysis
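Among these checks, the ACF is what guides the lag choice on the previous slide. A minimal sample-autocorrelation sketch (the toy data here is illustrative, not the building's load series):

```python
def acf(series, lag):
    """Sample autocorrelation of a series at a given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t - lag] - mean)
              for t in range(lag, n))
    return cov / var

# toy hourly load with a rough daily-like cycle of period 6
hourly = [10, 12, 30, 25, 20, 15, 10, 12, 31, 26, 19, 14]
print([round(acf(hourly, k), 2) for k in range(4)])
```

Lags where the ACF peaks are natural candidates for the NN inputs y(k−1), y(k), ...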
    26. 26. First question: how can we reduce the variance of Neural Network outputs?
    28. 28. Ensembling
    29. 29. Ensembling: 1. model creation with data subsets (bagging), 2. data sample weights related to their 'importance' (AdaBoost), 3. interaction and cooperation among estimators
    31. 31. Ensembling [Hansen & Salamon, 1990]: majority voting (classification), linear combination (regression): F(x, D) = (1/N) Σ_{i=1}^{N} F_i(x, D)
    32. 32. Ensembling: averaging [diagram]
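The linear-combination rule F(x, D) = (1/N) Σ F_i(x, D) is a one-liner; the member "networks" below are toy functions standing in for N independently trained NNs.

```python
def ensemble_predict(members, x):
    """Average the outputs of all ensemble members at input x."""
    return sum(m(x) for m in members) / len(members)

# toy members: each a stand-in for a trained network with its own bias
members = [lambda x: x + 0.1, lambda x: x - 0.1, lambda x: x + 0.3]
print(ensemble_predict(members, 1.0))
```

Averaging cancels part of the members' individual errors, which is exactly the variance reduction asked about on slide 26.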
    33. 33. Application: STLF of a building located inside ENEA Casaccia R.C. (C59). Presentation at IEEE Symposium on CI Applications in Smart Grid. M. De Felice and X. Yao, "Neural Networks Ensembles for Short-Term Load Forecasting," in IEEE Symposium Series on Computational Intelligence 2011 (SSCI 2011), 2011.
    34. 34. Techniques. Naive predictor. SARIMA (Seasonal ARIMA) model: Φ_P(B^s) φ(B) (1 − B^s)^D (1 − B)^d x_t = α + Θ_Q(B^s) θ(B) e_t. Neural Networks. NN Ensembles.
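The slides do not spell out the naive predictor's exact form; a common choice for hourly load, sketched here under that assumption, is seasonal persistence (repeat the last full season of observations):

```python
def naive_seasonal_forecast(history, season, horizon):
    """Forecast by repeating the most recent full season of the series."""
    last_season = history[-season:]
    return [last_season[h % season] for h in range(horizon)]

hourly_load = [10, 20, 30, 15] * 6  # toy series with period 4
print(naive_seasonal_forecast(hourly_load, 4, 6))  # → [10, 20, 30, 15, 10, 20]
```

For the 24-hours-ahead goal on slide 22, `season` would be 24: each hour of tomorrow is predicted by the same hour today.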
    35. 35. Methodology [plot: 24-hour forecasting window on the hourly load series, with the training part highlighted]. Measured data from September to November 2009. Training (13 weeks) and testing (one week, split into T1 and T2) sets.
    36. 36. Error Measures: absolute error (MAE and MSE), percentage error (MAPE), scaled error (MASE)
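The four measures are straightforward to state in code. The MASE scale factor follows the usual definition (in-sample MAE of the m-periodic naive forecast), an assumption here since the slide only names the acronym:

```python
def mae(y, f):
    return sum(abs(a - b) for a, b in zip(y, f)) / len(y)

def mse(y, f):
    return sum((a - b) ** 2 for a, b in zip(y, f)) / len(y)

def mape(y, f):
    # percentage error; undefined when an actual value is zero
    return 100.0 * sum(abs((a - b) / a) for a, b in zip(y, f)) / len(y)

def mase(y, f, train, m=1):
    # scale by the in-sample MAE of the m-periodic naive forecast
    scale = sum(abs(train[t] - train[t - m])
                for t in range(m, len(train))) / (len(train) - m)
    return mae(y, f) / scale

print(mae([12, 14], [11, 15]), mase([12, 14], [11, 15], [10, 11, 13, 12, 14]))
```

MASE below 1 means the model beats the naive baseline on average, which makes it convenient for comparing series with different scales.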
    38. 38. Negative Correlation Learning [Liu & Yao, 1999]: the backpropagation error function is modified with a penalty term weighted by λ: e_i = Σ_{n=1}^{M} (F_i(x_n) − y_n)^2 + λ p_i
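The penalty p_i is not spelled out on the slide; in Liu & Yao's formulation it is p_i = (F_i(x) − F(x)) Σ_{j≠i} (F_j(x) − F(x)), where F is the ensemble mean. A per-sample sketch under that assumption (λ and the outputs are illustrative):

```python
def ncl_error(outputs, i, y, lam):
    """Per-sample NCL-modified error for ensemble member i."""
    mean = sum(outputs) / len(outputs)
    # penalty rewards disagreement with the ensemble mean
    p_i = (outputs[i] - mean) * sum(o - mean
                                    for j, o in enumerate(outputs) if j != i)
    return (outputs[i] - y) ** 2 + lam * p_i

print(ncl_error([1.0, 2.0, 3.0], 0, 1.5, 0.5))  # → -0.25
```

Since Σ_{j≠i}(F_j − F) = −(F_i − F), the penalty is always ≤ 0 for the member farthest from the mean, pushing members apart and decorrelating their errors.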
    39. 39. Regularized NCL [Chen & Yao, 2009], NCL with regularization: e_i = (1/N) Σ_{n=1}^{N} (F_i(x_n) − y_n)^2 − (1/N) Σ_{n=1}^{N} (F_i(x_n) − F(x_n))^2 + α_i w_i^T w_i
    40. 40. Errors (for each method, first row: test set T1, second row: test set T2; standard deviations in parentheses)
                         MAE           MSE
        NN Average       2.34 (0.79)   10.9 (17.88)
                         2.49 (1.47)   21.67 (59.29)
        NN Ensemble      1.38          2.95
                         1.09          2.4
        RNCL             1.47          3.34
                         1.07          2.82
        Naive            2.11          7.61
                         2.28          6.4
        SARIMA           1.89          5.52
                         1.24          2.17
    47. 47. External data. Introduction of: building occupancy, info about hour, day of the week, working days. NN: additional inputs. SARIMA: additional linear term.
    51. 51. Additional inputs [plot: absolute error over the forecasting window for SARIMA and the MLP Ensemble, with and without external data]
    54. 54. Errors – external data (first row: test set T1, second row: test set T2; standard deviations in parentheses)
                         MAE           MSE
        NN Average       2.46 (0.83)   12.13 (16.80)
                         2.34 (1.00)   11.61 (10.61)
        NN Ensemble      1.42          3.30
                         0.75          1.27
        RNCL             1.33          2.7
                         0.92          1.62
        Naive            2.11          7.61
                         2.28          6.4
        SARIMA           1.91          5.61
                         1.20          2.07
    61. 61. Optimization
    62. 62. Process Optimization [diagram: parameters (X) feed a process in its environment; the process output is measured]. How to improve process 'performance' with respect to its parameters?
    63. 63. Traditional optimization: line-search and trust-region methods (Hessian needed), quasi-Newton methods (Hessian approximated), derivative-free methods
    64. 64. ...but the real world is: 1) noisy, 2) dynamic, 3) hard to investigate
    65. 65. Evolutionary Computation (EC): black-box optimization; single- and multi-objective; also handles discontinuous and non-differentiable functions; population-based meta-heuristics
    71. 71. Application: start-up optimization of a CCPP. Minimization of time, fuel consumption, emissions and thermal stress; maximization of energy output. M. De Felice, I. Bertini, A. Pannicelli, and S. Pizzuti, "Soft Computing based optimisation of combined cycled power plant start-up operation with fitness approximation methods," Applied Soft Computing (to appear). I. Bertini, M. De Felice, F. Moretti, and S. Pizzuti, "Start-Up Optimisation of a Combined Cycle Power Plant with Multiobjective Evolutionary Algorithms," in Applications of Evolutionary Computation, 2010, pp. 151-160.
    72. 72. Project steps: 1. definition of a performance index, 2. software simulator setup, 3. EC algorithm using the simulator
    73. 73. Performance index: process experts interviews, knowledge modeling with fuzzy functions [plots: the five fuzzy performance functions F1–F5]
    74. 74. Single-objective: real-coded GA, Gaussian mutation operator, approximated fitness function to speed up the optimization (from 2070 to 36 hours)
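A minimal sketch of such a real-coded GA with Gaussian mutation, run on a toy sphere function instead of the plant simulator; population size, σ, and the truncation-selection scheme here are illustrative assumptions, not the thesis settings:

```python
import random

def sphere(x):
    """Toy fitness to minimize, standing in for the plant simulator."""
    return sum(v * v for v in x)

def gaussian_ga(fitness, dim=3, pop_size=20, gens=100, sigma=0.3, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                    # lower fitness is better
        elite = pop[: pop_size // 2]             # truncation selection
        # offspring: Gaussian mutation around randomly chosen elite parents
        children = [[g + rng.gauss(0, sigma) for g in rng.choice(elite)]
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)

best = gaussian_ga(sphere)
print(best, sphere(best))
```

Real coding keeps the chromosome as a vector of floats (the plant parameters), so Gaussian mutation is a natural local-search operator; the fitness-approximation trick on the slide would replace `fitness` with a cheap surrogate of the simulator.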
    75. 75. Results
                           Start-up time  Fuel consum.  Energy output  Thermal stress  Emissions
        Experts            21070          143557        2.5·10^9       25              10
        GA                 16569          115070        1.86·10^9      18.8            78.4
        Norm. variation    -25%           -16%          -16%           -30%            2%
    76. 76. Multi-objective [scatter plot: Emissions (mg s/N m3) vs Energy Production (×10^9 KJ) for Real, NSGA-2, WSGA, RAND]
    77. 77. Other projects
    78. 78. Financial Applications: financial trend reversal detection with nature-inspired and machine learning approaches. A. Azzini, M. De Felice, and A. Tettamanzi, "Financial Trend Reversal Detection Problem: a Comparison between Nature-Inspired and Machine Learning Approaches," Natural Computing in Computational Finance, vol. 4, Springer (to appear).
    79. 79. Spatially-Structured EA: evolutionary algorithms on complex networks; diversity and convergence. M. De Felice, S. Meloni, and S. Panzieri, "Effect of Topology on Diversity of Spatially-Structured Evolutionary Algorithms," GECCO 2011: Parallel Evolutionary Systems, 11-16 July 2011, Dublin.
