Effectiveness and Efficiency of Particle Swarm Optimization Technique in Inverse Heat Conduction Analysis


    1. Effectiveness and Efficiency of Particle Swarm Optimization Technique in Inverse Heat Conduction Analysis
       Presented by: Soheyl Vakili
       Supervisor: Dr. M. S. Gadala
       October 2009
    2. Inverse Problems
       - Determining unknown causes based on observation of their effects
       Background > Classical Methods > Particle Swarm Optimization > Test Cases and Results > Efficiency > Effectiveness > Conclusion
    3. Inverse Heat Transfer Problem
       - Finding surface values from readings of thermocouples inside the plate
    4. Classical Approach
       - Based on calculating the sensitivity matrix
       - Uses linear minimization theory to match predicted results with measured data
       [See: "Inverse Heat Conduction: Ill-Posed Problems" by Beck et al.]
    5. Objective Function
       - Minimize  S = Σ_{i=1}^{N} (T_i^direct − T_i^exp)²
       - T^direct: temperatures obtained from the direct solution with assumed boundary conditions
       - T^exp: experimentally measured temperatures
       - N: number of data points
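The objective function above is a plain sum of squared mismatches; a minimal sketch in Python (the function name and list-based interface are illustrative, not from the slides):

```python
def objective(t_direct, t_exp):
    """Sum-of-squares mismatch between the direct-solution temperatures
    (computed with assumed boundary conditions) and the measured ones."""
    assert len(t_direct) == len(t_exp)
    return sum((td - te) ** 2 for td, te in zip(t_direct, t_exp))
```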
    6. Our Problems with the Classical Approach
       - Nonphysical overshoots and undershoots
       - Damping of the peak heat flux values
       - Too sensitive to measurement errors
       - Unstable for small time steps
       If you do not experience these problems, stick to the classical approaches. If you do, you may want to try stochastic methods, such as Particle Swarm Optimization.
    7. Particle Swarm Optimization (PSO)
       - A relatively recent global search algorithm, developed by Eberhart and Kennedy in 1995
       - Based on the social behavior of species in nature, e.g. a swarm of birds or a school of fish
       - If a member finds a desirable path, the rest will follow
       - Each member learns not only from its own experience, but also from the others, especially from the best performer
       - Cooperation vs. competition
    8. Basic Particle Swarm Optimization
       1. Randomly initialize the positions and velocities of all particles in the swarm
       2. Evaluate the fitness at each position
       3. Find the best performer
       4. Update all velocities
       5. Update all positions
       6. Repeat (2)–(5) until convergence
    9. Update Equations
       v_i ← c0·v_i + c1·r1·(p_i − x_i) + c2·r2·(p_g − x_i)
       x_i ← x_i + v_i
       - x_i: position of particle i;  v_i: velocity of particle i
       - p_i: particle i's best performance;  p_g: overall best performance
       - r1, r2: uniform random numbers;  c0: self-confidence parameter;  c1, c2: learning factors
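The basic algorithm and its update equations can be sketched as a short, self-contained routine. This is a generic minimization sketch, not the authors' implementation; the parameter values, bounds, and the sphere test function are illustrative assumptions:

```python
import random

def pso(fitness, dim, n_particles=30, n_iters=200,
        c0=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Basic PSO: minimize `fitness` over the box [lo, hi]^dim."""
    lo, hi = bounds
    rng = random.Random(0)
    # Step 1: random positions, zero initial velocities
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    # Steps 2-3: evaluate fitness, record personal and global bests
    p = [xi[:] for xi in x]
    p_val = [fitness(xi) for xi in x]
    g_val = min(p_val)
    g = p[p_val.index(g_val)][:]
    for _ in range(n_iters):            # Steps 4-6: update until convergence
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                v[i][d] = (c0 * v[i][d]
                           + c1 * r1 * (p[i][d] - x[i][d])   # own best
                           + c2 * r2 * (g[d] - x[i][d]))     # global best
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            f = fitness(x[i])
            if f < p_val[i]:
                p_val[i], p[i] = f, x[i][:]
                if f < g_val:
                    g_val, g = f, x[i][:]
    return g, g_val
```

In an inverse heat conduction setting, `fitness` would be the objective function above, with each particle position encoding a candidate set of boundary heat flux components.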
    10. Some Variations of PSO
        - Repulsive Particle Swarm Optimization (RPSO):
          - Does not have the tendency toward the global best performer
          - Instead, has a repulsion between the particle and the best position of a randomly chosen other particle, to prevent the population from being trapped in a local minimum
        - Complete Repulsive Particle Swarm Optimization (CRPSO):
          - RPSO + the tendency toward the global best performer
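One way to sketch the RPSO idea described above is to drop the global-best term from the velocity update and add a repulsive term against the best position of a randomly chosen other particle. The exact coefficients and form used in the thesis are not given on this slide, so this is only an assumed illustration:

```python
import random

def rpso_velocity(v, x, p_own, p_other, c0=0.7, c1=1.5, c3=1.5, rng=random):
    """Hedged sketch of an RPSO-style velocity update for one particle:
    attraction toward its own best position, repulsion away from the best
    position of a randomly chosen other particle, no global-best term."""
    return [c0 * vd
            + c1 * rng.random() * (pd - xd)    # attraction to own best
            - c3 * rng.random() * (qd - xd)    # repulsion from other's best
            for vd, xd, pd, qd in zip(v, x, p_own, p_other)]
```

CRPSO would then simply add the usual `c2 * r2 * (g - x)` global-best term back on top of this update.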
    11. Test Case I: 1D Transient Problem (results shown without and with regularization)
    12. Test Case II: 2D Problem
        - Axisymmetric around the left side
        - Similar to a thermocouple hole inside a plate
        - Top surface is subjected to a heat flux similar to those occurring in real cooling of hot steel
    13. Classical Approach & Large Time Step
    14. Classical Approach & Small Time Step
    15. PSO Is Stable for Small Time Steps
    16. Comparing the Efficiency
        - More efficient = less computational expense = fewer function evaluations (direct problem solutions)
        - The best performances of the three PSO variations are compared with each other and with the best performance of the genetic algorithm (GA), the common method for these problems
        - Because of their stochastic nature, the solution time differs from run to run, so a statistical test is needed
    17. t-Test Results
        Critical t-value = 1.73. If the t-value > 1.73, Method 2 performs better than Method 1 in at least 95% of cases. (GA: genetic algorithm)

        Test #   Method 1   Method 2   Test Case 1   Test Case 2   Test Case 3
        1        GA         PSO        4.88          10.08         8.26
        2        GA         RPSO       6.60          12.52         9.41
        3        GA         CRPSO      10.78         17.76         13.41
        4        PSO        RPSO       1.49          3.32          1.27
        5        PSO        CRPSO      4.63          7.59          7.92
        6        RPSO       CRPSO      3.04          3.45          7.75
    18. Observations
        - PSO is generally faster than the genetic algorithm in solving inverse heat conduction problems (up to 36%)
        - The advantage of PSO over the genetic algorithm is more pronounced in more complex cases (more unknowns and fewer data points)
        - RPSO improves the performance of PSO only in more complex cases
        - CRPSO generally performs better than RPSO
    19. Effectiveness in Handling Noisy Data
        - Noisy data is inevitable
        - To simulate real experiments, a virtual error is added to the inside temperatures:
          T_measured = T_exact + r·σ
        - r: normally distributed random variable with zero mean and unit standard deviation
        - σ: standard deviation of the measurement error
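The noise model above is straightforward to simulate; a minimal sketch (function name and list-based interface are illustrative):

```python
import random

def add_measurement_noise(temps, sigma, seed=None):
    """Simulate thermocouple measurement error as on the slide:
    T_measured = T_exact + r * sigma, with r ~ N(0, 1)."""
    rng = random.Random(seed)
    return [t + rng.gauss(0.0, 1.0) * sigma for t in temps]
```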
    20. Effect of Self-Confidence Parameter (1)
    21. Effect of Self-Confidence Parameter (1)
    22. Effect of Self-Confidence Parameter (2)
    23. Effect of Self-Confidence Parameter (3)
    24. Advantages of PSO
        Over classical approaches:
        - Does not require derivatives → more stable
        - Smaller time steps are possible
        - Easily parallelizable
        Over the genetic algorithm:
        - Simpler to understand and implement
        - Fewer parameters to adjust
        - Lower computational costs (up to 36%)
    25. Disadvantages of PSO
        - Still slow compared to classical approaches
        - Not exactly repeatable in terms of computational cost, which makes comparison difficult
    26. Future Work
        - Applying to real 3D cases
        - Modeling of moving heat flux
        - Comparison with other schemes under different initial and boundary conditions
        - Experimenting with and developing other variations of PSO to reduce the computational expense
    27. Thank You
    28. Future Work
        - Use the temperature data as an input to:
          - Model thermal stresses
          - Model microstructure evolution
        - In combination with hydraulic studies: obtain a better simulation of the jet impingement cooling process
    29. What Do We Get from Our Experiments?
    30. Motivations for Controlled Cooling
        - Obtaining desired properties (e.g. strength, toughness, formability) by means of desired microstructural properties, e.g. grain size
    31. Motivations for Controlled Cooling
        - Obtaining desired microstructural properties by means of controlled cooling on the run-out table
    32. Improved Objective Function
        - Regularization: penalizing nonphysical oscillations in the results
          S = Σ_{i=1}^{N} (T_i^direct − T_i^exp)² + α Σ_i (q_{i+1} − q_i)²
        - q_i: heat flux component
        - α: regularization parameter
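A sketch of the regularized objective, assuming a first-order difference penalty on the heat flux components (the exact penalty form is not spelled out on the slide, so this is an assumption in the spirit of Beck's regularization):

```python
def regularized_objective(t_direct, t_exp, q, alpha):
    """Least-squares mismatch plus a first-order penalty that discourages
    oscillations between neighboring heat flux components q_i."""
    mismatch = sum((td - te) ** 2 for td, te in zip(t_direct, t_exp))
    penalty = sum((q[i + 1] - q[i]) ** 2 for i in range(len(q) - 1))
    return mismatch + alpha * penalty
```

A perfectly smooth flux profile incurs no penalty, so small `alpha` trades a little data fit for much less oscillation.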
    33. World Crude Steel Production
        ["World Steel in Figures, 2007", http://www.worldsteel.org/]
    34. Experimental Setup at UBC
    35. Test Case I: 1D Problem
    36. Speedup
        Method   Test Case 1          Test Case 2          Test Case 3
                 Cost      Speedup    Cost      Speedup    Cost      Speedup
        GA       5074.70   1.00       13498.90  1.00       23903.10  1.00
        PSO      4582.40   1.11       11382.10  1.19       21118.70  1.13
        RPSO     4426.70   1.15       10658.60  1.27       20872.30  1.15
        CRPSO    4155.40   1.22       9938.90   1.36       19611.40  1.22
    37. Motivations for Controlled Cooling
        - Desired mechanical properties, e.g. strength, toughness, formability
        - Desired microstructural properties
        - Controlled cooling (desired time-temperature profile): boundary conditions
    38. Measurement Errors
    39. What to Expect?
        - Classical approaches in solving inverse problems
        - Introduction to the Particle Swarm Optimization (PSO) technique
        - Application to some test cases
        - Efficiency (computational expense)
        - Effectiveness in handling noisy data
        - Conclusion
    40. Statistical t-Test
        - With a significance level of 5% and 10 + 10 − 2 = 18 degrees of freedom, the critical t-value is 1.73
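The degrees of freedom on this slide (10 + 10 − 2 = 18) correspond to the standard pooled-variance two-sample t-test over 10 runs per method; a minimal sketch (function name and sample data are illustrative):

```python
from statistics import mean, variance

def pooled_t(sample1, sample2):
    """Two-sample t-statistic with pooled variance, comparing e.g. the
    computational costs of two methods over repeated stochastic runs.
    With 10 runs each, df = 10 + 10 - 2 = 18 and t_crit = 1.73 at 5%."""
    n1, n2 = len(sample1), len(sample2)
    sp2 = ((n1 - 1) * variance(sample1) + (n2 - 1) * variance(sample2)) / (n1 + n2 - 2)
    return (mean(sample1) - mean(sample2)) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5
```

A t-value above the critical 1.73 then supports the claim that the second method's mean cost is lower than the first's at the 95% level.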
    41. Effect of the Regularization Parameter (1)
    42. Effect of the Regularization Parameter (2)
    43. Effect of the Regularization Parameter (3)
    44. Effect of Using PSO Variants
    45. Key Points to Remember
        - Inverse problem: solving for unknown causes (e.g. boundary and initial conditions, shape, materials) based on known effects
        - First try the classical methods
        - If they are unstable, go for stochastic approaches
        - PSO is a suitable approach (relatively fast and stable in dealing with noise)
        - You can use variations of PSO and tweak them to get better results
    46. PSO Is Stable for Small Time Steps
    47. Test Case III: 3D Steady Problem
    48. 3D Results
        - Temperature contours on the top plane (y = 1)
        - Solid lines: input temperature (expected output)
        - Dotted lines: output of the inverse analysis
    49. Effect of the Regularization Parameter (4)
        - Increasing the value of α from 10^-12 to 10^-10 can improve the handling of noisy data
        - This trend is similar to the one in classical approaches
        - Increasing α increases the required number of iterations
        - Beyond 10^-10, it can result in divergence
    50. Effect of Self-Confidence Parameter (4)
        - A moderate increase in c0 can improve the handling of noise
        - Explanation: increasing c0 results in a more global search of the domain, and increases the method's capability to escape the local minima caused by the noise
        - Increasing c0 beyond 1.3 results in divergence
    51. PSO Variants
        - CRPSO is the most efficient variant (least computational expense)
        - Tuning the self-confidence parameter improves the effectiveness in handling noisy data
        - RPSO is the most effective variant in dealing with noise, followed closely by CRPSO
        - CRPSO is recommended for inverse heat conduction practice, as it combines efficiency and effectiveness
    52. What Is an Inverse Problem?
        Three categories:
        - Backward (retrospective) problem: solution → initial conditions
        - Coefficient inverse problem: solution → coefficients
        - Boundary inverse problem: solution → boundary conditions
