1. Aurora energy efficiency
2. Energy-hungry datacenters
• Electricity used by data centers doubled between 2000 and 2005 alone!
• Servers are becoming more powerful, denser and more numerous, while storage grows larger and larger
• Availability needs are on the rise
ALL OF THE ABOVE EQUALS
• MORE power consumed by the servers AND more consumed for cooling
3. Challenges
• Energy-demanding servers pose several challenges
– Cost
– Energy waste
– Power availability
– Cooling
– Hot spots
– Carbon footprint
4. Motivations for energy efficiency (quote from Meijer Huber, LRZ)
• Energy Efficiency and SuperMUC
• Motivation
  • Academic and governmental institutions in Bavaria use electrical energy from renewable sources
  • We currently pay 15.8 cents per kWh
  • We already know that we will have to pay at least 17.8 cents per kWh in 2013
5. Motivations for energy efficiency (quote from Steve Hammond, NREL)
Motivation
• Data centers are highly energy-intensive facilities
• 10-100x more energy intensive than an office
• Server racks well in excess of 30 kW
• Surging demand for data storage
• ~3% of U.S. electricity consumption
• Projected to double in the next 5 years
• Power and cooling constraints in existing facilities
Sustainable computing: why should we care?
• Carbon footprint
• Water usage
• Mega$ per MW year
• Cost OpEx > IT CapEx!
Thus, we need a holistic approach to sustainability and TCO for the entire computing enterprise, not just the HPC system.
6. PUEs in various data centers (source: Intel)
• Global bank's best data center (of more than 100): 2.25, air
• EPA Energy Star average: 1.91, air/liquid
• Intel average: >1.80, air
• ORNL: 1.25, liquid
• Google: 1.16, liquid coils, evaporative tower, hot-aisle containment
• Leibniz Supercomputing Centre (LRZ): 1.15, direct liquid
• National Center for Atmospheric Research (NCAR): 1.10, liquid
• Yahoo Lockport (PUE declared in project): 1.08, free air cooling + evaporative cooling
• Facebook Prineville: 1.07, free cooling, evaporative
• National Renewable Energy Laboratory (NREL): 1.06, direct liquid + evaporative tower
Note: to get PUEs below 1.8, dedicated infrastructure is needed to implement either free cooling or liquid cooling.
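PUE (Power Usage Effectiveness) is the ratio of total facility energy to the energy delivered to the IT equipment. A minimal sketch of the calculation; the 1,000 kW IT load and 800 kW house load below are illustrative figures, not numbers from the deck:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical example: 1,000 kW of IT load plus 800 kW of cooling,
# power conversion and other overhead gives PUE = 1.8.
print(pue(total_facility_kw=1800.0, it_equipment_kw=1000.0))  # 1.8
```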
7. WAYS TO IMPROVE PUE AND ENERGY EFFICIENCY
8. Ways to improve PUE and energy efficiency: total vs local energy optimization
9. Ways to improve PUE and energy efficiency: acting at different levels
IT equipment level
• Increasing processor efficiency
• Increasing memory efficiency
• Increasing storage efficiency
• Optimizing networks (e.g. 3D-torus vs fat-tree networks)
• Optimizing algorithms
• Optimizing software (e.g. locality...)
• Optimizing job scheduling to maximize processor utilization
Data center level
• About 50% of the energy entering a data centre goes into the «house load», i.e. it is used for ancillary activities not directly related to the IT equipment
• Reducing the house load brings a considerable improvement in data centre energy efficiency (see the sketch below)
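The 50% house-load figure maps directly onto PUE: if half the incoming energy feeds ancillary loads, the facility draws twice what the IT equipment does. A small sketch of that relation (the function name is ours):

```python
def pue_from_house_load(house_load_fraction: float) -> float:
    """PUE implied by the fraction of total facility energy spent on house load.

    The IT share is 1 - house_load_fraction, and PUE = total / IT.
    """
    return 1.0 / (1.0 - house_load_fraction)

print(pue_from_house_load(0.50))  # 2.0  -- half of all energy is house load
print(pue_from_house_load(0.05))  # ~1.05 -- the hot-liquid-cooled target used later
```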
10. 3 main opportunity areas for energy efficiency
1. IT equipment: maximize efficiency (maximize Flops/Watt)
2. Data center: reduce house load (reduce cooling energy consumption, optimize power conversion)
3. Data center or ecosystem: reuse thermal energy
11. MAXIMIZE EFFICIENCY (FLOPS/WATT)
12. Energy efficient design
• Eurora has been designed using standard components, but making choices for the best energy efficiency possible
• Eurora benefits from Eurotech's experience in progressively raising the efficiency of the Aurora system's power conversion chain from 89% to 97%
The approach has been:
• Choice of the most efficient components on the market, i.e. components that minimize energy consumption for the same functionality and performance
• Choice of the best «working points» to stay at the top of the components' efficiency curves
• Water cooling to lower the working temperature of components, maximize their efficiency and eliminate fans
13. Gain DC/DC conversion efficiency
• The DC/DC choice gained over 2% in efficiency, from 95.5% to 98%
• Choice of the optimal current (I) to work at the top of the conversion curves
[Charts: existing DC/DC conversion vs new upgraded DC/DC conversion]
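Seen as conversion losses rather than efficiencies, the 95.5% to 98% step is larger than it looks: losses drop from 4.5% to 2%, i.e. by more than half. A quick check, assuming a hypothetical 100 kW flowing through the stage:

```python
def conversion_loss_w(input_w: float, efficiency: float) -> float:
    """Power dissipated as heat by a DC/DC stage at the given efficiency."""
    return input_w * (1.0 - efficiency)

# Illustrative 100 kW through the DC/DC stage.
old_loss = conversion_loss_w(100_000, 0.955)  # 4500 W lost
new_loss = conversion_loss_w(100_000, 0.98)   # 2000 W lost
print(old_loss, new_loss, old_loss / new_loss)  # losses cut by 2.25x
```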
14. Water cooling and efficiency
178 nodes, AMD Opteron 6128HE CPUs (Magny Cours), 16 GB RAM. Measurements taken by LRZ:
• With air cooling the CPUs operate at about 5°C below the maximum case temperature
• Normal operation of a water-cooled server is with water at 20°C, which is about 40°C below the maximum case temperature
15. Water cooling = no fans, low noise
• Fans consume 5-8% of peak power... per se a small contribution, but the SUM of all the contributions described gives a considerable positive delta in energy efficiency
16. Cold water cooling
• Cold water often needs chillers to be generated, so it impacts the PUE negatively
• Ideally, cold water should come from natural sources such as lakes and rivers, or from natural sources of cold such as cold climates, high mountains or geothermal exchange
• Eurotech can design solutions that accommodate the use of natural sources of cooling
17. ADDITIONAL EFFICIENCY!!! REDUCE HOUSE LOAD
18. Efficiency and economics: energy use in data centers
[Chart: data from APC]
19. Efficiency and economics: "typical" power breakdown in datacenters
[Chart: data from APC]
20. Reducing cooling energy
Ways to reduce cooling energy consumption:
• Air cooling optimization (hot and cold aisle containment...)
• Free cooling: avoid compressor-based cooling (chillers) by using cold air from outside the data center. Possible only in cold climates, or seasonally
• Free cooling with heat exchangers (dry coolers). Dry coolers consume much less energy than chillers!
• Liquid cooling, to increase cooling efficacy and reduce the power absorbed by chillers
• Liquid cooling with free cooling: the liquid is cooled not by chillers but by dry coolers
• Hot liquid cooling, which allows the use of dry coolers all year round, even in warm climates
• Liquid cooling using a natural source of cooling
• Alternative approaches: spray cooling, oil submersion cooling
Eurotech Aurora approach:
• Direct hot water cooling with no chillers, only dry coolers
21. Aurora liquid cooling infrastructure
[Diagram: dry cooler, filter, liquid-to-liquid heat exchanger, pump and internal cooling loops #1 to #12]
22. Controlling the cooling loops
• Pumps consume energy, but they can control the flow rate
• Increasing the flow rate is less energy-demanding than switching on a chiller
[Diagram: chillers, dry coolers, cooling loops #1 and #2, heater and bypass]
23. Advantages of the Eurotech approach
Hot liquid cooling  no chillers  save energy
• Avoid/limit expensive and power-hungry chillers with the only cooling method that almost always requires dry coolers only
• Minimize PUE and hence maximize energy cost savings
• Reuse thermal energy for heating, air conditioning, electrical energy or industrial processes
• "Clean" free cooling: no dust, no filters needed to filter external air
Direct liquid cooling via cold plates  effective cooling
• Allows very limited heat spillage
• Maximizes the effectiveness of cooling, allowing hot water to be used (up to 55°C inlet water)
Comprehensive  more efficiency
• Cools every source of heat in the server (including the power supply)
24. Optimize power conversion
Standard power distribution steps
[Diagram: data from Intel]
25. Moving towards DC reduces steps in power conversion
[Diagram: data from Intel]
26. Aurora power distribution
[Diagram: 230 V mains, optional UPS, conversion to 48 Vdc (97% efficiency), then to 10 V (98% efficiency)]
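With two conversion stages in series, the overall distribution efficiency is the product of the stage efficiencies. A sketch using the two figures from the slide (which stage maps to which voltage step is our reading of the diagram):

```python
from math import prod

# Stage efficiencies from the slide: 97% and 98%.
stage_efficiencies = [0.97, 0.98]

overall = prod(stage_efficiencies)
print(f"overall efficiency: {overall:.3f}")  # ~0.951, i.e. ~95% end to end
```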
27. ADDITIONAL GREEN!!! THERMAL ENERGY RECOVERY
28. Minimize waste: thermal energy re-use
Three-stage cooling + heat recovery
[Diagram: 1 MW computing system (racks 1 to #n) plus 0.13 MW house load; liquid-to-liquid heat exchangers; loop temperatures 20°C / 25°C / 30°C; 0.87 MW recovered between 30°C and 55°C for thermal energy re-use]
PUE < 1 !!
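Strictly speaking, a "PUE below 1" with heat recovery is usually expressed through the related ERE (Energy Reuse Effectiveness) metric, which subtracts the recovered heat from the facility energy. Framing it as ERE is our reading, but the figures are the slide's own:

```python
def ere(it_mw: float, overhead_mw: float, reused_mw: float) -> float:
    """Energy Reuse Effectiveness: (total facility energy - reused energy) / IT energy."""
    return (it_mw + overhead_mw - reused_mw) / it_mw

# Slide figures: 1 MW of IT load, 0.13 MW cooling overhead, 0.87 MW recovered.
print(ere(it_mw=1.0, overhead_mw=0.13, reused_mw=0.87))  # 0.26, well below 1
```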
29. Minimize waste: thermal energy re-use
• The ability to effectively re-use the waste heat from the outlets increases with higher temperatures
• Outlet temperatures starting from 45°C can be used to heat buildings; temperatures starting from 55°C can drive adsorption chillers
• Higher temperatures may even allow trigeneration, the combined production of electricity, heating and cooling
• Warm water can also be used in industrial processes
30. Thermal energy recovery and swimming pools
Consider a swimming pool 50 m long, 4 lanes, 2 m deep, that loses 2°C per day if not heated. The heat exchange system has 90% efficiency.
• Volume of water = 2.50 m x 4 x 50 m x 2 m = 1000 m^3 = 10^6 litres = 10^6 kg
• Specific heat of water = 4186 J/(kg K)
• Target water temperature = 28°C
How much power do we need to keep the swimming pool at 28°C?
P(W) = Q(J)/t(s) = m(kg) x c(J/(kg K)) x deltaT(K) / t(s) = 10^6 kg x 4186 J/(kg K) x 2 K / (24 x 60 x 60 s) = 96,900 W = 96.9 kW
Accounting for the 90% heat-exchange efficiency, we need a supercomputer generating roughly 110 kW. Assuming an energy efficiency of 900 Mflops/W...
...to heat the swimming pool we would need to install a 100 Tflop system. That is, one Eurotech Aurora HPC 10-10 rack.
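A short script re-deriving the slide's arithmetic end to end (the 2.5 m lane width, 90% exchanger efficiency and 900 Mflops/W figure are all the slide's own assumptions):

```python
# Heat needed to offset a 2 K/day temperature drop in a 1000 m^3 pool.
mass_kg = 2.5 * 4 * 50 * 2 * 1000       # pool volume in litres ~= kg of water
c_water = 4186                           # specific heat of water, J/(kg*K)
delta_t = 2.0                            # daily temperature loss, K
seconds_per_day = 24 * 60 * 60

heating_power_w = mass_kg * c_water * delta_t / seconds_per_day
print(f"heating power: {heating_power_w / 1e3:.1f} kW")     # ~96.9 kW

# With a 90%-efficient heat exchanger, the computer must dissipate more.
required_heat_w = heating_power_w / 0.9
print(f"required IT heat: {required_heat_w / 1e3:.0f} kW")  # ~108 kW, roughly 110 kW

# At 900 Mflops/W, that heat corresponds to about a 100 Tflop system.
flops = required_heat_w * 900e6
print(f"system size: {flops / 1e12:.0f} Tflops")            # ~97 Tflops
```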
31. IMPACT ON TOTAL COST OF OWNERSHIP
32. Total cost of ownership
33. Total cost of ownership
A comparison between datacenters: initial cost. Both datacenters with roughly 1 MW of IT equipment installed.
Values in K$ (OPTIMAL air-cooled datacenter, PUE = 1.8 / hot-liquid-cooled datacenter, PUE = 1.05):
• Cost of IT (HW and SW): $8,200 / $8,200
• Facilities (building, raised floor, fire system...): $960 / $410
• Racks and rack management software: $220 / $100
• Liquid cooling: $0 / $620
• Total for network equipment: $710 / $710
• Cooling infrastructure/plumbing: $4,280 / $580
• Electrical: $5,710 / $3,880
• TOTAL INVESTMENT COST: $20,080 / $14,500
34. Total cost of ownership
A comparison between datacenters: annualized TCO. Both datacenters with roughly 1 MW of IT equipment installed.
Values in K$ (OPTIMAL air-cooled datacenter, PUE = 1.8 / hot-liquid-cooled datacenter, PUE = 1.05):
• Cost of energy: $2,690 / $1,060
• Retuning and additional CFD: $5 / $0
• Total outage cost: $440 / $370
• Preventive maintenance: $150 / $150
• Annual facility and infrastructure maintenance: $460 / $220
• Lighting: $4 / $2
• Annualized 3-year capital costs: $3,480 / $3,440
• Annualized 10-year capital costs: $1,420 / $720
• Annualized 15-year capital costs: $100 / $40
• ANNUALIZED TCO: $8,749 / $6,002
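The energy line scales directly with PUE: annual cost ~ IT power x PUE x hours per year x electricity price. A hedged sketch; the $0.17/kWh tariff is an assumption we back-solved so the PUE = 1.8 case lands near the table's $2,690K, and the table's two columns may well assume different tariffs or utilization:

```python
def annual_energy_cost_usd(it_kw: float, pue: float, usd_per_kwh: float) -> float:
    """Yearly electricity cost for a facility running 24/7 at the given PUE."""
    hours_per_year = 8760
    return it_kw * pue * hours_per_year * usd_per_kwh

# Assumed tariff of $0.17/kWh for a 1 MW IT load.
print(annual_energy_cost_usd(1000, 1.8, 0.17))   # ~$2.68M, close to the $2,690K line
print(annual_energy_cost_usd(1000, 1.05, 0.17))  # ~$1.56M at the same tariff
```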
35. GREEN considerations
A comparison between datacenters (OPTIMAL air-cooled datacenter, PUE = 1.8 / hot-liquid-cooled datacenter, PUE = 1.05):
• Total tons of CO2 in 5 years: 26,600 / 15,500
• Tons of CO2 saved: 0 / 11,070
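These totals are consistent with a grid carbon intensity of roughly 0.34 kg CO2 per kWh; that intensity is our back-calculation, not a figure from the deck:

```python
def five_year_co2_tons(it_mw: float, pue: float, kg_co2_per_kwh: float) -> float:
    """CO2 emitted over 5 years of 24/7 operation at the given PUE."""
    kwh = it_mw * 1000 * pue * 8760 * 5
    return kwh * kg_co2_per_kwh / 1000.0

# Assumed grid intensity back-solved from the table: ~0.337 kg CO2/kWh.
print(five_year_co2_tons(1.0, 1.8, 0.337))   # ~26,600 tons (air cooled)
print(five_year_co2_tons(1.0, 1.05, 0.337))  # ~15,500 tons (hot liquid cooled)
```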
36. CO2 equivalent
11,000 tons of saved CO2 are equivalent to:
• 1,500 cars that do not circulate for 1 year
• 11,500 saved adult trees
• 15 km^2 of rain forest left untouched