Notes
  • Energy efficiency opportunities by area:
    Power conversion: potential for 10 to 30% improvement.
    Server load/computing: potential for 30 to 50% improvement (check EPA report) via load management, virtualization, sleep modes, load shifting, and server innovation (chip design, efficient power supplies, semiconductor materials).
    Cooling: potential for 30 to 50% improvement (check EPA report).
    Alternative power generation: on-site generation (eliminates transmission losses), CHP applications, use of waste heat for cooling, and use of renewable energy such as PV and fuel cells (could couple with DC power distribution systems).
    Dale's revision: improve "in-the-box" power supply efficiency; improve efficiency of software applications; improve resource utilization (e.g., virtualization); reduce idle power (power management); hardware innovation (e.g., more efficient computations per watt).
    A rough combined-savings sketch follows these notes.
  • Climate Savers Computing Initiative: dedicated to making PCs and servers more energy efficient and, with luck, better for the environment. The initial goal is to encourage computer and component manufacturers to build products that meet or surpass ENERGY STAR. In the coming years the group will tighten its guidelines beyond ENERGY STAR: for example, the 2007 ENERGY STAR specification requires PC power supplies to be at least 80% efficient, while the initiative would require a minimum of 90% by 2010. Similarly, guidelines for 1U and 2U single-socket and dual-socket servers will rise from a minimum of 85% efficiency in 2007 to 92% by 2010. According to the group, by 2010 the initiative will cut greenhouse gas emissions by an amount equal to removing more than 11 million cars from the road or shutting down twenty 500-megawatt coal-fired power plants. Its 35+ industry partners include AMD, Intel, Microsoft, Dell, IBM, Sun, eBay, Google, Starbucks, Yahoo, NRDC, PG&E, and others. A power-supply efficiency sketch also follows these notes.
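The improvement ranges in the first note can be combined into a rough facility-level estimate, as the sketch below does. It is illustrative only: the end-use shares are assumptions loosely anchored to the LBNL benchmark average cited later in the deck (roughly 57% of power reaching IT equipment), and interactions such as a smaller IT load also shrinking the cooling load are ignored.

    # Illustrative only: combine the per-area improvement ranges from the note
    # above. The end-use shares are assumptions, not measured data.
    shares = {"it_load": 0.57, "cooling": 0.30, "power_conversion": 0.13}

    # Midpoints of the ranges quoted in the note (30-50%, 30-50%, 10-30%).
    improvement = {"it_load": 0.40, "cooling": 0.40, "power_conversion": 0.20}

    total_savings = sum(shares[k] * improvement[k] for k in shares)
    print(f"Estimated facility-level savings: {total_savings:.0%}")
    # -> 37% with these assumptions, consistent with the 20-40% "typically
    #    possible" figure on slide 18.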
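The power-supply targets in the second note translate directly into wall-power savings. A minimal sketch, assuming an arbitrary 300 W DC load (the load figure is not from the slides):

    # Waste reduction from raising power-supply efficiency from 80%
    # (2007 ENERGY STAR) to 90% (the initiative's 2010 target).
    dc_load_w = 300.0                 # power delivered to the components

    wall_at_80 = dc_load_w / 0.80     # 375 W drawn from the wall
    wall_at_90 = dc_load_w / 0.90     # about 333 W drawn from the wall

    saved_w = wall_at_80 - wall_at_90
    print(f"Saved per machine: {saved_w:.0f} W ({saved_w / wall_at_80:.0%})")
    # -> about 42 W, an 11% cut in wall draw, before any knock-on savings
    #    in cooling.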

    1. Overview of Data Center Energy Use. Bill Tschudi, LBNL, [email_address]
    2. Data Center Definitions
       • Server closet: < 200 sf
       • Server room: < 500 sf
       • Localized data center: < 1,000 sf
       • Mid-tier data center: < 5,000 sf
       • Enterprise-class data center: 5,000+ sf
       Today's training focuses on larger data centers, but most principles apply to any size center.
    3. EPA Report to Congress: Breakdown of Space
    4. Data Center Efficiency Opportunities. Benchmarking of over 25 centers consistently led to opportunities: there is no silver bullet, but there are lots of silver BBs.
    5. Energy Efficiency Opportunities Are Everywhere
       • Power conversion & distribution: high-voltage distribution; use of DC power; highly efficient UPS systems; efficient redundancy strategies
       • Server load/computing operations: load management; server innovation
       • Cooling equipment: better air management; better environmental conditions; move to liquid cooling; optimized chilled-water plants; use of free cooling
       • Alternative power generation: on-site generation; waste heat for cooling; use of renewable energy/fuel cells
    6. IT Equipment Load Density
    7. Benchmarking Energy End Use
    8. Overall Electrical Power Use in Data Centers (courtesy of Michael Patterson, Intel Corporation)
    9. Performance Varies. The percentage of energy actually doing computing varied considerably across centers.
    10. High-Level Metric: Percentage of Electricity Delivered to IT Equipment. Average: 0.57; higher is better. Source: LBNL benchmarking.
    11. Alternate High-Level Metric: Total Data Center Electrical Demand / IT Equipment Demand (PUE). Average: 1.83; lower is better. Source: LBNL benchmarking.
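Slides 10 and 11 define reciprocal metrics, sketched below with invented wattages; only the two formulas come from the deck.

    # Both high-level metrics from a single pair of measurements.
    it_power_kw = 570.0       # power reaching the IT equipment
    total_power_kw = 1000.0   # total facility power (IT + cooling + conversion)

    it_fraction = it_power_kw / total_power_kw   # slide 10: higher is better
    pue = total_power_kw / it_power_kw           # slide 11: lower is better

    print(f"IT fraction: {it_fraction:.2f}, PUE: {pue:.2f}")   # 0.57, 1.75
    # PUE = 1 / IT fraction. The benchmark averages (0.57 and 1.83) are not
    # exact reciprocals because each is averaged over the centers separately.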
    12. HVAC System Effectiveness. We observed wide variation in HVAC performance.
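The slide does not spell out the metric. One formulation LBNL has used elsewhere is the ratio of IT equipment energy to HVAC system energy (higher is better); treating that as the intended index here is an assumption, as are the example figures.

    # Assumed index: annual IT equipment energy over annual HVAC energy.
    it_energy_kwh = 5_000_000      # annual IT equipment energy
    hvac_energy_kwh = 2_500_000    # annual HVAC energy (chillers, fans, pumps)

    effectiveness = it_energy_kwh / hvac_energy_kwh
    print(f"HVAC effectiveness index: {effectiveness:.1f}")   # 2.0 here
    # The benchmarked centers spread widely on this ratio, which is the
    # point of the slide.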
    13. Benchmark Results Can Help Identify Best Practices. The ratio of IT equipment power to the total is an indicator of relative overall efficiency. Examination of individual systems and components in the centers that performed well helped to identify best practices.
    14. Best HVAC Practices
       • Air Management
       • Air Economizers
       • Humidification Control
       • Centralized Air Handlers
       • Low-Pressure-Drop Systems
       • Fan Efficiency
       • Cooling Plant Optimization
       • Water-Side Economizer
       • Variable-Speed Chillers
       • Variable-Speed Pumping
       • Direct Liquid Cooling
    15. Best Electrical Practices
       • UPS Systems
       • Self-Generation
       • AC-DC Distribution
       • Standby Generation
    16. Best Practices and IT Equipment
       • Power Supply Efficiency
       • Standby/Sleep Power Modes
       • IT Equipment Fans
       • Virtualization (illustrated in the sketch below)
       • Load Shifting
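To make the virtualization bullet concrete, a back-of-the-envelope consolidation sketch; every number below is an assumption chosen for illustration.

    # Consolidating many lightly loaded servers onto a few busy hosts.
    servers = 20
    idle_w, busy_w = 200.0, 300.0   # per-server draw at idle and at high load
    utilization = 0.05              # typical pre-consolidation utilization

    # Linear power model between idle and busy draw.
    before_w = servers * (idle_w + utilization * (busy_w - idle_w))
    hosts = 2                       # hosts needed after consolidation
    after_w = hosts * busy_w

    print(f"Before: {before_w:.0f} W, after: {after_w:.0f} W, "
          f"saved: {1 - after_w / before_w:.0%}")
    # -> 4100 W before vs 600 W after (about 85%), ignoring virtualization
    #    overhead and the matching drop in cooling load.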
    17. Best Practices: Cross-Cutting and Miscellaneous Issues
       • Motor efficiency
       • Right-sizing
       • Variable-speed drives
       • Lighting
       • Maintenance
       • Continuous commissioning and benchmarking
       • Heat recovery
       • Building envelope
       • Redundancy strategies
       • Methods of charging for space and power
    18. Potential Savings
       • The electricity bill will exceed the cost of the IT equipment over its useful life
       • 20-40% savings are typically possible
       • Aggressive strategies can do better than 50% savings
       • Paybacks are short; 1 to 3 years is common (see the sketch below)
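A simple-payback sketch for the claims above; the cost inputs are invented, and only the 20-40% savings range and the 1-3 year payback figure come from the slide.

    # Simple payback = one-time retrofit cost / annual savings.
    annual_bill = 1_000_000.0     # $/yr electricity, example figure
    savings_fraction = 0.30       # inside the 20-40% range on the slide
    retrofit_cost = 600_000.0     # one-time efficiency investment, example

    annual_savings = annual_bill * savings_fraction
    print(f"Simple payback: {retrofit_cost / annual_savings:.1f} years")
    # -> 2.0 years, inside the 1-3 year range the slide calls common.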
    19. Scenarios of Projected Energy Use from the EPA Report to Congress, 2007-2011
    20. The Good News
       • Industry is taking action: IT manufacturers; infrastructure equipment manufacturers
       • Industry associations are leading: ASHRAE, The Green Grid, Uptime Institute, AFCOM, Critical Facilities Roundtable, 7x24 Exchange
    21. IT Industry Taking Action: www.climatesaverscomputing.org, www.thegreengrid.com
    22. More Good News
       • Utilities are getting involved: PG&E, SCE, San Diego; CEE
       • California incentive programs are aggressive
       • The California Energy Commission, DOE, and EPA all have data center initiatives
    23. Design Guidelines Were Developed in Collaboration with PG&E. The guides are available through PG&E's Energy Design Resources website.
    24. Design Guidance Is Summarized in a Web-Based Training Resource: http://hightech.lbl.gov/dctraining/TOP.html
    25. A resources summary sheet is available.
    26. Takeaways
       • "Data center" has various meanings
       • Benchmarking helps identify performance
       • Benchmarking suggests best practices
       • Efficiency varies
       • There is a large opportunity for savings
       • Resources are being developed
