Data Center Energy Efficiency Best Practices

Data Center Energy Efficiency Best Practices – Insights Into The ROI On Best Practices

Electricity expense has become an increasingly important factor of the total cost of ownership (TCO) for data centers. Energy consumption of typical data centers can be substantially reduced through design of the physical infrastructure and IT architecture.

To view the recorded webinar presentation, please visit http://www.42u.com/energy-efficiency-webinar.htm

Published in: Technology, Business

Slide Notes
  • <J> Just in case you don’t know, and I’m sure most of you listening are painfully aware of these numbers, it’s costly to have downtime.
  • Here is a view of the typical conversion losses… From this slide, we see that as power cascades back from the server to its original source at the utility, we lose wattage at each step of the way. These small wattage savings add up to a significant number when you multiply them across your entire facility, so any improvement you can make to the efficiency of this chain is extremely beneficial. If you implement best practices, you can leverage this information to further improve efficiency.
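    To make the cascade concrete, here is a minimal Python sketch that compounds assumed per-stage efficiencies (illustrative values only, not the Energy Logic figures on the slide) to show how many utility watts are needed per watt delivered to the server:

        # Assumed, illustrative per-stage efficiencies; substitute measured values.
        stages = [
            ("server power supply", 0.80),
            ("PDU", 0.97),
            ("UPS", 0.92),
            ("transformer / switchgear", 0.98),
        ]

        it_load_w = 300.0            # watts consumed by the IT load itself
        draw_w = it_load_w
        for name, efficiency in stages:   # walk upstream toward the utility
            draw_w /= efficiency          # each stage must also supply its own losses
            print(f"after {name}: upstream draw = {draw_w:.1f} W")

        print(f"utility watts per IT watt (before cooling): {draw_w / it_load_w:.2f}")

    With these assumed figures, roughly 1.4 utility watts are drawn for every watt of IT load, before cooling overhead is even counted.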
  • <J> Here are the results of a study done by The Green Grid. They found that, for the data centers measured, just 30% of the energy consumed went to the actual IT equipment. <Michael> And this is only one approach; IDC states 48% for overhead and 52% for IT.
  • <J> I prefer to use independent sources of information as opposed to those supplied by vendors. “Power use efficiency” is an alternative expansion of the PUE acronym; the EPA also characterizes it as the Energy Efficiency Ratio (EER). <Michael> So as we focus on that gray area in the middle, we are looking at numbers in the 30%-50% range.
  • <J> Here is another view of The Green Grid research. <Michael> Lawrence Berkeley National Laboratory did a separate study of 24 California data centers and found the average there to be closer to the objective shown here, and many were close to the 1.5 level. <J> Interesting. California does have a more aggressive energy policy. In fact, we’ve seen PUE measurements as low as 1.21, again at a California-based data center.
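    The benchmark reduces to two one-line formulas; this small Python sketch simply applies the standard Green Grid definitions (total facility power over IT power) to the numbers quoted in the talk:

        def pue(total_facility_kw, it_kw):
            """Power Usage Effectiveness = total facility power / IT equipment power."""
            return total_facility_kw / it_kw

        def dcie(total_facility_kw, it_kw):
            """DCiE = IT power / total facility power, i.e. 1 / PUE."""
            return it_kw / total_facility_kw

        # ~30% of energy reaching IT (the Green Grid figure quoted above) implies:
        print(round(pue(1000, 300), 2), round(dcie(1000, 300), 2))   # 3.33 0.3
        # The best case mentioned in the talk, PUE 1.21, means only 0.21 W of
        # overhead per IT watt.
        print(round(pue(1210, 1000), 2))                             # 1.21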
  • Start with the Survey phase. Note that the Install phase can be minimal if only point-in-time measurements are taken.
  • <Michael> Based on the Efficiency Measurement Life-Cycle process, 42U advises that clients first set baseline values, implement a targeted efficiency best practice, and then measure the efficiency gains, repeating the process with increasingly sophisticated technologies. 42U has established relationships with specialty companies for consolidation and virtualization.
  • Measurement is critical to establishing the savings of these programs. ASHRAE TC 9.9 adjustments for IT: the recommended operating temperature range was expanded from 68-77°F to 64.4-80.6°F, the refresh rate of IT equipment is 2-5 years, and building cooling plant lasts 10-25 years. Modular systems, such as modular UPS systems (recall the cascade on slide 6), permit more precise sizing of infrastructure to match computing loads. Virtualization typically yields 1:10 up to 1:20 server reductions.
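    As a rough feel for the virtualization figure, the sketch below estimates the power avoided by a 1:10 consolidation; the per-server wattage, host wattage, and overhead multiplier are assumptions for illustration only:

        import math

        def consolidation_savings_kw(servers, watts_per_server, ratio, overhead_factor):
            """Rough facility power avoided by consolidating servers onto fewer hosts.

            ratio: e.g. 10 means ten legacy servers collapse onto one host.
            overhead_factor: facility watts per IT watt (a PUE-like multiplier).
            Host wattage is assumed equal to legacy-server wattage for simplicity.
            """
            hosts = math.ceil(servers / ratio)
            before_kw = servers * watts_per_server / 1000 * overhead_factor
            after_kw = hosts * watts_per_server / 1000 * overhead_factor
            return before_kw - after_kw

        # Assumed example: 200 servers at 400 W each, 1:10 consolidation, overhead 2.0
        print(consolidation_savings_kw(200, 400, 10, 2.0))   # 144.0 kW avoided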
  • This shows the various points at which we can measure; the components are broken down into categories. Energy flows from the utility into the facility and then out to the IT equipment inside the racks, and as you can see there are several points along that path where you can monitor. You should design your power infrastructure to support business requirements. These requirements typically involve SLAs, the standard practices and procedures the organization has to follow, or charge-backs: if I am sharing the space, how can I accurately charge our customers for the energy they use?
  • Reduce bypass airflow as well as mixing. Ensure ALL uncontrolled airflow is managed: wall gaps, all service access points, CRAC covers on idle systems. Floor management: hot/cold aisle arrangement, clearing the subfloor plenum of cables and other blockages, CFD tools (vortex example). Maintenance: manage moves, adds, and changes; eliminate idle servers; remove unused cabling; close off and prevent airflow gaps. An ongoing measurement program will identify issues in this area.
  • <M> Here is an example of the measurable impact of adding one 12” blanking panel to the middle of a rack. <J> So this is the best best practice, right? <M> Exactly. Conventional wisdom is correct: blanking panels are essential.
  • <M> Feedback on floor-tile tuning: with instrumentation we can observe results in real time. When airflow is restricted, under-floor pressure increases and rack-top temperatures decrease. <J> This is pretty complex, isn’t it? <M> Yes. Without monitoring and visualization, this process is guesswork: how many tiles, and which ones, should be removed? Other data center clichés are also borne out: eliminate leaks in the floor and manage floor tile permeability.
  • <M> At all rack design points, hot/cold aisle configurations will be more efficient. The diagrams show an end view of the aisles on the left, with the CRAC moved to the side for illustration purposes, and an overhead view illustrating best-practice CRAC locations. Effective cable management pushes savings higher by avoiding airflow blockage in the cabinet; ensure gaps between cabinets are also blocked. <J> Should clients keep empty cabinets with blanking panels in rows to close gaps? <M> Yes, either that or use other containment technologies. Place low-heat racks on the ends of aisles to reduce mixing around the ends of rows. In addition to aisle arrangement, best-practice placement of cooling equipment is an important aspect of this best practice.
  • Levels 1-3 refer to 42U’s branded containment model.
  • <M> Containment is simply controlling airflow in an efficient manner, and it can focus on either the hot aisle or the cold aisle. Hot aisle containment is less efficient but can make OSHA compliance less challenging; cold aisle containment is more efficient but raises the ambient room temperature. 42U has defined a modular containment model to enable clients to achieve immediate savings with a minimal investment. The concept defines three levels of containment: Level 1 closes off aisle ends to eliminate mixing around row ends and to keep chilled air in front of the cabinets; Level 2 extends baffling above the racks to further reduce mixing over rack tops; Level 3 is complete containment, where containment technology covers the entire aisle.
  • <M> Here is an example of one cold aisle containment technology; it achieves the higher end of the savings potential. A range of technologies is available depending on specific client requirements, including existing facilities versus new builds, and can be tailored to the data center configuration.
  • <M> This is essentially coordination of cooling technologies. In larger data centers, multiple cooling systems can actually compete: one CRAC detects low humidity and begins humidifying, a neighboring CRAC detects the increase in humidity and begins dehumidifying, and the result is gross energy waste. <J> Dead band refers to setting the sensitivity points to a wider range, for example from +/– 2 degrees to +/– 10 degrees, so that the equipment does not compete. Another change is moving set-points to the supply side rather than the current approach of measuring at the return.
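    A minimal sketch of the dead-band idea, using assumed set-point and humidity values: each unit acts only when relative humidity drifts outside set-point plus-or-minus the dead band, so widening the band keeps neighbouring CRACs from fighting each other.

        def humidity_action(rh_percent, setpoint=45.0, deadband=10.0):
            """Simple dead-band policy for one CRAC humidity loop (assumed values)."""
            if rh_percent < setpoint - deadband:
                return "humidify"
            if rh_percent > setpoint + deadband:
                return "dehumidify"
            return "hold"

        # Two neighbouring units reading 48% and 42% RH:
        print(humidity_action(48, deadband=2), humidity_action(42, deadband=2))
        # -> dehumidify humidify   (narrow band: the units fight each other)
        print(humidity_action(48, deadband=10), humidity_action(42, deadband=10))
        # -> hold hold             (wider band: no competition)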
  • Rack-level: in-row and close-coupled. Room-level: brief coverage of the pros and cons of water versus refrigerant; water is more efficient and less costly, refrigerant is less risky and has a reduced footprint.
  • Economizers: air-side and water-side. Adiabatic: sprays a mist of water around the heat exchanger to improve efficiency. Heat wheel: reduces air mixing. Dry coolers.
  • The best practice is to size for light-load efficiency, NOT peak-load efficiency. Line-interactive UPS: a widely adopted UPS technology in Europe, not yet adopted in the US because it cannot be integrated into an existing environment. A new UPS is 70% more efficient.
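    To see why light-load sizing matters, here is a sketch using an assumed, made-up efficiency-versus-load curve; real numbers come from the vendor’s data sheet, and power factor is ignored so capacity is treated directly as kW:

        # Assumed efficiency at various load fractions (illustrative only).
        EFF_CURVE = [(0.10, 0.80), (0.25, 0.89), (0.50, 0.93), (0.75, 0.94), (1.00, 0.94)]

        def ups_efficiency(load_fraction):
            """Linear interpolation over the assumed efficiency curve."""
            for (x0, y0), (x1, y1) in zip(EFF_CURVE, EFF_CURVE[1:]):
                if x0 <= load_fraction <= x1:
                    return y0 + (y1 - y0) * (load_fraction - x0) / (x1 - x0)
            return EFF_CURVE[0][1] if load_fraction < EFF_CURVE[0][0] else EFF_CURVE[-1][1]

        it_kw = 60.0
        for capacity_kw in (600.0, 120.0):   # oversized vs. right-sized/modular
            load = it_kw / capacity_kw
            loss_kw = it_kw / ups_efficiency(load) - it_kw
            print(f"{capacity_kw:.0f} kW UPS at {load:.0%} load: loss ~ {loss_kw:.1f} kW")

    Under these assumed numbers, the oversized unit running at 10% load wastes roughly three times as much power as the right-sized one at 50% load.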
  • Now let’s take a step back to the UPS. Commission as you grow versus commissioning the full build-out: modular, shared bus bar, N+1 redundancy internal to the configuration, 12 kVA increments, 60 kVA N+1.
  • <M> Remote power allows you to take a proactive approach to managing your power. The best practice for remote power management is to deploy switched PDUs, IP-enabled UPSs, and monitoring tools at the breaker. This gives you access to the power infrastructure anywhere, anytime you have internet access, alerting based on predetermined events or thresholds, and the ability to act on alerts instantly. It may seem like a daunting and costly undertaking to implement a remote power strategy, but the ROI is rather simple to calculate.
  • <M> Most of you on the line are probably nodding right now. Studies have shown that 72% of all technician calls are resolved by simply bouncing a server. The average cost of such a service call is roughly $500, and the call can take hours depending on the time of day and the distance to the facility; with remote power management you can reduce this service time to minutes. One other factor not listed here is the service-level penalties associated with downtime. The takeaway is that there are many opportunities to improve the efficiency of your data center through a power management strategy; each approach detailed here will yield significant savings, and the approaches can be customized to meet your business requirements.
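    The ROI arithmetic described here is short enough to sketch. The per-cabinet hardware cost and the annual call volume below are assumptions; the 72% reboot-fix rate and the $500 per-call figure come from the talk.

        def remote_power_payback(calls_per_year, hardware_cost,
                                 reboot_fix_rate=0.72, cost_per_call=500.0):
            """Annual truck-roll cost avoided and payback period for switched PDUs."""
            avoided_per_year = calls_per_year * reboot_fix_rate * cost_per_call
            payback_months = hardware_cost / avoided_per_year * 12
            return avoided_per_year, payback_months

        # Assumed: 20 dispatches a year, $3,000 of switched PDUs and monitoring.
        saved, months = remote_power_payback(calls_per_year=20, hardware_cost=3000.0)
        print(f"avoided ~${saved:,.0f}/yr, payback ~{months:.1f} months")  # ~$7,200, ~5 months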
  • These too should be considered, but they are more difficult to measure and it is harder to establish hard ROI figures for them.
  • E3 is a calculator we developed to give our clients a starting point for understanding the cost associated with their energy consumption; E3 stands for Estimated Energy Expense. What we found in many organizations is a disconnect between facilities and IT, and this calculator helps the IT manager understand the cost of energy. The calculator comprises five key factors: the kWh rate, the kW-per-rack design point, the number of racks in the data center, the overhead factor (the infrastructure required to power and cool the equipment), and the uptime requirement (how many 9s of uptime). If you do not have all five factors, we will use industry averages; the only data you need to supply is the number of racks. Contact a 42U efficiency consultant for more information and to evaluate your energy consumption.
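    A minimal sketch of the E3 arithmetic: the default values stand in for the “industry averages” mentioned above, and since the exact weighting of the uptime-requirement input is not spelled out here, it is simply folded into the overhead factor.

        HOURS_PER_YEAR = 8760

        def estimated_energy_expense(num_racks, kw_per_rack=5.0, rate_per_kwh=0.10,
                                     overhead_factor=2.0):
            """Annual energy cost from the E3 inputs (defaults are stand-in averages).

            overhead_factor is facility watts per IT watt; a stricter uptime
            requirement would show up here as additional redundant infrastructure.
            """
            total_kw = num_racks * kw_per_rack * overhead_factor
            return total_kw * HOURS_PER_YEAR * rate_per_kwh

        # Only the rack count is required, as in the real calculator.
        print(f"${estimated_energy_expense(num_racks=50):,.0f} per year")   # ~$438,000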
  • Transcript

    • 1. Data Center Energy Efficiency Best-Practices – Insights Into The ROI On Best-Practices – December 3, 2008
    • 2. Overview
      • Importance of Best-Practices
      • Strategic
      • Fundamentals
      • Advanced Air Flow
      • Cooling Technologies
      • Power Technologies
      • Summary
      • Q&A
    • 3. Downtime Costs
    • 4. Cumulative Loss (Source: Energy Logic)
    • 5. Energy Consumers
      • Infrastructure Consumes More Power Than The IT Equipment
      Source: The Green Grid, Guidelines for Energy-Efficient Datacenters
    • 6. Energy Benchmark
      • Green Grid Benchmark
      • Enables Comparison
      Source: The Green Grid
    • 7. Benchmark PUE Data
      • Best-Case To Date: 1.21
    • 8. Efficiency Measurement Life-Cycle SM
      • Measurement Point Design
      • Measurement Object Database
      • Security
      • Hardware Configuration & Testing
      • Network Configuration & Testing
      • Sensor Installation
      • Software Installation
      • Network Verification
      • Data Collection
      • Analysis
      • System Management
      • Facilities Data
      • IT Infrastructure Data
      • IT Equipment Data
      • Integration Requirements
      • Recommendations
      • Reconfiguration Plan
      • Efficiency Improvements
      Phases: Survey → Plan → Install → Measure → Reconfigure
    • 9. Efficiency Progression
      • Basic Airflow
      • Close-Coupled Cooling
      • Economizers
      • More Efficient Technologies
      • Consolidation
      • Virtualization
      Chart axis: Energy Expense
    • 10. Strategic Approaches
      • Measurement
      • Modular Components
        • 10-30% Savings Potential
      • Virtualization
        • 10-40% Savings Potential
    • 11. Measuring Efficiency – measurement points: Rack (temperature, RH, current, power); Sub-floor (pressure differential); CRAH (supply & return air temp, supply RH, chilled water supply & return temp, fan power, chilled water flow captured at the CRAH); PDU (current, voltage & power)
    • 12. Fundamentals
      • Air Flow Management
        • Blanking Panels
        • Floor Tile Management
      • Floor Management
        • Design
      • Maintenance
    • 13. Blanking Panel Impact
      • Impact of a Single Blanking Panel
      • 1-2% Savings
      Chart labels: Top of rack, Middle of rack
    • 14. Floor Tile Management
      • 1-6% Savings Potential
      • Correct Locations
      • Appropriate Configuration
      • Requires Expertise
      Chart labels: Under-Floor Pressure, Rack-Top Temperatures
    • 15. Hot/Cold Aisle Arrangement
      • Reduces
        • Bypass
        • Mixing
      • 5-12% Energy Savings
      Source: APC, Ten Cooling Solutions, #42
    • 16. Advanced Air Flow
      • Containment
        • Hot Aisle
        • Cold Aisle
        • Level 1-3
      • Coordination
        • Dead Band Set Points
    • 17. Containment Concept
    • 18. Containment
    • 19. Coordination
      • 0-10% Savings Potential
      • Detailed Analysis Required
      • Dead Band Sensitivity Points
    • 20. Cooling Technologies
      • Rack-Level
        • 7-15% Savings Potential
      • Room-Level
        • 4-15% Savings Potential
      • Coolants
    • 21. Rack-Level Technologies – chart labels: Close Coupled, Rear Door, Ambient Air, Supplemental, In-Row, Active Air; kW scale: 10, 15, 20, 28, 35, 40; Chip + Enclosure
    • 22. Room-Level Technologies
      • Make Up Air
      • Room Air
      • Free Cooling
    • 23. Power Technologies
      • 4-10% Savings Potential
      • Sizing Best Practice
      • 3-Phase Power Distribution
      • Line-Interactive UPS
      • High Efficiency UPS
    • 24. High Efficiency UPS
      • Modular
        • Planned Expansion
        • Efficient Capacity Alignment
      • Hybrid Design
    • 25. Remote Power
      • Proactive Approach
      • Internet Access to Infrastructure
      • Predetermined Alerting
      • Immediate Response to Alerts
    • 26. Remote Power ROI
      • 72% of Technician Calls are Resolved with a Re-Boot
      • Average Service Call Cost is $500
      • Downtime Reduced from 1.5 Hours to Minutes
    • 27. Additional Efficiency Programs
      • Lighting
        • LED Lighting
        • Proximity Sensors
        • In-Cabinet
      • Extend Return Ducts
    • 28. Summary
      • Value of Best-Practices
      • Fundamentals
      • Advanced Air Flow
      • Cooling Technologies
      • Power Technologies
    • 29. E3 Estimator
      • Estimation of Data Center Energy Costs
        • Industry Benchmark Basis
        • Multiple Research Resources
      • Availability
        • Spreadsheet Now
        • 42U Website Soon
    • 30. Efficiency Checklist
      • Reduce Your Electric Bills
        • 20-50% for No/Low-Cost Design and Operations Changes
        • 90% for a Systematic Approach
      Source: The Green Grid, Guidelines for Energy-Efficient Datacenters
    • 31. Q&A – For more information, contact your 42U Data Center Efficiency Consultant: 1-800-638-2638 or www.42U.com. For a copy of today’s presentation, email: [email_address]. Thank you.