HVAC for Data Centers

When developing data center energy-use estimates, engineers must account for all sources of energy use in the facility. Most of that consumption comes from obvious sources: computers, the cooling plant and related equipment, lighting, and other miscellaneous electrical loads. Designing efficient and effective data centers is a top priority for consulting engineers, and cooling is a large portion of data center energy use, second only to the IT load. Data centers come in many shapes, sizes, and configurations, but there are several options to help maximize HVAC efficiency and minimize energy consumption. By developing a deep understanding of their client's data center HVAC requirements, consulting engineers can help maintain the necessary availability level of mission critical applications while reducing energy consumption.

  1. HVAC for Data Centers. Sponsored by: Join the discussion about this Webcast on Twitter at #CSEdatacenterHVAC
  2. Today's Webcast Sponsors:
  3. Learning Objectives:
     1. The audience will learn about codes and guidelines, such as ASHRAE 90.1: Energy Standard for Buildings Except Low-Rise Residential Buildings, and U.S. Green Building Council LEED v4
     2. Attendees will learn the relationships between HVAC efficiency and power usage effectiveness (PUE)
     3. Viewers will understand the advantages and drawbacks of using an elevated IT equipment inlet temperature
     4. Viewers will learn how running IT equipment at partial load affects data center energy efficiency.
  4. Presenters: Bill Kosik, PE, CEM, BEMP, LEED AP BD+C, HP Enterprise Business, Technology Services, Chicago, Ill.; Tom R. Squillo, PE, LEED AP, Environmental Systems Design Inc., Chicago, Ill. Moderator: Jack Smith, Consulting-Specifying Engineer and Pure Power, CFE Media, LLC
  5. Energy Code Requirements for Data Centers: Tom R. Squillo, PE, LEED AP, Environmental Systems Design Inc., Chicago, Ill.
  6. Energy Code Requirements for Data Centers
     International Energy Conservation Code: IECC
     • Adopted by eight states and many local jurisdictions
     • Written in enforceable code language
     ASHRAE Energy Standard for Buildings: ASHRAE 90.1
     • Standard instead of code; now written in enforceable language so it can be adopted locally
     • Has more language specific to data centers
     California Building Energy Efficiency Standards: Title 24
  7. Energy Code Requirements for Data Centers
     Where Do They Apply?
     • Check local jurisdiction for specific requirements or city energy codes
     • Many local jurisdictions refer to state codes
     • IECC allows compliance with ASHRAE 90.1 instead; this may be an advantage in some instances
     • Title 24 compliance required in California
  8. Current Energy Code Adoption Status (U.S. DOE)
  9. Projected Energy Code Adoption by End of 2015 (U.S. DOE)
  10. Energy Code Requirements for Data Centers
      IECC-2012:
      • IECC delineates between simple and complex systems; stand-alone DX ac units may fall under the simple category. Only very small units under 33,000 Btu/h capacity do not require economizers
      • All cooling systems with some form of common piping distribution fall under the complex category
      • All complex systems require economizers
  11. Energy Code Requirements for Data Centers
      IECC-2012:
      C403.4.1 Economizers. Economizers shall comply with Sections C403.4.1.1 through C403.4.1.4.
      • This section requires either an air or water economizer
      C403.4.1.1 Design capacity. Water economizer systems shall be capable of cooling supply air by indirect evaporation and providing up to 100% of the expected system cooling load at outdoor air temperatures of 50 F dry bulb/45 F wet bulb and below.
      • Exception for small systems below 33,000 Btu/h
      • Unlike ASHRAE 90.1, IECC has no specific exceptions for data centers that allow lower dry-bulb/wet-bulb temperatures. Dry-coolers are not allowed.
  12. Energy Code Requirements for Data Centers
      ASHRAE 90.1-2010:
      • Data centers, considered "process cooling," were excluded from the requirements of ASHRAE 90.1-2007
      • The 2010 version eliminates the "process cooling" exemption and adds specific language for computer rooms
      • To comply with the IECC-2012 code, ASHRAE 90.1-2010 may be used instead.
  13. Energy Code Requirements for Data Centers
      ASHRAE 90.1-2010:
      6.5.1 Economizers. Each cooling system that has a fan shall include either an air or water economizer meeting the requirements of Sections 6.5.1.1 through 6.5.1.4.
      For data centers, economizers are not required for:
      • Small fan-cooling units less than 135,000 Btu/h or 65,000 Btu/h, depending on climate zone
      • Extremely hot and humid climate zones
      • Buildings with no central CHW plant, in which the total computer room cooling capacity is less than 250 tons
      • Buildings with a central CHW plant, where the computer room cooling load is less than 50 tons
      • Where cooling towers are not allowed
      • Additions of less than 50 tons of computer room capacity to an existing building
      • Various essential facilities (national defense, emergency response, etc.)
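The exceptions above lend themselves to a simple screening check. Below is a minimal sketch in Python using only the thresholds summarized on this slide; the function and parameter names are hypothetical, and the actual ASHRAE 90.1-2010 text contains additional conditions and climate-zone tables, so this is an illustration rather than a compliance tool.

```python
# Screening sketch for the computer-room economizer exceptions listed above.
# Thresholds come from the slide summary; the standard itself has more nuance.

def economizer_required(unit_capacity_btuh,
                        climate_zone_is_hot_humid,
                        has_central_chw_plant,
                        computer_room_load_tons,
                        cooling_towers_allowed=True,
                        is_small_addition=False,
                        is_essential_facility=False,
                        small_unit_limit_btuh=65_000):
    """Return True if an air or water economizer appears to be required."""
    # Small fan-cooling units (65,000 or 135,000 Btu/h depending on climate zone)
    if unit_capacity_btuh < small_unit_limit_btuh:
        return False
    # Extremely hot and humid climate zones are exempt
    if climate_zone_is_hot_humid:
        return False
    # Capacity-based exemptions differ with and without a central CHW plant
    if not has_central_chw_plant and computer_room_load_tons < 250:
        return False
    if has_central_chw_plant and computer_room_load_tons < 50:
        return False
    # Remaining exceptions listed on the slide
    if not cooling_towers_allowed or is_small_addition or is_essential_facility:
        return False
    return True

# Example: 200,000 Btu/h unit, central CHW plant, 400-ton computer room load
print(economizer_required(200_000, False, True, 400))  # True: economizer needed
```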
  14. Energy Code Requirements for Data Centers
      ASHRAE 90.1-2010:
      6.5.1.2 Water Economizers
      6.5.1.2.1 Design capacity. Water economizer systems shall be capable of cooling supply air by indirect evaporation and by providing up to 100% of the expected system cooling load at outdoor air temperatures of 50 F dry bulb/45 F wet bulb and below.
      • For data centers, the requirement is relaxed slightly to allow 100% economizer cooling at 40 F dry bulb/35 F wet bulb and below
      • The code also allows dry-coolers for data centers, but they must provide 100% economizer cooling at 35 F dry bulb
  15. Energy Code Requirements for Data Centers
      Important Changes in ASHRAE 90.1-2013:
      6.5.1.2 Water Economizers
      • For data centers, the outdoor temperature limits for 100% water side economization are not a single condition but are based on the individual climate zones
      6.5.1.6 Economizer Humidification Impact. Systems with hydronic cooling and humidification systems designed to maintain inside humidity at a dew-point temperature greater than 35 F shall use a water economizer if an economizer is required by Section 6.5.1.
      • This essentially bans air side economizer systems for most data center systems if using a prescriptive approach.
  16. Energy Code Requirements for Data Centers
      Important Changes in ASHRAE 90.1-2013:
      6.6 Alternative Compliance Path
      • For data centers, the HVAC systems can comply by meeting minimum PUE requirements instead of Section 6.5 (Prescriptive Path)
      • The minimum PUE values are based on the climate zone and range from 1.30 to 1.61
      • The PUE calculation is based on the Green Grid recommendation document dated May 2011
      • Option 1: Use peak PUE calculation (at 50% and 100% IT load)
      • Option 2: Use annual PUE, calculated with an approved hourly energy analysis program (DOE-2, BLAST, EnergyPlus, etc.)
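For reference, PUE is the ratio of total facility energy (or power) to IT equipment energy (or power). Here is a minimal sketch of the two compliance options above, with hypothetical function names and illustrative numbers; the Green Grid document defines the measurement points and categories in more detail.

```python
# Sketch of the two PUE compliance options described on the slide.
# PUE = total facility energy (or power) / IT equipment energy (or power).

def peak_pue(total_facility_kw, it_kw):
    """Option 1: instantaneous PUE at a given IT load point (e.g., 50% or 100%)."""
    return total_facility_kw / it_kw

def annual_pue(hourly_total_kwh, hourly_it_kwh):
    """Option 2: annual PUE from 8,760 hourly simulation results."""
    return sum(hourly_total_kwh) / sum(hourly_it_kwh)

# Example peak checks at 100% and 50% IT load (illustrative numbers only)
print(peak_pue(total_facility_kw=2600.0, it_kw=2000.0))  # 1.30
print(peak_pue(total_facility_kw=1400.0, it_kw=1000.0))  # 1.40
```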
  17. Energy Code Requirements for Data Centers
      Title 24-2013: Highlights Specific to Data Centers
      • Data centers are exempt from normal economizer requirements
      • Air or water economizer required
      • Air economizer must provide 100% economization at 55 F dry bulb
      • Water economizer must provide 100% economization at 40 F dry bulb/35 F wet bulb
      • Economizer exceptions exist for small systems
      • Nonadiabatic humidification (steam, infrared) is prohibited
  18. Energy Code Requirements for Data Centers
      Title 24-2013: Highlights Specific to Data Centers
      • Variable speed supply fans required for DX systems over 5 tons and all CHW systems
      • Supply fans shall vary airflow rate as a function of actual load
      • Containment required for data centers with a design load exceeding 175 W/sq ft
      • Containment exception for expansions and racks below 1 W/sq ft
      • Chilled water plants can have no more than 300 tons of air-cooled chillers
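The variable speed fan requirement matters because, under the ideal fan affinity laws, fan power varies roughly with the cube of airflow, so turning airflow down with the load saves far more energy than the airflow reduction alone suggests. A small illustrative sketch follows (ideal-law only; real fans and drives deviate from the pure cube law at low speed).

```python
# Ideal fan affinity law: power scales with the cube of airflow.
# Illustration of why supply airflow should track the actual load.

def fan_power_fraction(airflow_fraction):
    """Ideal-law fan power as a fraction of design power."""
    return airflow_fraction ** 3

for flow in (1.0, 0.8, 0.6, 0.5):
    print(f"{flow:.0%} airflow -> {fan_power_fraction(flow):.1%} of design fan power")
# e.g., at 50% airflow the ideal fan power is only 0.5**3 = 12.5% of design
```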
  19. Relationships Between HVAC Efficiency and Power Usage Effectiveness (PUE): Bill Kosik, PE, CEM, BEMP, LEED AP BD+C, HP Enterprise Business, Technology Services, Chicago, Ill.
  20. Levels of Optimization (Data Center, Climate, Synergies, Convergence)
      • Extreme regional variations in CO2 from electricity generation
      • Determine appropriate balance of water and electricity usage
      • Climate WILL impact HVAC energy use; select sites carefully
      • Use evaporative cooling where appropriate
      • Economizer strategy will be driven from climate characteristics
      • Design power and cooling modularity to match IT growth
      • Plan for power-aware computing equipment
      • Use aisle containment or direct-cooled cabinets
      • Design in the ability to monitor and optimize PUE in real time
      • Push for highest supply temperatures and lowest moisture levels
      • Identify tipping point of server fan energy/inlet temperature
      • Minimize data center footprint by using high-density architecture
  21. Typical Data Center Cooling Strategies
      Mechanical Cooling Options:
      • Air Cooled Chiller: coupled to chilled water coil in air handling unit (AHU)
      • Direct Expansion: packaged in AHU or separate DX coil and remote condenser
      • Water Cooled Chiller: coupled with CHW coil in AHU; typical with open cooling tower and flat plate HX
      • Air Cooled Chiller: coupled to CHW coil in AHU; typical with closed cooling tower
      Cooling Configuration Options:
      • Interior AHU: direct outside air with direct evaporative cooling
      • Exterior AHU: indirect outside air and indirect evaporative cooling
      • CRAH Unit: perimeter air delivery with chilled water coil
      • In-Row Unit: close-coupled in rack containment system with module fans and CHW coil
      • Rear Door HX: individual rack door CHW heat exchanger; passive system with no fan
      • Overhead Coil Module: CHW coils; passive system with no fan
      Air Side Economizers:
      • System 1 – DEC: Direct Outside Air Economizer with Direct Evaporative Cooling
      • System 2 – IEC: Recirculating (Closed) Air System with Indirect Evaporative Cooling
      • System 3 – IDEC: Recirculating (Closed) and Direct Outside Air System, 2-Stage Indirect-Direct Evaporative Cooling
      • System 4 – IOA+EC: Indirect Air-to-Air HX with Direct Evaporative Cooling in Secondary Air
      Water Side Economizers:
      • System 5 – OCT+FP HX: Direct (Open) Evaporative Cooling Tower with Flat Plate HX
      • System 6 – CCT w/Spray: Indirect (Closed) Cooling Tower with Spray
  22. PUE Varies with Climate: 0.35 difference in PUE based on climate and cooling system type
  23. Impacts of Climate on Economization Strategy
      This analysis shows the percent of total ton-hours that require mechanical cooling. The graphs depict two systems, each with two water temperatures (12°C and 18°C):
      1. Air-cooled chiller with dry-cooled economization
      2. Air-cooled chiller with evaporative-cooler economization
  24. Impacts of Climate on Economization Strategy: Santiago, CHL
      The indirect evaporative and indirect air cooling systems have the lowest compressor energy used to cool the facility. The air-cooled DX and air-cooled chiller systems have the highest compressor energy. The air-cooled chiller with economizer is in the middle of the other options.
  25. HVAC System and PUE
      Five HVAC options are shown. Each option was analyzed using the same input parameters, such as climate attributes and air and water temperatures. Each system performs differently based on the inherent strategies used to cool the data center and provide the proper airflow. For each option, the annual HVAC consumption and annual PUE are shown.
  26. HVAC System and PUE
      Two options are shown. The only difference between the two options is the location of the data centers; everything else, including the power, cooling, and ancillary systems, is modeled identically. Month-by-month PUE values are shown, as well as monthly HVAC, electrical losses, lighting, and other electrical energy. The energy consumption of the data center located in Singapore is markedly higher because of the hot and humid climate.
  27. Elevated IT Equipment Temperatures: Tom R. Squillo, PE, LEED AP, Environmental Systems Design Inc., Chicago, Ill.
  28. Elevated IT Equipment Inlet Temperatures
      Legacy Data Center Design
      • Data center supply air set to 50 F to 55 F
      • DX systems cycled on/off and fought each other, with little capacity control or communication
      • Chilled water temperatures of 40 F to 45 F
      • No containment
      • Wide variation of temperatures entering IT equipment
  29. Elevated IT Equipment Inlet Temperatures
      Why Were Low Supply Temperatures Needed?
      • Design needed to take into account massive mixing of hot air with supply air
      • Temperature of air entering IT equipment at tops of racks still acceptable
      • Cold room allowed some ride-through if cooling failed
      Why Was This Bad?
      • Wastes energy:
        – Too much airflow (low delta T)
        – Inefficient chiller operation
        – Limited economizer use
        – Unnecessary dehumidification
      • Hot spots
      • Inconsistent temperature control
      • Inconsistent humidity control
      Source: 42u.com
  30. Elevated IT Equipment Inlet Temperatures
      ASHRAE Thermal Guidelines
      • Recommended range for IT inlet conditions:
        – Temperature: 64 F to 80.6 F
        – Humidity: 41.9 F to 59 F dew point, or 60% RH
      • Extended range for other classes of IT equipment
      Source: 42u.com
  31. Elevated IT Equipment Inlet Temperatures
      Advantages of Elevated Temperatures
      • Increased equipment efficiency
        – 1.5% to 2.5% increase in chiller efficiency per degree increase in chilled water temperature
        – Increasing CHW supply temperature from 50 F to 60 F decreases chiller energy by up to 25%
        – Actual increase depends on chiller type and selection
      • Decreased unwanted dehumidification at air handling units
        – Coil temperature never gets below dew point if CHW temperature is raised
        – Eliminates condensate removal issues
      • Additional economizer hours
        – Actual advantage highly dependent on climate and system type
      • Longer equipment life
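A rough way to see how the per-degree rule of thumb compounds into the "up to 25%" figure quoted above is to apply it degree by degree. The sketch below assumes the upper 2.5%-per-degree value; as the slide notes, actual results depend on the chiller type and selection.

```python
# Compound the per-degree chiller efficiency rule of thumb from the slide
# (1.5% to 2.5% per degree F of chilled water temperature increase).

def chiller_energy_fraction(chw_temp_rise_degf, savings_per_degf=0.025):
    """Remaining chiller energy fraction after raising the CHW supply temperature."""
    return (1.0 - savings_per_degf) ** chw_temp_rise_degf

# Raising CHW supply from 50 F to 60 F at 2.5% per degree:
reduction = 1.0 - chiller_energy_fraction(10)
print(f"{reduction:.0%} chiller energy reduction")  # roughly 22%, in line with "up to 25%"
```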
  32. Air Side Economizer System: Phoenix, 60 F SA Temperature
      • Economization available for 5,038 hr/yr
      • Chillers off when outside air temperature is below 60 F (1,910 hr)
  33. Air Side Economizer System: Phoenix, 75 F SA Temperature
      • Economization available for 7,396 hr/yr
      • Huge gain in economizer hours due to dry climate
      • Chillers off when outside air temperature is below 75 F (4,294 hr)
  34. Air Side Economizer System: Charlotte, 60 F SA Temperature
      • Economization available for 5,300 hr/yr
      • More hours of full economization (3,778 hr) than Phoenix
  35. Air Side Economizer System: Charlotte, 75 F SA Temperature
      • Economization available for 5,630 hr/yr
      • Due to humid climate, increase in economizer hours is minimal
      • Chillers off when outside air temperature is below 75 F (5,300 hr)
  36. Water Side Economizer System: Phoenix, 60 F SA Temperature
      • Economization available for 3,829 hr/yr
      • Chillers off when outside air wet bulb temperature is below 33 F (55 hr)
  37. Water Side Economizer System: Phoenix, 75 F SA Temperature
      • Economization available for 8,629 hr/yr
      • Huge gain in economizer hours due to dry climate
      • Chillers off when outside wet bulb temperature is below 53 F (4,420 hr)
  38. Water Side Economizer System: Charlotte, 60 F SA Temperature
      • Economization available for 4,174 hr/yr
      • Chillers off when outside wet bulb temperature is below 33 F (1,009 hr)
  39. Water Side Economizer System: Charlotte, 75 F SA Temperature
      • Economization available for 8,334 hr/yr
      • The water side economizer has a huge increase in economizer hours because fewer hours are locked out due to outdoor air humidity
      • Chillers off when outside wet bulb temperature is below 53 F (4,513 hr)
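The hour counts on the preceding slides come from analyses of hourly weather data. Below is a hedged sketch of that kind of bin count for a water side economizer; the CSV layout, column name, and temperature thresholds are assumptions for illustration, and a real study would use TMY data and the actual approach temperatures of the cooling equipment.

```python
# Count annual hours of full and partial water side economization from hourly
# weather data, judged on outdoor wet-bulb temperature. The file and column
# name ("wet_bulb_f") are hypothetical placeholders.

import csv

def economizer_hours(weather_csv, full_econ_wb_f, partial_econ_wb_f):
    """Return (full economizer hours, partial economizer hours) for the year."""
    full_hours = partial_hours = 0
    with open(weather_csv, newline="") as f:
        for row in csv.DictReader(f):
            wb = float(row["wet_bulb_f"])
            if wb <= full_econ_wb_f:
                full_hours += 1        # chillers can be off
            elif wb <= partial_econ_wb_f:
                partial_hours += 1     # economizer offsets part of the load
    return full_hours, partial_hours

# Example (75 F supply air case, chillers off below 53 F wet bulb):
# full, partial = economizer_hours("phoenix_tmy.csv", 53.0, 70.0)
```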
  40. Elevated IT Equipment Inlet Temperatures
      Bin Data Energy Use Analysis
      • Design Criteria
        – Typical enterprise/co-location data center load: 10,000 sq ft, 200 W/sq ft, 2 MW of total IT load
        – Within ASHRAE recommended conditions: supply temperature = 75 F, return temperature = 95 F, space dew point temperature between 42 F and 59 F
        – Efficient adiabatic humidification used for analysis
  41. Elevated IT Equipment Inlet Temperatures
      Two systems and two climates analyzed and compared.
      System Options:
      1. Direct Outside Air Economizer
         • Multiple 500 kW capacity rooftop supply/exhaust AHUs
         • OA economizer control
         • Air-cooled chiller system for supplemental cooling
      2. Water-Cooled Chillers with Cooling Towers
         • High efficiency variable speed chillers (0.4 kW/ton)
         • Induced draft cooling towers
         • Plate and frame heat exchangers in series with chillers
         • CRAH units with high efficiency EC fans on raised floor
      Climates:
      1. Phoenix – Hot and dry
      2. Charlotte – Warm and humid
  42. Elevated IT Equipment Inlet Temperatures: Phoenix Energy Consumption (kWh)
      Chart of annual energy consumption by end use (chiller, CHW pump, CW pump, AHU fan, tower fan, supply fan, exhaust fan, humidification) for the two systems at four supply air temperatures. MPUE results:
      • 60°F supply air: Direct Outdoor Air 1.34, WC Chillers/CT 1.20
      • 65°F supply air: Direct Outdoor Air 1.30, WC Chillers/CT 1.17
      • 70°F supply air: Direct Outdoor Air 1.26, WC Chillers/CT 1.15
      • 75°F supply air: Direct Outdoor Air 1.22, WC Chillers/CT 1.13
  43. Elevated IT Equipment Inlet Temperatures: Charlotte Energy Consumption (kWh)
      Chart of annual energy consumption by end use (chiller, CHW pump, CW pump, AHU fan, tower fan, supply fan, exhaust fan, humidification) for the two systems at four supply air temperatures. MPUE results:
      • 60°F supply air: Direct Outdoor Air 1.24, WC Chillers/CT 1.19
      • 65°F supply air: Direct Outdoor Air 1.22, WC Chillers/CT 1.17
      • 70°F supply air: Direct Outdoor Air 1.20, WC Chillers/CT 1.15
      • 75°F supply air: Direct Outdoor Air 1.19, WC Chillers/CT 1.13
  44. Elevated IT Equipment Inlet Temperatures
      Disadvantages of Elevated Temperatures
      • Working conditions in the hot aisle
        – Hot aisle temperatures may rise above 100 F in some cases
        – OSHA requirements may come into effect
        – Think about temporary spot cooling for technology workers
      • Temperature ratings of cables and sprinkler heads in the hot aisle
        – Some cabling rated for 40 C (104 F)
      • Reduced ride-through time during cooling failures
        – Critical server temperatures can be reached in minutes or seconds in some cases
        – Good containment will help reduce hot air recirculation, though it may starve servers if system airflow is interrupted
  45. Elevated IT Equipment Inlet Temperatures
      Conclusions:
      • Increasing IT inlet temperatures can substantially reduce overall energy use by:
        – Increasing chiller efficiency (a 10-degree rise can increase efficiency up to 25%)
        – Reducing humidification requirements
        – Greatly increasing economizer hours
      • Be careful of very high temperature conditions in the hot aisles affecting worker comfort and equipment ratings
      • Advantages are highly dependent on climate and system type
        – Look at the psychrometric chart for economizer and lock-out hours; air side and water side economizer systems will be affected differently
  46. Partial Loads: Bill Kosik, PE, CEM, BEMP, LEED AP BD+C, HP Enterprise Business, Technology Services, Chicago, Ill.
  47. PUE Sensitivity to Low IT Loads: How running IT equipment at partial load affects data center energy efficiency.
  48. Efficiency Through Modularity
      • Multiple systems allow for growth without over-provisioning
      • Modularity lowers fan energy and increases compressor effectiveness
      • Modularity is not the same as spreading the load across multiple pieces of equipment
  49. Efficiency Through Modularity
      Electrical losses will increase as the IT load decreases. This increase must be included in the cooling load at different loading points.
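A simplified, hypothetical model of this effect: UPS and distribution efficiency drops at low load, the extra losses show up as additional cooling load, and fixed overheads are spread over less IT energy, so PUE climbs. The efficiency curve, cooling COP, and fixed overhead below are made-up illustration values, not measured data for any particular system.

```python
# Crude single-node model of PUE sensitivity to partial IT load.
# The UPS efficiency curve and other inputs are illustrative assumptions.

UPS_EFFICIENCY = {1.00: 0.95, 0.75: 0.945, 0.50: 0.93, 0.25: 0.90, 0.10: 0.80}

def partial_load_pue(it_load_fraction, design_it_kw=2000.0, cooling_cop=4.0,
                     fixed_overhead_kw=60.0):
    """Estimate PUE at a given IT load fraction."""
    it_kw = design_it_kw * it_load_fraction
    ups_eff = UPS_EFFICIENCY[it_load_fraction]
    electrical_loss_kw = it_kw / ups_eff - it_kw             # UPS/distribution losses
    cooling_kw = (it_kw + electrical_loss_kw) / cooling_cop  # losses become heat to reject
    total_kw = it_kw + electrical_loss_kw + cooling_kw + fixed_overhead_kw
    return total_kw / it_kw

for frac in (1.00, 0.50, 0.25, 0.10):
    print(f"IT load {frac:.0%}: PUE ~ {partial_load_pue(frac):.2f}")
```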
  50. Electrical System Topology and System Efficiency
  51. As the cooling load is distributed over an increasing number of chillers, the overall power (and energy) grows. To maintain the highest efficiency, the chillers should be run as close as possible to their peak efficiency point.
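A minimal sketch of that staging decision, using a hypothetical part-load efficiency table: choose the number of running chillers that keeps each machine near its best kW/ton point rather than spreading the load thinly across all of them.

```python
# Pick how many chillers to run so the plant operates near its best kW/ton.
# The part-load efficiency table and chiller capacity are illustrative only.

KW_PER_TON = {0.25: 0.62, 0.50: 0.48, 0.75: 0.42, 1.00: 0.45}  # hypothetical chiller

def plant_kw(load_tons, n_chillers, chiller_capacity_tons=500.0):
    """Total chiller power if the load is shared equally by n running chillers."""
    plr = load_tons / (n_chillers * chiller_capacity_tons)
    if plr > 1.0:
        return float("inf")  # not enough capacity online
    nearest = min(KW_PER_TON, key=lambda p: abs(p - plr))  # crude table lookup
    return load_tons * KW_PER_TON[nearest]

load = 900.0  # tons
best = min(range(1, 5), key=lambda n: plant_kw(load, n))
print(best, plant_kw(load, best))  # fewest kW wins, not the most chillers running
```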
  52. Trends in Server Turn-Down Ratio: Servers Are More Efficient but Use More Power
  53. Server Modularity
      • 45 hot-plug cartridges: compute, storage, or combination; x86, ARM, or accelerator
      • Single-server = 45 servers per chassis
      • Quad-server = 180 servers per chassis (future capability); that is 1,800 servers per cabinet or 45 kW
      • Dual low-latency switches: switch module (45 x 1 GB downlinks)
  54. The PUE values are predicted using data center energy use simulation techniques. Many assumptions are made which affect the predicted energy use and PUE. These ranges are meant to be indicators of the PUE envelope that might be expected based on sub-system efficiency levels, geographic location, and methods of operation. Detailed energy use simulation is required to develop more granular and accurate analyses.
      Input data for "high" PUE case (Singapore, SGP):
      • Use water economizer: NO
      • Use adiabatic cooling: NO
      • Lighting (W/sq ft): 1.50
      • Misc power (% of IT): 6.0%
      • Electrical system loss (%): 10.0%
      • Air-cooled evap temp (°F): 65.0
      • Fan pressure: 2.0
      Input data for "low" PUE case (Helsinki, FIN):
      • Use water economizer: YES
      • Use adiabatic cooling: YES
      • Lighting (W/sq ft): 1.00
      • Misc power (% of IT): 4.0%
      • Electrical system loss (%): 8.5%
      • Air-cooled evap temp (°F): 65.0
      • Fan pressure: 2.0
  55. Codes and Standards References from Today's Webcast
      • HVAC: ASHRAE 90.1: Energy Standard for Buildings Except Low-Rise Residential Buildings
      • HVAC: ASHRAE 62.1, 62.2, and Air Movement
      • International Energy Conservation Code
      • U.S. Green Building Council LEED v4
      • California Building Energy Efficiency Standards: Title 24
  56. Presenters: Bill Kosik, PE, CEM, BEMP, LEED AP BD+C, HP Enterprise Business, Technology Services, Chicago, Ill.; Tom R. Squillo, PE, LEED AP, Environmental Systems Design Inc., Chicago, Ill. Moderator: Jack Smith, Consulting-Specifying Engineer and Pure Power, CFE Media, LLC
  57. Thanks to Today's Webcast Sponsors:
  58. Webcasts and Research
      • Modular data center design
      • HVAC: ASHRAE 62.1, 62.2, and Air Movement
      • 2013 HVAC, BAS state of the industry report
  59. HVAC for Data Centers. Sponsored by: Join the discussion about this Webcast on Twitter at #CSEdatacenterHVAC
