US Trends in Data Centre Design with NREL Examples of Large Energy Savings
Understanding and Minimising the Costs of Data Centre Based IT Services Conference, University of Liverpool
Otto Van Geet, PE
June 17, 2013
NREL is a national laboratory of the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, operated by the Alliance for Sustainable Energy, LLC.
20"5"10"15"20"25"30"35"40"1.00"1.03"1.06"1.09"1.12"1.15"1.18"1.21"1.24"1.27"1.30"1.33"1.36"1.39"1.42"1.45"1.48"1.51"1.54"1.57"1.60"1.63"1.66"1.69"1.72"1.75"1.78"1.81"1.84"1.87"1.90"1.93"Cost%in%Millions%of%Dollars%P.U.E.%Total%Annual%Electrical%Cost:%Compute%+%Facility%2Assume ~20MW HPC system & $1M per MW year utility costFacilityHPCCost and Infrastructure Constraints
BPG Table of Contents
• Summary
• Background
• Information Technology Systems
• Environmental Conditions
• Air Management
• Cooling Systems
• Electrical Systems
• Other Opportunities for Energy Efficient Design
• Data Center Metrics & Benchmarking
Environmental Conditions
Data center equipment's environmental conditions should fall within the ranges established by ASHRAE as published in the Thermal Guidelines book.
Environmental Specifications (°C, at equipment intake) – ASHRAE (2008), (2011):
• Temperature, recommended: 18°C – 27°C
• Temperature, allowable: 15°C – 32°C (A1) up to 5°C – 45°C (A4)
• Humidity, recommended: 5.5°C dew point to 60% RH and 15°C dew point
• Humidity, allowable: 20% – 80% RH
2011 ASHRAE Allowable Ranges – Dry Bulb Temperature
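As a rough illustration of how the intake specifications above might be applied in monitoring, the hypothetical helper below classifies a measured inlet condition against the recommended and A1 allowable dry-bulb ranges quoted on the previous slide. It is a simplified sketch: the dew-point limits of the recommended envelope are not checked.

```python
# Classify an IT equipment inlet condition against the ASHRAE ranges quoted above:
# recommended 18-27 C (<= 60% RH), A1 allowable 15-32 C and 20-80% RH.
# Simplified sketch: dew-point limits from the recommended envelope are omitted.

def classify_inlet(temp_c: float, rh_percent: float) -> str:
    if 18.0 <= temp_c <= 27.0 and rh_percent <= 60.0:
        return "recommended (dew-point limits not checked)"
    if 15.0 <= temp_c <= 32.0 and 20.0 <= rh_percent <= 80.0:
        return "allowable (A1)"
    return "outside A1 allowable"

print(classify_inlet(24.0, 45.0))   # recommended
print(classify_inlet(30.0, 35.0))   # allowable (A1)
print(classify_inlet(38.0, 35.0))   # outside A1 allowable
```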
Estimated Savings
Baseline system: DX cooling with no economizer; load of 1 ton of cooling, constant year-round; efficiency (COP) of 3; total energy 10,270 kWh/yr.
Hours and energy by cooling zone, recommended range vs. allowable range:
• Zone 1: DX cooling only – 25 h / 8 kWh (recommended); 2 h / 1 kWh (allowable)
• Zone 2: Multistage indirect evap. + DX (H80) – 26 h / 16 kWh; 4 h / 3 kWh
• Zone 3: Multistage indirect evap. only – 3 h / 1 kWh; 0 h / 0 kWh
• Zone 4: Evap. cooler only – 867 h / 97 kWh; 510 h / 57 kWh
• Zone 5: Evap. cooler + outside air – 6,055 h / 417 kWh; 1,656 h / 99 kWh
• Zone 6: Outside air only – 994 h / 0 kWh; 4,079 h / 0 kWh
• Zone 7: 100% outside air – 790 h / 0 kWh; 2,509 h / 0 kWh
• Total – 8,760 h / 538 kWh; 8,760 h / 160 kWh
• Estimated savings vs. baseline – 95% (recommended); 98% (allowable)
Data Center Efficiency Metric
• Power Usage Effectiveness (PUE) is an industry standard data center efficiency metric.
• It is the ratio of power used or lost by data center facility infrastructure (pumps, lights, fans, conversions, UPS, …) plus compute power to the power used by compute:
  PUE = ("IT power" + "Facility power") / "IT power"
• Not perfect; some folks play games with it.
• A 2011 survey estimates the industry average is 1.8.
• In a typical data center, nearly half of the power goes to things other than compute capability.
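For concreteness, a direct reading of the formula with illustrative (not measured) numbers:

```python
def pue(it_kw: float, facility_kw: float) -> float:
    """PUE = (IT power + facility power) / IT power."""
    return (it_kw + facility_kw) / it_kw

print(pue(1000.0, 800.0))  # 1.8  -> the survey-average case quoted above
print(pue(1000.0, 60.0))   # 1.06 -> the NREL ESIF level quoted later in the deck
```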
[Chart: Data center PUE versus outdoor temperature (°F), roughly -20°F to 100°F on the x-axis and PUE 0.75 to 1.45 on the y-axis.]
"I am re-using waste heat from my data center on another part of my site and my PUE is 0.8!"
ASHRAE and friends (DOE, EPA, TGG, 7x24, etc.) do not allow reused energy in PUE, and PUE is always > 1.0. The Green Grid has developed another metric for this purpose: ERE, Energy Reuse Effectiveness.
http://www.thegreengrid.org/en/Global/Content/white-papers/ERE
ERE – Adds Energy Reuse
[Diagram: energy flow from the utility through cooling, UPS, and PDU to the IT load, with rejected energy streams (a)–(f) and reused energy (g) called out.]
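The Green Grid's ERE subtracts reused energy from the total before dividing by IT energy, so reuse can legitimately push the metric below 1.0 without distorting PUE. A small sketch with illustrative annual numbers (not figures from the referenced white paper):

```python
def pue(it_kwh: float, facility_kwh: float) -> float:
    return (it_kwh + facility_kwh) / it_kwh

def ere(it_kwh: float, facility_kwh: float, reused_kwh: float) -> float:
    """ERE = (total energy - reused energy) / IT energy (Green Grid definition)."""
    return (it_kwh + facility_kwh - reused_kwh) / it_kwh

# Illustrative annual energy figures (kWh), not measured data:
it, facility, reused = 8_000_000, 500_000, 2_400_000

print(f"PUE = {pue(it, facility):.2f}")           # stays above 1.0 by definition
print(f"ERE = {ere(it, facility, reused):.2f}")   # can drop below 1.0 with reuse
```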
DOE/NREL Research Support Facility
• More than 1,300 people in DOE office space on NREL's campus
• 33,445 m²
• Design/build process with required energy goals
  – 50% energy savings from code
  – LEED Platinum
• Replicable
  – Process
  – Technologies
  – Cost
• Site, source, carbon, cost ZEB:B
  – Includes plug loads and the data center
• Firm fixed price – US $22.8/m² construction cost (not including $2.5/m² for PV from PPA/ARRA)
• Opened June 10, 2010 (first phase)
Credit: Haselden Construction
RSF Data Center
• Fully contained hot aisle
  – Custom aisle floor and door seals
  – Ensure equipment is designed for cold aisle containment and installed to pull cold air, not hot air
  – 1.18 annual PUE
  – ERE = 0.9
• Control the hot aisle based on a return temperature of ~90°F (about 32°C).
• Waste heat used to heat the building.
• Outside air and evaporative cooling.
• Low fan energy design.
• 176 m²
Credit: Marjorie Schott/NREL
Data Center Load GROWTH (40+ kW in 2 years) since there is NO recharge (energy is not billed back to users)!
Move to Liquid Cooling
• Server fans are inefficient and noisy.
  – Liquid doors are an improvement, but we can do better!
• Power densities are rising, making component-level liquid cooling solutions more appropriate.
• Liquid benefits
  – Thermal stability, reduced component failures.
  – Better waste heat re-use options.
  – Warm water cooling, reduce/eliminate condensation.
  – Provide cooling with higher temperature coolant.
• Eliminate expensive and inefficient chillers.
• Save wasted fan energy and use it for computing.
• Unlock your cores and overclock to increase throughput!
Liquid Cooling – Overview
Water and other liquids (dielectrics, glycols, and refrigerants) may be used for heat removal.
• Liquids typically use LESS transport energy (for example, a 14.36 air-to-water horsepower ratio; see the sketch below).
• Liquid-to-liquid heat exchangers have closer approach temperatures than liquid-to-air (coils), yielding increased outside-air hours.
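The "less transport energy" point comes down to volumetric heat capacity: water carries a few thousand times more heat per unit volume per degree than air, so far less volume has to be moved for the same load. A back-of-the-envelope sketch with assumed, round property values (not the figures from the slide's missing example):

```python
# Volume flow needed to move 100 kW of heat with a 10 K temperature rise,
# comparing air and water. Property values are round textbook numbers.
Q_KW = 100.0
DELTA_T_K = 10.0
RHO_CP_AIR = 1.2        # kJ/(m^3*K)  (~1.2 kg/m^3 * ~1.0 kJ/(kg*K))
RHO_CP_WATER = 4180.0   # kJ/(m^3*K)  (~1000 kg/m^3 * ~4.18 kJ/(kg*K))

air_flow = Q_KW / (RHO_CP_AIR * DELTA_T_K)      # m^3/s of air
water_flow = Q_KW / (RHO_CP_WATER * DELTA_T_K)  # m^3/s of water

print(f"Air:   {air_flow:.2f} m^3/s")
print(f"Water: {water_flow * 1000:.2f} L/s")
print(f"Volume ratio (air/water): {air_flow / water_flow:.0f}x")
```

Actual fan-versus-pump power also depends on pressure drops and equipment efficiencies, which is where ratios like the 14.36 horsepower figure come from.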
2011 ASHRAE Liquid Cooling Guidelines
NREL ESIF HPC (HP hardware) uses 24°C supply and 40°C return water (liquid cooling class W4/W5).
NREL HPC Data Center – High Performance Computing Showcase Facility
• 10 MW, 929 m²
• Leverage favorable climate
• Use direct water-to-rack cooling
• DC manager responsible for ALL DC costs, including energy!
• Waste heat captured and used to heat labs and offices
• World's most energy efficient data center, PUE 1.06!
• Lower CapEx and OpEx
• Leveraged expertise in energy efficient buildings to focus on a showcase data center: a "chips to bricks" approach
• Operational January 2013; petascale+ HPC capability in August 2013
• 20-year planning horizon
  – 5 to 6 HPC generations
Critical Data Center Specs
• Warm water cooling, 24°C
  – Water is a much better working fluid than air: pumps trump fans.
  – Utilize high quality waste heat, 40°C or warmer.
  – 90%+ of IT heat load to liquid.
• High power distribution
  – 480 VAC; eliminate conversions.
• Think outside the box
  – Don't be satisfied with an energy efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings.
  – Innovate, integrate, optimize.
• Dashboards report instantaneous, seasonal, and cumulative PUE values.
NREL ESIF Data Center Cross Section
• The data center equivalent of the "visible man"
  – Reveal not just boxes with blinky lights, but the inner workings of the building as well.
  – Tour views into the pump room and mechanical spaces.
  – Color-coded pipes, LCD monitors.
Data Center
• 2.5 MW day-one capacity (utility cost $500K/yr/MW)
• 10 MW ultimate capacity
• Petaflop
• No vapor compression for cooling
Data Center – Summer Cooling Mode
• PUE, typical data center = 1.5 – 2.0
• NREL ESIF = 1.04*
• ~30% more energy efficient than your typical "green" data center
Data Center – Winter Cooling Mode
• ERE, Energy Reuse Effectiveness: how efficiently are we using the waste heat to heat the rest of the building?
• NREL ESIF ERE = 0.7 (we use 30% of the waste heat; more with future campus loops)
[Diagram: data center waste heat feeding the high bay, office, and conference heating loops, plus a future campus heating loop.]
Data Center – Cooling Strategy
• Water-to-rack cooling for high performance computers handles 90% of the total load.
• Air cooling for legacy equipment handles 10% of the total load.
[Diagram: air streams labeled 75°F and 95°F.]
PUE 1.0X – Focus on the "1"
True efficiency requires 3-D optimization across Facility PUE, IT Power Consumption, and Energy Re-use:
• Facility PUE – we all know how to do this!
• IT Power Consumption – increased work per watt; reduce or eliminate fans; component-level heat exchange; newest processors are more efficient.
• Energy Re-use – direct liquid cooling; higher return water temperatures; holistic view of data center planning.
What's Next?
✓ Energy efficient supporting infrastructure
  ✓ Pumps, large pipes, high voltage (380 to 480 V) electrical to the rack
✓ Efficient HPC for the planned workload
✓ Capture and re-use waste heat
Can we manage and "optimize" workflows, with a varied job mix, within a given energy "budget"? Can we do this as part of a larger "ecosystem"?
Steve Hammond
Other Factors
• DemandSMART, comprehensive demand response: balancing supply and demand on the electricity grid is difficult and expensive; end users that provide a balancing resource are compensated for the service.
[Chart: annual electricity demand as a percent of available capacity (25% – 100%) over winter, spring, summer, and fall, with a 90% level marked.]
• 4 MW solar
• Use waste heat
• Better rates, shed load
• DC as part of the campus energy system
Parting Thoughts
• Energy efficient data centers – been there, done that.
  – We know how; let's just apply best practices.
  – Don't fear H2O: liquid cooling will be increasingly prevalent.
• Metrics will lead us into sustainability.
  – If you don't measure/monitor it, you can't manage it.
  – As PUE has done, ERE, Carbon Usage Effectiveness (CUE), etc. will help drive sustainability.
• Energy efficient and sustainable computing – it's all about the "1".
  – 1.0 or 0.06? Where do we focus? Compute and energy reuse.
• Holistic approaches to energy management.
  – Lots of open research questions.
  – Projects may get an energy allocation rather than a node-hour allocation.
QUESTIONS?
Otto Van Geet
303.384.7369
Otto.VanGeet@nrel.gov
NREL RSF: 50% of code energy use, net zero annual energy, $22.8/m² construction cost
Water Considerations
"We shouldn't use evaporative cooling, water is scarce."
• Thermoelectric power generation (coal, oil, natural gas, and nuclear) consumes about 1.1 gallons per kWh, on average.
• This amounts to about 9.6M gallons per MW-year (the arithmetic is sketched below).
• We estimate about 2.5M gallons of water consumed per MW-year for on-site evaporative cooling towers at NREL.
• If chillers need 0.2 MW per MW of HPC power, then chillers have an impact of 2.375M gallons per year per MW.
• Actuals will depend on your site, but evaporative cooling doesn't necessarily result in a net increase in water use.
• Low energy use = lower water use. Energy reuse uses NO water!
NREL PIX 00181
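The first two bullets are a single multiplication: 1.1 gallons per kWh times the 8,760,000 kWh in one MW-year. The sketch below reproduces it and compares the result with the on-site evaporative cooling estimate; all figures are the slide's averages, not site measurements.

```python
# Water embedded in grid electricity vs. on-site evaporative cooling,
# using the per-MW-year averages quoted on the slide.
GAL_PER_KWH_THERMOELECTRIC = 1.1
KWH_PER_MW_YEAR = 1000.0 * 8760.0          # 8,760,000 kWh

grid_water = GAL_PER_KWH_THERMOELECTRIC * KWH_PER_MW_YEAR   # ~9.6M gal/MW-yr
evap_tower_water = 2.5e6                                    # slide's NREL estimate

print(f"Grid water per MW-year: {grid_water / 1e6:.1f} M gallons")
print(f"On-site evap. cooling:  {evap_tower_water / 1e6:.1f} M gallons")
```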
Data Center Efficiency
• Choices regarding power, packaging, cooling, and energy recovery in data centers drive TCO.
• Why should we care?
  – Carbon footprint.
  – Water usage.
  – Mega$ per MW-year.
  – Cost: OpEx ~ IT CapEx!
• A less efficient data center takes away power and dollars that could otherwise be used for compute capability.
Holistic Thinking
• Approach to cooling: air vs. liquid, and where?
  – Components, liquid doors, or CRACs, …
• What is your "ambient" temperature?
  – 55°F, 65°F, 75°F, 85°F, 95°F, 105°F …
  – 13°C, 18°C, 24°C, 30°C, 35°C, 40.5°C …
• Electrical distribution:
  – 208 V or 480 V?
• "Waste" heat:
  – How hot? Liquid or air? Throw it away or use it?
Liquid Cooling – New Considerations
• Air cooling concerns:
  – Humidity
  – Fan failures
  – Air-side economizers, particulates
• Liquid cooling concerns:
  – pH and bacteria
  – Dissolved solids
  – Corrosion inhibitors, etc.
• When considering liquid cooled systems, insist that providers adhere to the latest ASHRAE water quality spec, or it could be costly.
2011 ASHRAE Thermal Guidelines
2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance. White paper prepared by ASHRAE Technical Committee TC 9.9.
Energy Savings Potential: Economizer Cooling
[Figure: Energy savings potential for the recommended envelope, Stage 1: Economizer cooling. (Source: Billy Roberts, NREL)]
Data Center Energy
• Data centers are energy intensive facilities.
  – 10–100x more energy intensive than an office.
  – Server racks well in excess of 30 kW.
  – Power and cooling constraints in existing facilities.
• Data center inefficiency steals power that would otherwise support compute capability.
• It is important to have a DC manager responsible for ALL DC costs, including energy!
Energy Savings Potential: Economizer + Direct Evaporative Cooling
[Figure: Energy savings potential for the recommended envelope, Stage 2: Economizer + direct evaporative cooling. (Source: Billy Roberts, NREL)]
Energy Savings Potential: Economizer + Direct Evap. + Multistage Indirect Evap. Cooling
[Figure: Energy savings potential for the recommended envelope, Stage 3: Economizer + direct evap. + multistage indirect evaporative cooling. (Source: Billy Roberts, NREL)]
Data Center Energy Efficiency
• ASHRAE 90.1-2011 requires an economizer in most data centers.
• ASHRAE Standard 90.4P, Energy Standard for Data Centers and Telecommunications Buildings
  – PURPOSE: To establish the minimum energy efficiency requirements of data centers and telecommunications buildings for design, construction, and a plan for operation and maintenance.
  – SCOPE: This standard applies to new data centers and telecommunications buildings, new additions, and modifications to them or portions thereof and their systems.
  – Will set a minimum PUE based on climate.
• More detail at: https://www.ashrae.org/news/2013/ashrae-seeks-input-on-revisions-to-data-centers-in-90-1-energy-standard-scope
Energy Conservation Measures
1. Reduce the IT load – virtualization and consolidation (up to 80% reduction).
2. Implement a contained hot aisle and cold aisle layout.
   – Curtains, equipment configuration, blank panels, cable entrance/exit ports.
3. Install an economizer (air or water) and evaporative cooling (direct or indirect).
4. Raise the discharge air temperature. Install VFDs on all computer room air conditioning (CRAC) fans (if used) and network the controls.
5. Reuse data center waste heat if possible.
6. Raise the chilled water (if used) set point.
   – Increasing the chilled water temperature by 1°C reduces chiller energy use by about 3%.
7. Install high efficiency equipment, including UPS, power supplies, etc.
8. Move chilled water as close to the server as possible (direct liquid cooling).
9. Consider a centralized high efficiency water-cooled chiller plant.
   – Air-cooled ≈ 2.9 COP; water-cooled ≈ 7.8 COP (a quick comparison sketch follows this list).
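Two of the rules of thumb above lend themselves to quick arithmetic: the air-cooled versus water-cooled COP comparison in item 9, and the ~3% chiller energy reduction per 1°C of chilled water setpoint increase in item 6. A sketch with an assumed, illustrative 500 kW constant cooling load (the load value is not from the slide, and the 3%/°C rule is applied as a simple linear estimate):

```python
# Rules of thumb from the list above, applied to an assumed 500 kW cooling load.
LOAD_KW = 500.0
HOURS_PER_YEAR = 8760.0

def chiller_kwh(load_kw: float, cop: float) -> float:
    """Annual chiller electricity for a constant cooling load at a given COP."""
    return load_kw / cop * HOURS_PER_YEAR

air_cooled = chiller_kwh(LOAD_KW, 2.9)     # item 9: air-cooled plant
water_cooled = chiller_kwh(LOAD_KW, 7.8)   # item 9: water-cooled plant
print(f"Air-cooled:   {air_cooled / 1e6:.2f} GWh/yr")
print(f"Water-cooled: {water_cooled / 1e6:.2f} GWh/yr")

# Item 6: ~3% chiller energy savings per +1 C chilled-water setpoint (linear estimate).
for delta_c in (1, 2, 5):
    print(f"+{delta_c} C setpoint: ~{water_cooled * 0.03 * delta_c:,.0f} kWh/yr saved")
```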
Equipment Environmental Specification
• The air inlet to IT equipment is the important specification to meet.
• Outlet temperature is not important to IT equipment.
Key Nomenclature
• Recommended Range (statement of reliability): preferred facility operation; most values should be within this range.
• Allowable Range (statement of functionality): robustness of equipment; no values should be outside this range.
[Diagram: rack intake temperature bands from min allowable through the recommended range to max allowable, with under-temp and over-temp regions outside the recommended range.]
Improve Air Management
• Typically, more air is circulated than required.
• Air mixing and short circuiting leads to:
  – Low supply temperature
  – Low delta T (see the airflow sketch below)
• Use hot and cold aisles.
• Improve isolation of hot and cold aisles.
  – Reduce fan energy
  – Improve air-conditioning efficiency
  – Increase cooling capacity
A hot aisle/cold aisle configuration decreases mixing of intake and exhaust air, promoting efficiency.
Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
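The "low delta T" problem can be quantified with the sensible heat relation Q = ρ · cp · V̇ · ΔT: for a fixed heat load, halving the delta T doubles the required airflow, and fan affinity laws make the fan power penalty worse still. A small sketch using round air properties and an assumed 100 kW room load (the load value is illustrative):

```python
# Airflow required for a 100 kW sensible load at different supply/return delta T.
# Q = rho * cp * V * dT  ->  V = Q / (rho * cp * dT). Round air properties.
Q_KW = 100.0
RHO_AIR = 1.2     # kg/m^3
CP_AIR = 1.005    # kJ/(kg*K)

def airflow_m3s(delta_t_k: float) -> float:
    return Q_KW / (RHO_AIR * CP_AIR * delta_t_k)

for dt in (6.0, 11.0, 17.0):  # roughly 10 F, 20 F, 30 F of delta T
    flow = airflow_m3s(dt)
    print(f"dT = {dt:4.0f} K -> {flow:5.1f} m^3/s ({flow * 2118.9:,.0f} CFM)")
```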
Isolate Cold and Hot Aisles
• Cold aisle supply: 70–80°F (vs. 45–55°F without isolation).
• Hot aisle return: 95–105°F (vs. 60–70°F without isolation).
Source: http://www1.eere.energy.gov/femp/pdfs/eedatacenterbestpractices.pdf
Adding Air Curtains for Hot/Cold Isolation
Photo used with permission from the National Snow and Ice Data Center.
http://www.nrel.gov/docs/fy12osti/53939.pdf
Courtesy of Henry Coles, Lawrence Berkeley National Laboratory
"Chill-off 2" Evaluation of Close-coupled Cooling Solutions
Courtesy of Geoffrey Bell and Henry Coles, Lawrence Berkeley National Laboratory
[Chart: relative energy use of close-coupled cooling solutions; arrow indicates less energy use.]
Cooling Takeaways
• Use a central plant (e.g., chiller/CRAHs) vs. CRAC units.
• Use centralized controls on CRAC/CRAH units to prevent simultaneous humidifying and dehumidifying.
• Move to liquid cooling (room, row, rack, chip).
• Consider VSDs on fans, pumps, chillers, and towers.
• Use air- or water-side free cooling.
• Expand the humidity range and improve humidity control (or disconnect).