SDSC-McGill study on how relocating data centers dramatically reduces costs

1. Greening Data Centers. Dallas Thornton, SDSC Division Director, Cyberinfrastructure Services. March 2, 2011.
2. Data Centers Are Enormous Users of Power
   [Chart: annual electricity use in terawatt-hours per year, comparing US data centers (125 and 61 TWh/yr) with US televisions (248 million units, 27 TWh/yr).]
   Sources: Report to Congress on Server and Data Center Energy Efficiency, Public Law 109-431; U.S. Environmental Protection Agency ENERGY STAR Program, August 2, 2007; Kaufman, Ron. Television's Hidden Agenda. TurnOffYourTV.com, 2004.
3. Measuring Data Center Facility Efficiency
   • The most common measure is Power Usage Effectiveness (PUE):
     PUE = [Total Data Center Electrical Load] / [Data Center IT Equipment Electrical Load]
   Source: The Green Grid
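A minimal sketch of that ratio, assuming hypothetical meter readings (the function name and sample values are illustrative, not from the slide):

    # PUE sketch: total facility load divided by IT equipment load.
    # The sample readings below are hypothetical, for illustration only.
    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power Usage Effectiveness = total facility power / IT equipment power."""
        if it_equipment_kw <= 0:
            raise ValueError("IT load must be positive")
        return total_facility_kw / it_equipment_kw

    # Example: a facility drawing 1,350 kW to deliver 1,000 kW to IT gear.
    print(round(pue(1350.0, 1000.0), 2))  # -> 1.35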
4. PUE Tabletop Reference
   PUE    Level of Efficiency
   3.0    Very Inefficient
   2.5    Inefficient
   2.0    Average
   1.5    Efficient
   1.2    Very Efficient
   1.0    Ideal
   • Typical server rooms (roughly PUE 3.0 to 2.0): from office conversions (worst) to basic hot/cold aisle legacy data centers (better)
   • Optimized data centers (roughly PUE 1.5 to 1.2): hot/cold aisle containment, HVAC throttling based on loads, and high-efficiency UPSes
   • Greenfield design in Canada (approaching PUE 1.0): all of the above plus innovative climate-leveraging technologies and designs (SDSC/McGill University joint data center design)
   Sources: Green Grid, 2008; UC NAM Data Center Audit, 2009; UCSD/SDSC NAM Data Center Audit, 2010
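A small helper that maps a measured PUE onto the bands in the table above (the band boundaries come straight from the slide; the function name is illustrative):

    # Classify a measured PUE against the slide's reference bands.
    def efficiency_level(pue_value: float) -> str:
        bands = [
            (1.0, "Ideal"),
            (1.2, "Very Efficient"),
            (1.5, "Efficient"),
            (2.0, "Average"),
            (2.5, "Inefficient"),
        ]
        for upper_bound, label in bands:
            if pue_value <= upper_bound:
                return label
        return "Very Inefficient"

    print(efficiency_level(1.35))  # -> "Efficient"
    print(efficiency_level(2.8))   # -> "Very Inefficient"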
5. SDSC Data Center Overview
   • ~19,000 sq. ft., 13 MW of on-site power
   • Regional co-location data center for UC system
     • 100+ projects from 6 campuses
     • Energy-efficient alternative to server closets, offices, etc.
   • Home of SD-NAP
     • Many 10 Gb and 1 Gb connections to other organizations and networks: CENIC, Cox, Time Warner, Salk Institute, Scripps Research Institute, SDSC, etc.
6. Optimizing Features
   • Aisle Thermal Containment
     • 15° ΔT from top to bottom of rack → 1° ΔT
     • 10° to 15° increase in return temperatures
     • Cold aisle and hot aisle options
     • Fire code considerations
7. Optimizing Features (Cont.)
   • Increased Supply Temperatures
     • Move to near the top of the ASHRAE spec (80° F)
     • Drives AHU return temperatures higher, allowing more cooling from chilled water
   • VFD Fans on AHUs
     • Allows for fan energy savings, if accurate controls can be put in place (see the sketch below)
   • Adaptive Controls
     • Address redundancy and inefficient cooling
     • Allow 'big picture' control of cooling, throttling based on real-time loads
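A rough sketch of why VFD fans pay off, using the standard fan affinity laws (fan power scales roughly with the cube of fan speed); the baseline power and speed fractions are illustrative assumptions, not figures from the slide:

    # Fan affinity law sketch: power ~ speed^3, so throttling AHU fans with
    # VFDs saves far more energy than the speed reduction alone suggests.
    # Baseline fan power and speed fractions below are hypothetical.
    def fan_power_kw(baseline_kw: float, speed_fraction: float) -> float:
        """Approximate fan power at a reduced speed (affinity law: P ~ N^3)."""
        return baseline_kw * speed_fraction ** 3

    baseline = 50.0  # kW at full speed (illustrative)
    for frac in (1.0, 0.9, 0.8, 0.7):
        print(f"{frac:.0%} speed -> {fan_power_kw(baseline, frac):5.1f} kW")
    # Running at 80% speed draws roughly half the full-speed fan power,
    # which is why accurate load-based controls matter.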
8. Optimizing Features (Cont.)
   • Rack Blanking Panels
     • Cost-effective solutions: Coroplast
   • Floor Brushes
     • Conveyor belt brush: sold in varying lengths
   • Efficient Electrical Systems
     • 480V/277V or (even better) 400V/240V power
     • Efficient UPS and generator configurations
9. SDSC/McGill Data Center Conceptual Design
   • Goal: most efficient Class One data center in North America
   • Optimize cooling systems for the Quebec climate
     • Evaporative free cooling: primary cooling
     • Seasonal ice storage: top-up cooling
     • No compressor-based cooling
   • 1.06 PUE means UC could achieve full CapEx recovery in less than 10 years with energy cost savings
   • Lower-cost, green hydro power: $0.045/kWh vs. $0.08-$0.15/kWh in California
   • Design funded by grants from the Canada-California Strategic Innovation Partnerships (CCSIP) and CLUMEQ
10. Free Cooling Analysis with 65°F CHWS
    [Psychrometric chart: humidity ratio (lbs H2O per lb dry air) vs. dry bulb temperature (°F), showing full free cooling for 7,228 hrs/yr, partial free cooling for 1,380 hrs/yr, and auxiliary cooling for 152 hrs/yr.]
    Data source: Government of Canada, National Climate Data & Information Archive; data set WMO #71627, Montreal/Pierre Elliott Trudeau Airport, typical year; elevation 118 feet; air pressure 14.633 psia.
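A quick check of what those hour counts mean as a share of the year (the hour figures come from the chart; only the arithmetic is added):

    # Share of the year covered by each cooling mode, using the hour counts
    # shown on the free-cooling chart (8,760 hours in a non-leap year).
    hours = {
        "full free cooling": 7228,
        "partial free cooling": 1380,
        "auxiliary cooling": 152,
    }
    total = sum(hours.values())  # 8,760
    for mode, h in hours.items():
        print(f"{mode}: {h} hrs/yr ({h / total:.1%})")
    # Full free cooling covers roughly 82.5% of the year; auxiliary cooling under 2%.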
11. Supplemental Cooling: Seasonal Ice Storage Pond System
12. Supplemental Cooling: Seasonal Ice Storage Pond System
13. Backup
    • Pay for rental chillers only when (if) you ever need them
    • Design for portable chillers to connect in an emergency
14. Results
    • Load split: 10% air cooled, 90% water cooled
    • Supply temperatures: air cooled 29.4°C (85.0°F); water cooled 23.9°C (75.0°F)
    • Hours of free cooling: 8,532 hrs/yr (97% of the year)
    • PUE at extreme weather (wet bulb = 68.7°F): 1.06
    • Annual energy use: 74,543,000 kWh/yr; energy cost at $0.058/kWh: $4,323,000
    • Mechanical cooling needed: 228 tons additional load; 0 carry-over hours per year
    • Water usage (evaporation + blowdown): 33,200,000 + 8,100,000 gallons; cost at $5.52 per 1,000 gallons: $228,000
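As a sanity check on the cost figures above (the quantities and rates are taken from the results; only the arithmetic is added):

    # Sanity check of the results slide: annual energy cost and water cost.
    annual_kwh = 74_543_000
    energy_rate = 0.058            # $/kWh, from the slide
    print(f"Energy cost: ${annual_kwh * energy_rate:,.0f}")    # ~ $4,323,000

    water_gallons = 33_200_000 + 8_100_000
    water_rate = 5.52 / 1_000      # $ per gallon ($5.52 per 1,000 gal)
    print(f"Water cost:  ${water_gallons * water_rate:,.0f}")  # ~ $228,000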
15. Potential Facility-Related Cost Savings
    Assumptions: 5 MW IT load, 24x7 operation
    • Typical Local DC: 2.0 PUE, 10 MW consumption, $0.10/kWh power costs, $8.8M power bill
    • Efficient Local DC: 1.35 PUE, 6.75 MW consumption, $0.10/kWh power costs, $5.9M power bill
    • Ultra-Efficient: 1.06 PUE, 5.3 MW consumption, $0.05/kWh power costs, $2.3M power bill
    Potential cost savings of 74% and energy savings of 47% through facility changes alone!
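A worked version of the comparison above, using only the figures on the slide (5 MW IT load, 24x7 operation, and the stated PUE and power rates); the scenario names mirror the slide:

    # Recompute the three power-bill scenarios from the slide's assumptions.
    IT_LOAD_MW = 5.0
    HOURS_PER_YEAR = 24 * 365          # 24x7 operation

    scenarios = {
        "Typical Local DC":   {"pue": 2.00, "rate": 0.10},  # rate in $/kWh
        "Efficient Local DC": {"pue": 1.35, "rate": 0.10},
        "Ultra-Efficient":    {"pue": 1.06, "rate": 0.05},
    }

    bills = {}
    for name, s in scenarios.items():
        mw = IT_LOAD_MW * s["pue"]                  # total facility draw
        kwh = mw * 1_000 * HOURS_PER_YEAR           # annual energy
        bills[name] = kwh * s["rate"]
        print(f"{name}: {mw:.2f} MW, ${bills[name] / 1e6:.1f}M per year")

    cost_saving = 1 - bills["Ultra-Efficient"] / bills["Typical Local DC"]
    energy_saving = 1 - 1.06 / 2.00
    print(f"Cost savings: {cost_saving:.0%}, energy savings: {energy_saving:.0%}")
    # -> roughly 74% cost savings and 47% energy savings, matching the slide.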
16. "Anyone who knows all the answers most likely misunderstood the questions."