How Green Is Your Data Center

Presentation for IT professionals on improving energy efficiency in data centers.

  1. How green is your Data Center?
     David Gillespie, AIA, CSI, LEED AP
     Director: Communications & Technology
     CRS Engineering & Design Consultants
  2. How green is your Data Center?
     • Green as in environmentally conscious
     • Green as in money (saved or spent)
     • Green as in solar powered
     • Green as in reduced carbon footprint
     • Green as in those little green lights
     • Green as in ???
     Answer: YES
  3. Commercial Building Resource Impact
     • Energy experts estimate that data centers gobble up somewhere between 1.5 percent and 3 percent of all electricity generated in the United States.
     • According to the Uptime Institute, more than 60 percent of the power used to cool equipment in the data center is completely wasted.
  4. Perception is NOT Reality
  5. Data Center "Deficiencies" (read: opportunities)
     1. Insufficient power: 29%
     2. Excessive heat: 29%
     3. Insufficient raised floor area: 21%
     4. Other factors*: 13%
     5. Poor location: 6%
     6. Excessive facility cost: 3%
     * Other factors (read: human error) include absent or ineffective management policies and practices. © 2005 Gartner
  6. Where's the Problem? Competing priorities
     IT Department:
     • Computing Services
     • Applications and Servers
     • Data Storage/Retrieval
     • Services Management
     • Disaster Avoidance/Recovery
     • Business Continuity
     Facilities Management:
     • Life Safety/Fire Protection
     • Security
     • Building systems: M/E/P
     • Infrastructure support
     • Control and Monitoring
     • O&M
     • Commissioning and Testing
     • Disaster Avoidance/Recovery
     Mind the Gap
  7. Network-Critical Physical Infrastructure (NCPI)
     [Diagram: NCPI elements, including Fire & Safety]
  8. The Numbers*
     • 90% of US data centers will experience a service interruption that will affect business operations in the next 5 years.
     • 80% of data centers surveyed have experienced a service interruption in the last 5 years, and of those, 82% were power related.
     • 77% of US data center managers believe they will have to make major physical improvements or be forced to relocate their data center in the next 10 years.
     • 53% of US data centers are planning an expansion in the next 5 years.
     • 42% believe business growth will be the largest factor in data center changes over the next 10 years, followed by facility age and regulatory compliance.
     • Survey says: the 2 largest data center problems are power and heat.
     * © 2006 Data Center Institute Keynote Address
  9. We have a problem…
     • Original data center layout sketch from client
     • Space is overheating
     • Both CRAC units (2 units, 10 tons each) are in operation 24/7
     • Power panels have capacity
     • UPS is only loaded to 38.5%
     • FYI: Two new blade server racks ship in three days
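A quick headroom check one might run from the figures on this slide. This is my own sketch, not the deck's analysis: the 20 tons of installed cooling comes from the slide, while the UPS rating, power factor, and blade-rack draw are assumptions for illustration.

```python
# Headroom check for the scenario above. Installed cooling (2 x 10 ton CRAC
# units) is from the slide; the UPS rating, voltage/power factor and blade
# rack draw are hypothetical.

BTU_PER_HR_PER_WATT = 3.412
BTU_PER_HR_PER_TON = 12_000.0

def headroom_tons(installed_tons: float, it_load_watts: float) -> float:
    """Cooling capacity left over after absorbing the IT heat load, in tons."""
    load_tons = it_load_watts * BTU_PER_HR_PER_WATT / BTU_PER_HR_PER_TON
    return installed_tons - load_tons

# Hypothetical figures: an 80 kVA UPS at 0.9 power factor loaded to 38.5%,
# plus two incoming blade racks at an assumed 5 kW each.
current_load_w = 80_000 * 0.9 * 0.385        # ~27.7 kW
future_load_w = current_load_w + 2 * 5_000   # ~37.7 kW

print(headroom_tons(20.0, current_load_w))   # ~12.1 tons spare on paper
print(headroom_tons(20.0, future_load_w))    # ~9.3 tons spare on paper
# Plenty of nominal capacity, yet the room still overheats: total tonnage
# isn't the whole story; air distribution (hot/cold aisle) matters too.
```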
  10. Where can we start? Techniques for data collection
     • Template calculators (IT electrical, cooling, service, power factors, generators, UPS, growth, etc.)
     • Summarize nameplate data: inventory all IT equipment and assign power factors.
     • Field measurement: collect temperature, power consumption and air flow in the field. (Timing is critical.)
     • Consumption records (UPS data).
     • Trend analysis of computing schedules.
     • Interview stakeholders.
     • RESULT: total power (in watts) consumed by the data center.
     Power consumed = Heat generated = Cooling required
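A minimal sketch of the "Power consumed = Heat generated = Cooling required" conversion the slide ends with, using the standard constants 1 W ≈ 3.412 BTU/hr and 1 ton of cooling = 12,000 BTU/hr. The overhead factor and the 45 kW example load are assumptions, not figures from the deck.

```python
# Convert total IT power draw into heat output and required cooling.
# 1 W of IT load dissipates ~3.412 BTU/hr; 1 ton of cooling absorbs 12,000 BTU/hr.

BTU_PER_HR_PER_WATT = 3.412
BTU_PER_HR_PER_TON = 12_000.0

def cooling_required(total_it_watts: float, overhead_factor: float = 1.0) -> dict:
    """Heat generated and cooling required for a given IT load.

    overhead_factor is an assumed multiplier for non-IT loads in the space
    (lighting, people, UPS losses); 1.0 means IT load only.
    """
    watts = total_it_watts * overhead_factor
    btu_per_hr = watts * BTU_PER_HR_PER_WATT
    tons = btu_per_hr / BTU_PER_HR_PER_TON
    return {"watts": watts, "btu_per_hr": btu_per_hr, "cooling_tons": tons}

# Example: 45 kW of inventoried IT load (hypothetical figure)
print(cooling_required(45_000))
# -> roughly 153,540 BTU/hr, i.e. about 12.8 tons of cooling
```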
  11. Network Inventory: HELP…
     To help determine what caused the current problems, and to help avoid future issues, the IT department forwards its latest equipment inventory for the data center. This is a typical IT management document.
     That's a nice spreadsheet, but what does all this mean?
  12. Equipment Inventory Example: Single Rack
     [Rack elevation diagram, front and rear views: B-LINE Systems Access rack, 47 U, 19" EIA, 23" wide]
     • (3) Dell PowerConnect 2216
     • (4) Dell PowerEdge 3250
     • (4) Dell PowerEdge 4600
     • (4) Dell PowerVault 124T
     • (1) Dell PowerVault 745N
     • (1) Dell 310-4227
     • (3) Dell RPS-600
     Rack totals: 43.0 Amps, 12,040 BTU/hr, 700 lbs.
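A rough sketch of how an inventory spreadsheet like the one above can be rolled up into per-rack totals (watts, amps, BTU/hr). The Device structure, the derating factor, and the example entries are hypothetical; this is not the deck's template calculator.

```python
# Roll an IT equipment inventory up into per-rack electrical and heat totals.
# Nameplate figures and the derating factor below are illustrative only.

from dataclasses import dataclass

BTU_PER_HR_PER_WATT = 3.412

@dataclass
class Device:
    model: str
    rack: str
    nameplate_watts: float   # from the manufacturer's label or spec sheet
    derating: float = 0.6    # assumed fraction of nameplate actually drawn

def rack_totals(devices: list[Device], volts: float = 120.0) -> dict[str, dict]:
    """Sum estimated watts, amps and heat output per rack."""
    totals: dict[str, dict] = {}
    for d in devices:
        watts = d.nameplate_watts * d.derating
        t = totals.setdefault(d.rack, {"watts": 0.0, "amps": 0.0, "btu_per_hr": 0.0})
        t["watts"] += watts
        t["amps"] += watts / volts
        t["btu_per_hr"] += watts * BTU_PER_HR_PER_WATT
    return totals

# Hypothetical entries for a single-rack example
inventory = [
    Device("PowerEdge 4600", "Rack A", 600.0),
    Device("PowerEdge 3250", "Rack A", 450.0),
    Device("PowerConnect 2216", "Rack A", 30.0),
]
print(rack_totals(inventory))
```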
  13. Example: Generic Data Center
     • Existing layout
  14. Modeling: Existing Conditions
     • Thermal model initial results: imminent failure
     • Air flow: (2) 10 ton CRAC units – 20 tons total
  15. Modeling: Hot/Cold Aisle
     • Solution: Add more cooling, change equipment orientation (cold aisle / hot aisle / cold aisle)
     • Result: Improved environment, not ideal
     • (3) 10 ton CRAC units – 30 tons total
  16. Solutions
     • Alternate layout and equipment adjustments
     • [Layout diagram: cold aisle arrangement]
     • (4) 7.5 ton CRAC units – 30 tons total
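The three CRAC configurations from slides 14-16 can be compared mechanically against a design heat load. In the sketch below the configurations come from the slides, but the ~260,000 BTU/hr future design load is an assumption chosen only to illustrate the N+1 question.

```python
# Compare CRAC configurations against an assumed design heat load, including a
# simple N+1 check (can the load still be carried with one unit down?).

BTU_PER_HR_PER_TON = 12_000.0

def evaluate(units: int, tons_each: float, load_btu_per_hr: float) -> dict:
    total_tons = units * tons_each
    n_minus_1_tons = (units - 1) * tons_each
    load_tons = load_btu_per_hr / BTU_PER_HR_PER_TON
    return {
        "config": f"({units}) x {tons_each} ton",
        "total_tons": total_tons,
        "meets_load": total_tons >= load_tons,
        "meets_load_n_minus_1": n_minus_1_tons >= load_tons,
    }

# Assumed future design load of ~260,000 BTU/hr (~21.7 tons), illustrative only
load = 260_000.0
for units, tons in [(2, 10.0), (3, 10.0), (4, 7.5)]:
    print(evaluate(units, tons, load))
# With this assumed load, the original (2) x 10 ton setup falls short; both
# 30-ton options cover it, but only the (4) x 7.5 ton layout also survives
# losing one unit.
```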
  17. Getting it Done
     • Critical Issues: Power, Heat, Space
     • Solutions: Infrastructure Matching
       - Assessment: power, cooling, sequencing
       - Forecast: growth (and gotchas)
       - Budget: short term, long term
       - Schedule: near view, far view
       - Deploy: modular and balanced
     • Reality: Budget and schedule are fixed (too small, but fixed)
     • Small steps, big finish
     So?
  18. The big "Ta-Da"
     • Alternatives: nothing is free, but these trade-offs can work
     • Consolidate: no more single-app boxes!
     • Right sizing: everything from power supplies to rack enclosures
     • Deploy power management
     • Optimize air flow: hot aisle, cold aisle
     • Know your thresholds: temperature and power
     • Use energy-efficient servers (some use 40% less power)
     • Use high-efficiency power supplies (see right sizing above)
     • Bridge the IT/Facilities gap
     • Do your homework: follow standards and benchmark performance
     • Get the people out!
     • Push harder: advocate for change from all sources (suppliers, manufacturers, providers, etc.). If you don't ask…
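A back-of-the-envelope estimate of what "servers that use 40% less power" can be worth per year. Server count, per-server draw, and the electricity rate below are assumptions for illustration, not figures from the deck.

```python
# Estimate yearly kWh, cost and cooling saved by more efficient servers.
# All input figures below are hypothetical.

HOURS_PER_YEAR = 8_760
BTU_PER_HR_PER_WATT = 3.412
BTU_PER_HR_PER_TON = 12_000.0

def annual_savings(servers: int, watts_each: float, reduction: float,
                   dollars_per_kwh: float) -> dict:
    """Savings from cutting per-server power draw by `reduction` (0-1)."""
    watts_saved = servers * watts_each * reduction
    kwh_saved = watts_saved * HOURS_PER_YEAR / 1_000
    return {
        "kw_saved": watts_saved / 1_000,
        "kwh_per_year": kwh_saved,
        "dollars_per_year": kwh_saved * dollars_per_kwh,
        "cooling_tons_freed": watts_saved * BTU_PER_HR_PER_WATT / BTU_PER_HR_PER_TON,
    }

# 40 servers at ~400 W each, 40% reduction, $0.10/kWh (all hypothetical)
print(annual_savings(40, 400.0, 0.40, 0.10))
# -> about 6.4 kW saved, ~56,000 kWh/yr, ~$5,600/yr, ~1.8 tons of cooling freed
```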
  19. The Answer?
     • Planning
     • Modular Design
     • Awareness
     • Management
     • System Redundancy
     • Right Sizing
     • All of the above
  20. Radical Common Sense (you already know this)
     • Model space for temperature, air flow, power distribution and capacity.
     • Identify single points of failure, infrastructure capacity issues and service interruption threats.
     • Utility assessment models: current, imminent, future.
     • Adjust equipment orientation (hot aisle/cold aisle).
     • Segregate power and data cabling.
     • Introduce redundancy.
     • Establish modular equipment standards (and follow them).
     • Construct and maintain a disaster avoidance/recovery plan.
     • Review deployment strategies.
     • Clean up! Eliminate all non-IT equipment, and people.
     • Look, Plan, Build, then Deploy (in that order).
  21. How green is your Data Center?
     David Gillespie, AIA, CSI, LEED AP
     Director: Communications & Technology
     CRS Engineering & Design Consultants
     205.276.3393
     dgillespie@crsegr.com
