Datacenter Revolution Dean Nelson, Sun



  1. Presented at ISA Summit, Stockholm, Sweden, March 19, 2009. Dean Nelson, Sr. Director, Global Lab & Datacenter Design Services (GDS)
  2. Unique Challenge
  3. Demand and Capacity Are Colliding... and Data Centers Are Right in the Middle!
     > Power density has climbed from 40 W/ft² (430 W/m²) in 2003 to 120 W/ft² (1,300 W/m²) in 2005, and next-generation equipment is heading toward 800 W/ft² (8,600 W/m²)
     > Demand drivers: costs, users, space, services, heat, access
  4. Moore's Law in Action (systems at peak utilization):
     > E10K (1997): 64 threads, full-rack footprint, 9,620 W, 1,800 lbs., ~150k tpm
     > 5140: 128 threads (16 cores), roughly 1/30th the footprint, 720 W, ~300k tpm (about 2x the performance)
     > 6048 full rack: 768 cores, 28.5-30 kW, ~2,200 lbs.
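The jump on this slide can be sanity-checked with simple arithmetic. This is a sketch using only the slide's tpm and wattage figures; the tpm-per-watt framing is mine, not the slide's:

```python
# Compare performance-per-watt of the 1997 E10K and the 5140, per the slide's figures.
e10k_tpm, e10k_watts = 150_000, 9_620     # full-rack system, 1997
x5140_tpm, x5140_watts = 300_000, 720     # figures from the slide

e10k_eff = e10k_tpm / e10k_watts          # ~15.6 tpm per watt
x5140_eff = x5140_tpm / x5140_watts       # ~416.7 tpm per watt
improvement = x5140_eff / e10k_eff        # roughly 27x better performance per watt
```

So "2x the performance" understates the real shift: at a thirtieth of the power, the newer box delivers on the order of 27x the transactions per watt.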
  5. Reality: Heterogeneous Data Centers. The industry average is 4-6 kW per cabinet, but >20 kW "skyscraper" racks will be integrated alongside it; designs must handle a mixed-load environment.
  6. Why is this topic important?
  7. Unprecedented Activity
     > Sun datacenter briefings over 17 months (07/07-2/09): >675, an average of ~8 per week
     > California: >4,000 people representing >400 customer companies have engaged in briefings and toured the Santa Clara (CA), India and UK datacenters in 15 months
     > Colorado: almost 1,000 people in less than two months
     > Challenges raised: power, cooling, space, connectivity and utility costs
     > Interests: investment protection, future-proofing, efficiency
     > Investments: 21 of these companies are spending $19B on datacenter projects in the US alone, and that does not include Microsoft, Google, Facebook or DRT
  8. A different perspective: a single server, usually on 24x7, is responsible for about the same amount of CO2 as a typical automobile driven for a year
     > Server: 440 W, 3,942 kWh/year, 5.3 tonnes CO2
     > Auto travel: Toyota Camry, 15,000 miles/year (24,000 km/year), 5.2 tonnes CO2
     > Air travel: commercial airliner, Vancouver-Toronto (7 trips), 5.3 tonnes CO2
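The server side of this comparison can be reproduced as follows. A sketch only: the grid carbon intensity of ~1.34 kg CO2/kWh is an assumption back-derived from the slide's own 3,942 kWh and 5.3 t figures, not a number the slide states:

```python
# Reproduce the slide's CO2 estimate for a single server running 24x7.
kwh_per_year = 3_942          # slide figure; slightly above 440 W nameplate, 24x7
co2_kg_per_kwh = 1.34         # ASSUMED grid intensity, back-derived from the slide

hours_per_year = 24 * 365
avg_watts = kwh_per_year / hours_per_year * 1000        # ~450 W continuous draw
tonnes_co2 = kwh_per_year * co2_kg_per_kwh / 1000       # ~5.3 tonnes, matching the slide
```

Note the implied continuous draw (~450 W) is a bit above the 440 W nameplate, suggesting the slide's kWh figure folds in some overhead.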
  9. A different perspective: a single server is responsible for about the same amount of CO2 as a typical automobile driven for a year, BUT Moore's Law mandates efficiency gains. Over a 10-year period, the equivalent automotive efficiency improvement would be 163 MPG!
  10. Changing Priorities & Drivers
     > $15B investment (including a $1.2B solar project)
     > First carbon-neutral, waste-free, car-free city
     > Investment in solar innovation will change the industry
  11. Floating Data Centers
     > Tier 1-Tier 3 ECO datacenters at US and international ports
     > Capacity: 4,000 racks and over 350 Sun MDs
     > 75 MW of power, free cooling from ocean water
     > Six months time to market, up to 40% less than a traditional build
     > At the end of a dock instead of the end of a street
  12. Strategy
  13. A New Age: from the Industrial Age (global production, global consumption), through the Information Age (all things connected, unprecedented consumption), to the Participation Age (unprecedented contribution, and a building data storm).
  14. Top 20 Social Networks: 1.3 billion users and growing. Source: http://en.wikipedia.org/wiki/List_of_social_networking_websites (as of 11/10/2008)
  15. The Shift [chart, 1990-2010, log scale from 0.1 to 1m]: workloads above the Moore's Law curve are under-served and treat IT as a weapon (internet infrastructure, high performance computing, software as services, global consumer services); core enterprise apps below the curve are over-served and treat IT as a cost.
  16. Innovate
  17. A moment of silence...
     > Raised floors are dead: no longer required, go against physics, increasingly cumbersome, expensive
     > Next-generation equipment requires a new way of thinking...
  18. Pod Architecture: modular data center building blocks, container and/or brick & mortar
  19. Modular Pod Components
     > Physical design: influenced by cooling; brick & mortar and/or container
     > Cooling: closely coupled, in-row or overhead, hot-aisle containment, and passive
     > Power distribution: overhead or under-floor busway
     > Cabling: localized switching in each pod
  20. Cooling in the Sun Modular Datacenter
     > Integrated cooling modules
     > Circular airflow, 5 cooling zones per module
     > Variable-speed fans, controlled on a per-fan basis
     > Handles densities up to 25 kW/rack
  21. Sun Pod Architecture
  22. Sun Pod Architecture
  23. Closeup: Power Distribution. Modular overhead, hot-pluggable busway with conductors to handle multiple voltages and phases:
     > Requires no floor space or cooling; transformers moved outside the datacenter
     > Snap-in cans with short whips; non-disruptive
     > Reduced copper consumption, no in-place abandonment
     > Significant time reduction – from months to minutes
  24. Closeup: Power Distribution. Modular overhead, hot-pluggable busway with conductors to handle multiple voltages and phases:
     > Supports multiple Tier levels (use multiple busways)
     > Scalable by removing jumpers
  25. Act
  26. Design Services – Holistic Solution. Global Lab and Datacenter Design Services (GDS) bridges the gap between Facilities and IT/Eng:
     > Central design competency center
     > Understands IT/Eng requirements, speaks facilities' language
     > Design services provide a holistic solution
     > Aligned business organizations: http://www.sun.com/aboutsun/environment/docs/aligning_business_organizations.pdf
  27. History: Sun's Internal Challenge
     > Facilities is Sun's second largest expense: real estate, utility, tax, and support costs
     > 20+ years of organic growth: new products, reorgs, acquisitions; lack of design standards and control of implementations for global technical infrastructure; duplication and inefficiencies
     > Multi-billion dollar IT/R&D technical infrastructure portfolio: 860k ft² (80k m²) of Eng and IT space globally (reduced from 1.4M ft² / 130k m²); 1,068 individual rooms (reduced from 1,685)
     > IT space = 17% of the portfolio (143k ft² / 13k m², 275 rooms); Engineering/Services = 83% (718k ft² / 67k m², 793 rooms)
  28. Corporate Drive to Reduce Costs – Spotlight: Santa Clara, CA, 2007
     > Shed 1.8M ft² (167k m²) of real estate
     > Compress 202k ft² (18.8k m²) of datacenter space into <80k ft² (7,400 m²) of new datacenter space in Santa Clara
     > Project included every major business unit in Sun; 12-month duration (complete by 06/30/2007)
     > Phase I: move 84k ft² (7,800 m²) of datacenters into existing space
     > Phase II: compress and build <80k ft² (7,400 m²) of next-generation datacenter space in 12 months
  29. Spotlight: Hardware Replacement
     > 88% space compression: 550 racks to 65
     > 61% utility reduction; 2.2 MW down to 500 kW
     > $9M cost avoidance; 450% compute increase
     > 3,227 tons CO2 reduced – the equivalent of 312 cars off the road
     > Minimal downtime; completed in 3 months
     > 2:1 server, 3:1 storage; 2,915 devices replaced
     > Reality: the ROI sweet spot is a >4:1 replacement ratio
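The headline ratios here follow from the slide's own rack and power counts. A sketch; note the 61% utility reduction is presumably measured energy, which is why it differs from the ~77% peak-power delta computed below:

```python
# Sanity-check the hardware-replacement slide's headline ratios.
racks_before, racks_after = 550, 65
space_compression = 1 - racks_after / racks_before   # ~0.88, i.e. "88% space compression"

kw_before, kw_after = 2_200, 500
peak_power_cut = 1 - kw_after / kw_before            # ~0.77 peak reduction; the slide's 61%
                                                     # utility figure is measured energy
```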
  30. Bring Out Your Dead...
  31. Global Consolidation
     > $250M investment; 41% global datacenter space compression (1.44M ft² to 858k ft²)
     > Scalable/future-proof: 9 MW to 21 MW (CA), 7 MW to 10 MW (CO)
     > Largest Liebert/APC installs; 15 buildings to 2; 152 datacenters to 14
     > Sites in China, India, UK, Czech Republic, Norway
     > $1.2M utility rebates; $250k innovation award
     > Enabled company pace; reduced opex 30% (CA)
  32. Colorado DC Consolidation – the largest, most complex and costly consolidation in Sun's history
     > 66% space compression: 496k ft² to 126k ft²
     > Scalable/future-proof: 7 MW to 10 MW
     > First and largest Liebert XD dynamic cooling install
     > Water treatment saves 600k gallons/year and eliminates chemicals
     > Waterside economizer provides free cooling >1/3 of the year
     > Compressed 165k ft² of raised floor to <700 ft² ($4M cost avoidance)
     > Flywheel UPS eliminates batteries
     > Chillers 32% more efficient at average load than the ASHRAE standard
     > 2 ACE Awards
     > Removed 1M kWh per month – 5% of Sun's global carbon footprint
  33. Share
  34. Power Usage Effectiveness (PUE) – SCA11-1500 Data Center Efficiency Benchmark

     SCA11-1500 (Software Datacenter) power use:
     > IT Load: 798 kW (78.02%)
     > Chiller Plant: 126 kW (12.28%)
     > RC/CRAC Loads: 39 kW (3.84%)
     > UPS/Transformer Loss: 39 kW (3.86%)
     > Lighting: 20 kW (2.00%)
     > Total Load: 1,023 kW (total support loads: 225 kW); PUE 1.28, DCiE 78%

     Typical data center (industry target):
     > IT Load: 798 kW (50.00%)
     > Chiller Plant: 399 kW (25.00%)
     > RC/CRAC Loads: 192 kW (12.00%)
     > UPS/Transformer Loss: 160 kW (10.00%)
     > Lighting: 48 kW (3.00%)
     > Total Load: 1,596 kW (total support loads: 798 kW); PUE 2.00, DCiE 50%

     Results:
     > 573 kW less support power compared to the industry PUE target (2)*
     > 36% more efficient than the industry PUE target, and almost 50% better than the industry PUE average (2.5)*
     > $400,000 annual opex savings compared to a typical data center ($0.08/kWh)
     * Industry average & target from the Uptime Institute: http://www.datacenterknowledge.com/archives/2008/Jan/22/case_study_ups_green_data_center.html
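The slide's PUE, DCiE and savings figures all follow from its load table. This sketch reproduces the arithmetic; the $0.08/kWh price and the 24x7 assumption are the slide's own:

```python
# PUE = total facility power / IT power; DCiE = IT power / total facility power.
sca11 = {"it": 798, "chiller": 126, "crac": 39, "ups_xfmr": 39, "lighting": 20}  # kW

total = sum(sca11.values())              # 1,022 kW (the slide rounds to 1,023)
pue = total / sca11["it"]                # ~1.28
dcie = sca11["it"] / total               # ~0.78, i.e. 78%

# Annual savings vs. a typical PUE-2.0 facility, where support load equals IT load.
support = total - sca11["it"]            # ~224 kW of support load
typical_support = sca11["it"]            # 798 kW at PUE 2.0
savings = (typical_support - support) * 8_760 * 0.08   # ~$400k/year at $0.08/kWh
```

The ~574 kW gap between the two support loads is exactly the slide's "573 kW less support power" claim, and pricing that gap at $0.08/kWh for a year yields the $400k figure.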
  35. Best Practices = Competitive Weapon
     > Align Facilities, IT & Engineering – partnering nets significant short- and long-term savings: http://www.sun.com/aboutsun/environment/docs/aligning_business_organizations.pdf
     > Hardware replacement – apply new hardware solutions and extend the life of your DC: http://www.sun.com/aboutsun/environment/docs/creating_energy_efficient_dchw_consolidation.pdf
     > Simplify datacenter design with the pod concept:
       Power (modular, scalable, smart): http://www.sun.com/aboutsun/environment/docs/powering_energy_efficientdc.pdf
       Cooling (adaptable, scalable, smart): http://www.sun.com/aboutsun/environment/docs/cooling_energy_effiicientdc.pdf
       Cabling (distributed vs centralized): http://www.sun.com/aboutsun/environment/docs/connecting_energy_efficientdc.pdf
       Measurement (power to control): http://www.sun.com/aboutsun/environment/docs/accurately_measure_dcpower.pdf
     > Data center tour videos – California: http://www.sun.com/aboutsun/environment/media/datacenter_tour.xml ; Colorado: http://www.sun.com/featured-articles/2009-0126/feature/index.jsp
  36. Sun Blueprints
     > First chapter (Modularity) released June 10, 2008; second chapter (Electrical) released March 10, 2009
     > A total of nine chapters to be released over the next 12 months
     > Download: http://sun.com/blueprints
  37. Participate & Contribute
  38. Hosted Chill-Off
     > Load: up to 320 Sun V20z 1U servers – legacy (now EOL) servers selected to create a hostile environment; max load 10 kW per rack
     > Rittal LCP+: contained rack, water-based
     > APC InRow RC: in-row architecture with thermal containment, water-based
     > Liebert XDV+: overhead, passive, refrigerant-based
     > IBM/Vette RDHX: heat-exchanger door, water-based
     > Spraycool: chip-level cooling, Fluorinert-based
  39. Chill-Off Results [performance chart, ranked from less efficient to more efficient]
  40. Chill-Off 2 [performance chart placeholder, less efficient to more efficient]
  41. Chill-Off 2 test participants (including Clustered Systems), testing support, reviewers, and sponsors [logo slide]
  42. Chill-Off 2 Close-Up... Ultra-efficient, fanless, high-density servers that plug into the pod infrastructure. Breaking our own warranty: we are experimenting with this type of configuration in the chill-off now...
  43. Data Center End User Community
     > 785 members, 481 companies, 39 countries, 59 industries* – an exclusive group of global datacenter owners, operators and users influencing the datacenter industry through the end-user lens
     > http://datacenterpulse.org
     * Membership stats as of 9:05pm, 03/18/2009.
  44. Open, Global, Focused
     > Formed Sept 2008; first global summit held in CA, February 2009
     > Un-conference format: topics defined and driven by members, in-person & online
     > Selected topics: Top 10, Metrics, Certification, Cloud, Industry Alignment, Fanless Servers, Power
     > Website: http://www.datacenterpulse.org ; videos: http://www.youtube.com/user/datacenterpulse
     > Join the group – owner/operators: http://www.linkedin.com/groups?gid=841187 ; industry: http://www.linkedin.com/groups?gid=1315947
  45. The Top 10 (Feb/2009)
     1) Align Industry Organizations
     2) Data Center Certification Standard
     3) Standard Data Center Stack
     4) Update or Dump Tier Levels
     5) More Products with Modularity
     6) Simple Top Level Efficiency Metric
     7) End to End IT/Facilities Measurement
     8) Standard Conductive Cooling Interface
     9) 480V/277V Power Supplies
     10) Independent Data Center Repository
  46. Draft: Standard Data Center Stack – Industry Alignment
  47. Thank You. Dean Nelson, Sr. Director of Global Lab & Datacenter Design Services (GDS), dean.nelson@sun.com, http://blogs.sun.com/geekism
