Data Center Lessons Learned

Data Center Lessons Learned at an Intel data center. Innovations in cost and energy savings in high-density data centers including: air economizer, retrofit of factory building, high efficiency air-cooled cabinets, and a container data center proof-of-concept.

Transcript

  • 1. Data Center Lessons Learned. CIO Council Meeting, Friday, March 27, 2009. Human Services Department, 729 St. Michaels Drive, Santa Fe, NM
  • 2. Bio: Tom Greenbaum, CDCP
    • Intel Information Technology employee for 8½ years, all of it on the Intel Rio Rancho campus
    • Data Center Operations Manager. Responsible for all Information Technology data center rooms on the Rio Rancho site
    • Room owner for the Encanto Supercomputer
    • Owner of Intel’s first Container Data Center
    Tom at the back of the Container Data Center
    • Speaking Events:
      • Data Center Lessons Learned. New Mexico Technology Council (NMTC), Santa Fe, NM, March 13, 2009.
      • Spherical Metaphor (Spheriphor) For Geoscope Multi-Dimensional Data Visualization. The 5th International Symposium on Digital Earth (ISDE5), UC Berkeley, June 5, 2007.
      • Full-Dome Immersive Business Visualization (BizViz). Intel Information Services and Technology Group, Technical Community Conference, Granlibakken Resort, Lake Tahoe, Oct. 24-26, 2005.
      • Application Analysis for Factory Network . Computer Security Institute (CSI) 31st Annual Conference, Washington D.C., November 8, 2004.
    • Interests:
      • The Art of Innovation, Process Improvement, CMMI, Information Visualization, 3D Animation, Raytracing, Haptics, Architectural Design, Light Electric Vehicles, Sustainable Design, Living Off-The-Grid, Digital Fine Art, Virtual Globes, Spherical and Geodesic Geometry, R. Buckminster Fuller, Synergetics, Dome Construction, Nanotechnology, Tensegrity, Stained Glass, Woodworking, Xeriscaping, Science Fiction
    Tom is President of the local Albuquerque AFCOM Chapter
    Tom’s personal website: karmatetra.com
  • 3. New Mexico Data Centers Energy Saving Innovations, Factory Reuse, High-Density Air-Cooled Cabinets, Air Economizer, Container Data Center, Supercomputer
    • Support the full range of IT customers including:
      • Design Engineering (chip design validation)
      • Office users
      • Manufacturing
      • Enterprise systems
    • Customer systems include:
      • Storage arrays
      • Tape backup systems
      • Virtual tape libraries
      • Virtual server environments
      • High memory servers
      • Large batch computing pools
    • In total, the IT data centers on the Rio Rancho site contain close to 7,000 servers
    • One of Intel's global Hub data centers
    • Supports Intel’s data center consolidation strategy
    • Several data center rooms with different missions and infrastructure
    Encanto Supercomputer 3D Rendering by Tom Greenbaum
  • 4. Solar Power Proof-of-Concept
    • Does NOT supply power to data centers: proof-of-concept for larger arrays
    • Photovoltaic (PV) array of 64 Sharp solar panels
    • Peak yield of 11.2 kW of direct current (DC) power
    • Intel Rio Rancho’s first solar energy system
    Over a 25-year lifespan, the Intel Rio Rancho solar system will offset an estimated 907,000 lbs of CO2, the leading greenhouse gas.
    Tom Greenbaum demonstrating his Electric Bicycle at the Solar Panel Ribbon Cutting Ceremony
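    A rough consistency check on that offset figure. The full-sun hours and grid emission factor below are illustrative assumptions, not numbers from the presentation:

      # Rough check on the solar PoC figures above. The full-sun hours and the
      # grid emission factor are assumed typical values, not from the slide.
      peak_dc_kw = 11.2                 # stated peak DC output of the 64-panel array
      lifespan_years = 25               # stated lifespan
      stated_offset_lbs = 907_000       # stated lifetime CO2 offset
      full_sun_hours_per_year = 2_000   # assumed for New Mexico
      grid_lbs_co2_per_kwh = 1.5        # assumed regional grid emission factor

      annual_kwh = peak_dc_kw * full_sun_hours_per_year          # ~22,400 kWh/year
      lifetime_offset_lbs = annual_kwh * grid_lbs_co2_per_kwh * lifespan_years
      print(f"Estimated lifetime offset: {lifetime_offset_lbs:,.0f} lbs CO2 "
            f"(slide states {stated_offset_lbs:,} lbs)")

    With those assumptions the estimate lands within roughly 10% of the 907,000 lbs stated above.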
  • 5. Factory Reuse
    • Retrofit Existing Factory Building
    • Cost Avoidance Of Millions Of Dollars
    • 3 Story Factory Conversion
      • 1st Floor UPS/Battery Electrical Room
      • 2nd Floor Data Center
      • 3rd Floor Air Handling And Cooling
    • 6 Megawatts Of IT Power
    • Innovative Air-Cooled Design
      • No Computer Room AC (CRAC)
      • No Raised Metal Floor (RMF)
      • Chimney Racks
    • Strong Load-bearing Floor Enabled High-density Computing
      • 2,220 lbs Per Rack
      • 30 kW Per Rack
    • Managed Cable Overhead
    Schematic Building Section by Tom Greenbaum
    Rio Rancho data center rooms (Data Center, Area in Sq Ft, Notes):
      • Container DC, 480 sq ft: Proof-of-Concept for mobile data center, non-production, validate cost savings, JIT compute, DR, energy savings.
      • Encanto SC, 55,000 sq ft: Home to Encanto Supercomputer, future home of IT GHDC Phase 3.
      • Trailer DC, 1,000 sq ft: Trailer data center. Subject of Air Economizer PoC, whitepaper and video. EOL - completely decommissioned Q4 2008.
      • Mission Critical 2, 1,713 sq ft: RMF, CRAC units, redundant cooling, generator power, mission critical, storage, manufacturing, Office/Enterprise.
      • Mission Critical 1, 2,047 sq ft: RMF, CRAC units, redundant cooling, generator power, mission critical, storage, tape backup, VTL, EC and Office/Enterprise.
      • HPDC, 6,775 sq ft: High-density high-performance, over 6,000 EC blade servers, non-production virtual servers.
      • MFG Support, 1,261 sq ft: Manufacturing support systems, moved to Mission Critical 2. EOL - completely decommissioned Q4 2008.
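    The per-rack figures above imply a rough sizing bound for the 6 MW floor. A minimal sketch, assuming every rack runs at the full 30 kW rating (an assumption; production rooms mix densities):

      # Back-of-the-envelope rack count for the converted factory floor.
      it_power_kw = 6_000        # 6 megawatts of IT power (from the slide)
      kw_per_rack = 30           # stated per-rack power density
      lbs_per_rack = 2_220       # stated per-rack weight

      max_racks = it_power_kw / kw_per_rack                # -> 200 racks at full density
      total_floor_load_lbs = max_racks * lbs_per_rack      # combined rack weight
      print(f"Racks supportable at full density: {max_racks:.0f}")
      print(f"Combined rack weight at that count: {total_floor_load_lbs:,.0f} lbs")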
  • 6. High-Density Air Cooled Cabinets
    • Optimized Air Management
      • Chimney cabinets closely ducted to air handling units
      • Reduce leakage to 1%
      • Reduce obstructions to air flow at rear of servers
      • Increase pressure differential to draw more air through cabinets.
      • Result = maximum kW per cabinet
    • Heated air is contained and isolated from the cooling air
    • Equalize temperature of available air at both top and bottom of cabinet
    • Data Center room at a comfortable 68 °F
    • Heat transfer from hot air to air handling unit (AHU) cooling coils is maximized
    Achieve Maximum Cooling Capacity At Lowest Total Operational Cost With Air-Cooled “Chimney” Cabinets
    Row of High-Density Blade Server Chimney Cabinets in HPDC
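    To see why leakage and pressure differential matter, here is a back-of-the-envelope airflow estimate for a single 30 kW chimney cabinet, using the standard sensible-heat rule of thumb. The 25 °F server delta-T and sea-level air density are assumptions, not measurements from the room:

      # Approximate airflow needed to carry the heat out of one 30 kW cabinet,
      # using CFM ~= BTU/hr / (1.08 * dT_F). The 1.08 factor assumes sea-level
      # air density; the delta-T across the servers is an assumed value.
      watts_per_cabinet = 30_000        # 30 kW per rack (from the Factory Reuse slide)
      delta_t_f = 25                    # assumed temperature rise across the servers, F

      btu_per_hr = watts_per_cabinet * 3.412
      cfm_required = btu_per_hr / (1.08 * delta_t_f)       # ~3,800 CFM
      print(f"Approx. airflow per 30 kW cabinet: {cfm_required:,.0f} CFM at a {delta_t_f} F rise")

    At roughly 3,800 CFM per cabinet, even a few percent of leakage or recirculation is a meaningful loss, which is why the design targets 1% leakage.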
  • 7. Air Economizer Proof Of Concept
    • Used an air economizer to cool 900 high-performance production servers
    • Cooled with 100% outside air at temperatures up to 90 °F
    • Data showed that nearly all DC cooling could be supplied with an air economizer, greatly reducing power consumption
    • Potential savings for a 10-MW DC:
      • Reduce annual operating costs by up to $2,870,000
      • Save 76,000,000 gallons of water annually
    • In 2008 the PoC won the prestigious Intel Innovation Award
    • Received a lot of industry praise as leading the way towards reduced power consumption and Green Computing
    Challenged the current industry assumption that the usefulness of air economizers is limited by the need to supply cooling air at a relatively low temperature.
    Trailer Data Center 3D rendering by Tom Greenbaum
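    A rough order-of-magnitude check on that 10-MW savings estimate. The cooling overhead, economizer availability, and electricity price below are illustrative assumptions, not figures from the PoC:

      # Order-of-magnitude check on the economizer savings claim for a 10 MW
      # data center. All three assumed inputs are for illustration only.
      it_load_kw = 10_000            # 10 MW of IT load (from the slide)
      chiller_kw_per_it_kw = 0.5     # assumed mechanical-cooling overhead per kW of IT
      economizer_fraction = 0.90     # assumed share of hours outside air can do the cooling
      price_per_kwh = 0.08           # assumed electricity price, $/kWh

      saved_kwh = it_load_kw * chiller_kw_per_it_kw * 8_760 * economizer_fraction
      saved_dollars = saved_kwh * price_per_kwh
      print(f"Estimated annual savings: ${saved_dollars:,.0f}")

    With these assumptions the estimate comes out near $3.2M per year, the same ballpark as the $2,870,000 quoted above.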
  • 8. Container Data Center (CDC) Proof Of Concept
    • Evaluation of containerized technology and capabilities
    • 40’ ISO standard roll-on, roll-off shipping container containing:
      • Computer racks
      • Electrical distribution
      • Cooling coils
      • High density blade servers
      • 3,520 compute nodes
    • Cost savings
      • Construction
      • Energy
      • Operational
    • Just-In-Time compute
    • Disaster recovery
    • Closely matching cooling capacity to IT power consumption can achieve improved energy efficiency
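    One way to express that last point is Power Usage Effectiveness (PUE), total facility power divided by IT power. A minimal illustration with hypothetical loads (none of these numbers come from the container PoC):

      # Why matching cooling capacity to IT load improves efficiency, expressed
      # as PUE = total facility power / IT power. Loads below are hypothetical.
      it_load_kw = 500                 # assumed IT load inside the container
      oversized_cooling_kw = 250       # assumed cooling + fan power with an oversized plant
      right_sized_cooling_kw = 100     # assumed cooling + fan power when matched to load

      pue_oversized = (it_load_kw + oversized_cooling_kw) / it_load_kw    # -> 1.50
      pue_matched = (it_load_kw + right_sized_cooling_kw) / it_load_kw    # -> 1.20
      print(f"PUE with oversized cooling: {pue_oversized:.2f}")
      print(f"PUE with matched cooling:   {pue_matched:.2f}")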
  • 9. Encanto Supercomputer
    • Debuted as the 3rd fastest in the world (now ranked 12th).
    • Entire installation of the supercomputer took only four days!
    • Cost avoidance of building a separate data center
    • Reuse of factory power, cooling and architectural infrastructure
      • 1 megawatt
      • 600 GPM chilled water system
      • Load bearing floor
    • 3 Exemplar systems (2.1 TFLOPS peak)
      • UNMHPC at UNM
      • NMSU
      • NMIT
    • 172 TFLOPS Altix ICE 8200 Cluster (SGI)
      • 1792 Nodes
      • 3,584 Sockets
      • 14,336 Cores
      • 28 Racks
      • 28.7 TB memory
      • 172 TB parallel file system (10 GB/sec)
      • 20 TB NFS Backup Storage system
    Governor Richardson shaking hands with Tom Greenbaum at ribbon cutting
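    A quick arithmetic check that the cluster figures above hang together; only the published totals come from the slide, the per-node and per-core values are derived from them:

      # Consistency check on the Altix ICE 8200 figures listed above.
      nodes = 1_792
      sockets = 3_584
      cores = 14_336
      peak_tflops = 172

      sockets_per_node = sockets / nodes            # -> 2 sockets per node
      cores_per_socket = cores / sockets            # -> 4 cores per socket
      gflops_per_core = peak_tflops * 1_000 / cores # -> ~12 GFLOPS per core at peak
      print(f"{sockets_per_node:.0f} sockets/node, {cores_per_socket:.0f} cores/socket, "
            f"{gflops_per_core:.1f} GFLOPS per core at peak")

    Two quad-core sockets per node at about 12 GFLOPS per core are consistent with the quad-core Xeon blades used in Altix ICE 8200 systems of that era.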
  • 10. Encanto Supercomputer (continued)
    Governor Bill Richardson Announces Partnerships & New Mexico’s Supercomputer to Create New High-Tech Jobs