Green Data Center

Speaker note: air is 21% oxygen; fire needs only 16%.

1. Building A Next Generation Data Center: It's Not Easy…
2. (image slide)
3. DATACENTER LIMITATIONS
4. MOORE'S LAW: The number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years. This trend has continued for more than half a century and is expected to continue until 2015–2020 or later.
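The doubling rule is easy to sketch numerically. A minimal illustration (the 275,000-transistor 80386 figure comes from the next slide; the doubling period and projection year are assumptions for the example):

```python
# Moore's law as a simple exponential: count doubles every `doubling_period` years.
def transistors(start_count, start_year, year, doubling_period=2):
    """Projected transistor count, assuming doubling every doubling_period years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Starting from the Intel 80386 (~275,000 transistors, 1985),
# a decade of doubling every two years gives 2**5 = 32x growth:
print(round(transistors(275_000, 1985, 1995)))  # 8800000 (~8.8 million)
```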
5. Intel 80386: 275,000 transistors · Pentium: 3.1 million transistors · Itanium 2 (quad-core): roughly 2 billion transistors
6. (image slide)
7. MOORE'S LAW: Power Density
8. - 1 processor, 4 GB RAM, 509 GB storage, 2 × 325 W
   - 2 processors, 12 GB RAM, 1.8 TB storage, 2 × 400 W
   - 192 processors, 768 GB RAM, SAN storage, 6 × 1200 W
9. (image slide)
10. (image slide)
11. Room over capacity; UPS at 90+% capacity; room temperature around 88°F; humidity @ ???; 5½ miles of undocumented Cat5 cable
12. (image slide)
13. OLD-SCHOOL DATACENTER: Hot Aisle / Cold Aisle
14. PRIORITIES
    - Design
    - Acceptable risk
15. NEXT GENERATION DATA CENTER: Facilities
    - Physical space
    - Design liabilities
    - Cable management
    - Security
    - Fire
    - Safety
    - Access
16. NEXT GENERATION DATA CENTER: Power and Environment
    - Power: 110 V or 220 V? UPS; generator
    - Environmental: cooling; humidity
    - Event/breach notification
    - Facilities and technology
17. PHYSICAL
    - Layout: flood risk, sewer pipes, water pipes, furnace proximity
    - Flooring: raised or solid
18. SECURITY
19. ACCESS
20. FIRE 101
21. SAFETY
    HALON
    - Stops the chain reaction
    - Leaves no residue
    - Is a CFC
    - Production stopped in 1994
    - Still in use today (grandfathered)
    FM200
    - Replacement for Halon
    - Stops the chain reaction
22. SAFETY (image slide)
27. NOTIFICATIONS
28. 110 V vs. 220 V
    - 110 V: split-phase, 60 Hz (North America); 20% less efficient in generation; requires 30% larger windings; 10–15% less efficient in transmission
    - 220 V: standard in Europe; draws half the amps
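The "half the amps" claim follows directly from P = V × I: at the same power, doubling the voltage halves the current. A minimal sketch (the 2200 W load is a hypothetical example, and power factor is ignored for simplicity):

```python
# Current drawn at a given power: I = P / V (resistive load assumed).
def amps(power_watts, volts):
    return power_watts / volts

load = 2200  # watts -- hypothetical rack load
print(amps(load, 110))  # 20.0 A at 110 V
print(amps(load, 220))  # 10.0 A at 220 V -- half the current
```

Half the current also means thinner conductors and lower resistive (I²R) losses in the wiring, which is the efficiency argument behind 220 V distribution.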
29. I GOT THE POWER
    - Uninterruptible Power Supply: never "long enough"; batteries degrade over time; weigh against the cost of downtime
    - Generator: "unlimited" runtime; placed outdoors; test weekly
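Why a UPS is never "long enough" can be shown with a back-of-the-envelope runtime estimate. A simplified sketch with hypothetical numbers (real batteries derate non-linearly with load, temperature, and age):

```python
# Rough UPS runtime: usable energy divided by load, in minutes.
def runtime_minutes(battery_wh, load_watts, efficiency=0.9, aging_factor=1.0):
    """aging_factor < 1.0 models capacity lost as batteries degrade over time."""
    usable_wh = battery_wh * efficiency * aging_factor
    return usable_wh / load_watts * 60

# 5 kWh of batteries feeding a 6 kW load:
print(round(runtime_minutes(5000, 6000)))                     # 45 min when new
print(round(runtime_minutes(5000, 6000, aging_factor=0.6)))   # 27 min after degradation
```

The same load that held for 45 minutes on fresh batteries gives barely half that after the pack has aged, which is exactly why generators back the UPS and get tested weekly.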
30. COOLING, DEFINED
    - CRAC (Computer Room Air Conditioner): refrigerant-based; installed on the data center floor and connected to outside condensing units. Moves air through the data center via a fan system: delivers cool air to the servers, returns exhaust air from the room.
    - CRAH (Computer Room Air Handler): chilled-water based; installed on the data center floor and connected to an outside chiller plant. Moves air through the data center via a fan system: delivers cool air to the servers, returns exhaust air from the room.
    - Chiller: produces chilled water via a refrigeration process; delivers chilled water via pumps to the CRAH.
    - Humidifier: installed in the CRAC/CRAH to replace water lost before the air exits; controls static electricity; also available as standalone units.
31. WHERE IT GOES: POWER in, HEAT out
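Essentially every watt delivered to IT gear ends up as heat the cooling plant must remove, and HVAC capacity is quoted in BTU/hr or "tons" of cooling. A minimal conversion sketch (the 6 × 1200 W load is the blade chassis from slide 8; the conversion factors are standard: 1 W = 3.412 BTU/hr, 1 ton = 12,000 BTU/hr):

```python
# Convert an IT electrical load into the cooling capacity needed to remove it.
def cooling_tons(it_load_watts):
    btu_per_hr = it_load_watts * 3.412   # watts -> BTU/hr
    return btu_per_hr / 12_000           # BTU/hr -> tons of cooling

# The 6 x 1200 W chassis from the earlier slide:
print(round(cooling_tons(6 * 1200), 2))  # 2.05 tons of cooling for one chassis
```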
32. IDEAL ENVIRONMENT
    The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Technical Committee (TC) 9.9 revised its 2004 and 2008 data center guidelines based on green initiatives. The TC 9.9 2011 release specifies:
    - 20–80% relative humidity
    - 64.4°F–80.6°F (18°C–27°C) temperature
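A monitoring system can check sensor readings against this envelope with a one-line predicate. A sketch using the bounds quoted above (the sample readings echo the 72°–74° "green" aisle and the 88° room from slide 11):

```python
# ASHRAE TC 9.9 (2011) envelope from the slide: 64.4-80.6 F, 20-80% RH.
def in_ashrae_envelope(temp_f, rh_pct):
    return 64.4 <= temp_f <= 80.6 and 20 <= rh_pct <= 80

print(in_ashrae_envelope(72.0, 45))  # True  -- within the recommended range
print(in_ashrae_envelope(88.0, 45))  # False -- the overheated legacy room
```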
33. COOLING, IN THEORY
34. COOLING, IN REALITY (figure: 88°, 30°, 58° temperature readings)
35. COOLING GREEN (figure: 74°, 72° temperature readings)
36. (image slide)
37. DATACENTER GOALS
    - Ensure scalability
    - Design for disasters
    - Monitor systems
    - Secure the area
    - Document EVERYTHING
    - Energy-efficient 220 V power
    - Cool the equipment, not the room!
44. Christopher Duffy (cduffy@peirce.edu)
