Data Center Cooling Study on Liquid Cooling



One of our most popular webinar presentations on data center cooling: 2007 Data Center Cooling Study: Comparing Conventional Raised Floors with Close Coupled Cooling Technology.

If you're looking for a solution, it's simple physics: water is roughly 3,500 times more effective than air at carrying away heat, volume for volume. But liquid cooling carries a stigma, largely because of its price tag. And if you're like other data center managers, the words of Jerry Maguire may be ringing in your head: "Show me the money!"
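A quick back-of-the-envelope check of that 3,500x figure, which lines up with the ratio of the volumetric heat capacities of water and air. This is an illustrative sketch using round textbook property values, not numbers from the webinar itself:

```python
# Compare how much heat water and air can absorb per unit volume per degree:
# volumetric heat capacity = specific heat * density.

water_cp = 4186.0   # J/(kg*K), specific heat of liquid water
water_rho = 1000.0  # kg/m^3, density of water
air_cp = 1005.0     # J/(kg*K), specific heat of air near room temperature
air_rho = 1.2       # kg/m^3, density of air at sea level

ratio = (water_cp * water_rho) / (air_cp * air_rho)
print(f"Water stores ~{ratio:,.0f}x more heat per unit volume than air")
# ~3,471x with these round values, close to the article's 3,500 figure
```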

To view the recorded webinar presentation, please visit

  • Ladies and gentlemen: Thanks for standing by, and welcome to today's session in the DirectNET Web Seminar Series. Today's presentation is entitled "2007 Data Center Cooling Study: Comparing Conventional Raised Floors with Close Coupled Cooling Technology." During the presentation, all participants will be in a listen-only mode. However, we encourage your questions or comments at any time through the "chat" feature located at the lower left of your screen. These questions will be addressed as time allows. As a reminder, this Web Seminar is being recorded today, November 14th, 2007, and a recording will be sent to all attendees within 48 hours.
  • <Jen> Before we get started today, I'd like to introduce our speaker. Joining us from Rittal Corporation is Daniel Kennedy, Rimatrix Solution Center Coordinator (IT Group). A degreed electrical engineer with a concentration in power systems, Daniel brings engineering support to the IT market, having been involved in all aspects of system design, installation, and support of high-density IT deployments. With Rittal for almost 4 years, Daniel brings fresh ideas for the design and implementation of data center facilities. He has provided training to end users, installation contractors, and design consultants, in conferences, classrooms, and in the field. Daniel has shared this knowledge with many different clients in financial services, Fortune 500 companies, and manufacturing organizations, as well as educational institutions and government agencies. Daniel has been the lead project engineer supporting Rittal's efforts in bringing close coupled liquid cooling systems to North America. Moderating today's conference is Jennifer Osborn, Data Center Infrastructure Consultant for DirectNET. In her 5 years at DirectNET, Jennifer has focused on server management and data center management solution design and implementation, including consultative needs analysis, project engineering, and support for enterprise applications. In that time she has attained several certifications from some of the largest vendors within the industry. Jennifer obtained her Bachelor of Science in Interior Design from Michigan State University. Prior to joining the team at DirectNET, Inc., Jennifer's initial career focus was in the commercial interior design industry, with an emphasis on IT spatial analysis and data center design. This developed into project management in the architecture and design industry. <Lisa> Jennifer, I'll turn the conference over to you.
  • <Jen> Thanks, Lisa. Before we get started today, let's quickly go over today's agenda. First, to set the stage for our discussion around liquid cooling, we're going to review how power is changing the landscape of our data centers. As part of that introduction we'll share a few very disturbing statistics, which, unfortunately, AREN'T just stats for many of you joining today's discussion. But, as we all know, the best way to counter the effects of an increasing power and heat load is to employ a bulletproof cooling solution. With so many cooling options out there, how do you know which is right for your organization? Well, we're going to walk you through a quick tutorial of your options and discuss the pros and cons of each, and where each is best applied based on density loads. After that tutorial of all the solutions available, we're going to dive into liquid cooling – a solution that has a lot of buzz right now but just as many questions. What exactly is the architecture? How does it compare to conventional cooling solutions? What cost savings can you expect? We'll answer all of those questions. And as you'll learn, liquid cooling definitely changes the landscape by which you've conventionally measured cost and savings. We'll also give a brief introduction to Rittal's Liquid Cooling Package. Finally, we'll go over 7 considerations each data center should make before investing in a cooling solution. As a reminder, everyone attending today's conference will receive a copy of the Liquid Cooling Study – sections of which we'll discuss today. TRANSITION: <Jen> Daniel, before we start talking about liquid cooling, let's start off by acknowledging some of the very extensive changes that are taking place in the data center environment. Specifically, we're hearing a lot of buzz around the word integration, integration, integration. What exactly does that mean and why is it critical to today's data center?
  • <Daniel>….. <Jen> So when you talk about data center integration, you're talking about the integration of several components – systems, infrastructure, cross-functional teams. There's a people, process, and technology component, correct? <Daniel>…. NO TRANSITION
  • <Daniel>…. When you think about the data center ecosystem, there are many parts that have to work together, including:…… TRANSITION <Jen> Yes, I would say that this layout is very typical of many of the data centers that I encounter. Conventionally speaking, it has worked very well. But, more and more, I'm finding that these data centers are having problems with climate control.
  • <Daniel> Exactly, Jen. …. TRANSITION: <Daniel> Let me share a few interesting, if not concerning, stats around the issue of climate control.
  • <Daniel>…… In its simplest form, what's happening is that servers are denser and drawing more power than ever before. And that trend is growing. NO TRANSITION
  • <Daniel>: Likewise, we're seeing a significant increase in the amount of mechanical space. <Jen>: This is an extremely important fact, Daniel. Basically, what you're seeing is that more space has to be allocated to the equipment that runs mission critical applications – meaning you have to cut back on the IT hardware that can actually DRIVE business, right? <Daniel>…. TRANSITION <Jen>: Those are some rather grim statistics, Daniel – but I wouldn't be surprised if many of the folks joining us today haven't already felt the pressure of these changes. So, how do they remedy the situation?
  • <Daniel>: There are several options that can be employed to manage your cooling situation. Because there's not really a one-size-fits-all approach, I think we should spend a little time going through each of these.
  • <Daniel>:… <Jen>: Those are a lot of options, Daniel. If we have someone attending today who needs to know the best approach, how do you recommend they determine the solution that best fits their environment?
  • TRANSITION <Jen>: Because close coupled cooling is probably a new term for folks in our audience, can you show us what you mean by that?
  • <Daniel>: "Imagine if we only had to cool the rack, and not the entire room – that's close coupled cooling."
  • TRANSITION <Jen>: It's clear how these compare to CRAC units, based on the design, and how they can help limit the actual real estate needed. But one of the questions I frequently get about CRAC units is the issue of redundancy – so, what happens if the unit fails?
  • <DANIEL> “Failure depends on how you design it.”
  • <Daniel>:…. TRANSITION <Jen>: This all makes sense, but I know the question on everyone's mind is cost. What does a solution like this cost?
  • TRANSITION <Jen>: This all makes sense. And I think we would all agree that cooling at the server level is far better than cooling at the room level. But I know the question on everyone's mind is cost. What does a solution like this cost?
  • TRANSITION <Jen>: The real estate savings make intuitive sense if you're cooling at the rack level. But what about energy? Energy cost is probably the largest cost driver for conventional cooling. Is this cost lowered with close coupled cooling, or should we just expect the same types of spending?
  • <Daniel> In summary, we are talking about two distinct areas of savings – physical real estate and energy. <Daniel> But, Jen, these are just some of the expected cost savings areas. Close coupled cooling also provides a very interesting savings opportunity: FREE energy. TRANSITION <Jen> FREE energy? I've never heard of such a thing. I think you're going to have to show us what you mean by that one!
  • TRANSITION <Jen> That sounds great for California, but many of my customers are in the southeastern United States. What results can they expect?
  • TRANSITION: <Jen> So, close coupled cooling offers not only savings in real estate and energy, but, based on outdoor temperatures, there's also an opportunity for free cooling. Before the next logical question, I think it may be worthwhile to put this in terms that our audience will understand. What I mean by that is: how do they know how energy efficient their data center currently is?
  • TRANSITION <Jen>: For many of my customers, getting below 2.0 would be wonderful. If this is a way to get there, and at a cost savings, it seems like a very viable solution. But these customers are also telling me that any new solution needs to be environmentally friendly.
  • TRANSITION <Jen>: Another item of concern is the ‘newness’ of the technology. I think many folks may be apprehensive about being the first to adopt a new technology, particularly one that requires putting water so close to their servers.
  • TRANSITION <Jen>: That's a really important point. Liquid cooling isn't really the 'uncharted territory' that we may think it is. You've overcome some misconceptions, but with any solution there have to be drawbacks. What do you consider some of the drawbacks to close coupled cooling?
  • TRANSITION <Jen> I think you've given us some great information around close coupled cooling – obtaining free cooling, savings areas that we may not have originally considered, and the fact that this is NOT a new technology. For those listening today who are evaluating solutions, what are some of the things you would recommend we consider before purchasing a solution?

    1. 1. 2007 Data Center Cooling Study: Comparing Conventional Raised Floors with Close Coupled Cooling Technology November 14, 2007 Presented By:
    2. 2. Speakers and Sponsor <ul><li>Daniel Kennedy, Solution Center Coordinator for Rittal Corp. </li></ul><ul><ul><li>Degreed Electrical Engineer with a concentration in power systems </li></ul></ul><ul><ul><li>Over 4 Years Experience with Design and Implementation of data center facilities </li></ul></ul><ul><ul><li>Project Engineer for Rittal’s close coupled liquid cooling systems in North America </li></ul></ul><ul><li>Jennifer Osborn, Data Center Infrastructure Consultant for DirectNET </li></ul><ul><ul><li>5 Years Experience in data center planning and management </li></ul></ul><ul><ul><li>Professional portfolio includes over 250 clients, including Fortune 100 organizations </li></ul></ul><ul><ul><li>Successfully managed over 25 enterprise implementations, including deployment, training, and support </li></ul></ul>
    3. 3. Agenda <ul><li>Power: The Change Agent of Today’s Data Center </li></ul><ul><li>A Review of Your Cooling Options </li></ul><ul><li>An In-Depth Assessment of Close Coupled Cooling: 2007 Data Center Cooling Study </li></ul><ul><li>7 Things to Consider before Making a Cooling Investment </li></ul><ul><li>Q&A </li></ul>
    4. 4. Data Center Development Trends Today Future Automation Asset Management Energy Efficiency Virtualisation Physical / Logical Security Adaptive Infrastructure Remote Management Fuel Cell Technology Wireless Sensors Cooling / Power Components Solutions <ul><li>Cost intensive </li></ul><ul><li>Infrastructure: </li></ul><ul><li>Non-adaptive </li></ul><ul><li>High energy consumption - People-intensive </li></ul><ul><li>Island character </li></ul><ul><li>Locally managed - Lights-on </li></ul><ul><li>Wired </li></ul>Components Solutions <ul><li>Cost-effective </li></ul><ul><li>Infrastructure: </li></ul><ul><li>Adaptive </li></ul><ul><li>Low energy consumption </li></ul><ul><li>Automated </li></ul><ul><li>Shared IT </li></ul><ul><li>Remotely managed - Lights-off - Wireless </li></ul>
    5. 5. Inside the Data Center <ul><li>Floor Systems </li></ul><ul><li>Cable Plant </li></ul><ul><ul><li>Structured Wiring </li></ul></ul><ul><li>Cable Routing </li></ul><ul><ul><li>Ladder Rack </li></ul></ul><ul><ul><li>Cable Tray </li></ul></ul><ul><li>Climate Control </li></ul><ul><ul><li>CRAC, CRAH </li></ul></ul><ul><li>Security </li></ul><ul><ul><li>Access Control </li></ul></ul><ul><ul><li>Video & Biometrics </li></ul></ul><ul><li>Power </li></ul><ul><ul><li>Direct, UPS, EDG‘s </li></ul></ul><ul><li>Outside Service </li></ul><ul><ul><li>OC-XXX, Telco </li></ul></ul><ul><li>Enclosures </li></ul>
    6. 6. The Issue <ul><li>Optimized air circulation cannot be achieved without significant facility changes </li></ul><ul><li>The high costs for energy and climate management increase TCO. </li></ul><ul><li>New methods and technologies must be defined for the increased climate control requirements. </li></ul>
    7. 7. Cooling Trends: Where is it Headed?
    8. 8. Space Comparison: White Space vs. Mechanical Space Floor Area Required in Data Center <ul><li>Rising load density of IT areas </li></ul><ul><li>Solutions for High Density Areas needed </li></ul><ul><li>Cooling solutions with air/water heat exchangers </li></ul>TODAY TOMORROW Mechanical Space 40% White Space 60% Mechanical Space 60% White Space 40% Source: Internal estimation Rittal 2007
    9. 9. Cooling Trends: The Way Forward
    10. 10. Cooling Options: A Tutorial (chart: solutions mapped across a 0–40 kW density scale: Air Cooled, Rear Door, Active Air, Supplemental In Row Solutions, Close Coupled Cooling, Chip + Enclosure Cooling)
    11. 11. Cooling Options: Air Cooled Ways to Cool – Where they are Used <ul><li>Air-cooled cabinet in a traditional hot aisle / cold aisle arrangement </li></ul>
    12. 12. Cooling Options: Rear Door Ways to Cool – Where they are Used <ul><li>Rear Door type units are a lower-capacity supplemental cooling system </li></ul><ul><ul><li>Allow users to bring air return temperatures from dense cabinets down to inlet temperatures (70F) </li></ul></ul><ul><ul><li>Can be deployed along with CRAC units to bring the hot aisle temperatures down on struggling units </li></ul></ul>
    13. 13. Cooling Options: Active Air Ways to Cool – Where they are Used <ul><li>Active Air solutions </li></ul><ul><ul><li>Rely on existing cooling capacity to handle the load, but more effectively return air to the CRAC units </li></ul></ul><ul><ul><li>Provide no actual cooling capacity; utilize the drop ceiling, if present, as a return path to the CRAC units </li></ul></ul>
    14. 14. Cooling Options: Supplemental In Row Ways to Cool – Where they are Used <ul><li>Supplemental In Row / Above Row Solutions </li></ul><ul><ul><li>Provide cold air and heat removal from a typical hot aisle / cold aisle arrangement </li></ul></ul><ul><ul><li>Supplement the CRAC units much like rear door units, typically, but not always, of greater capacity </li></ul></ul>
    15. 15. Cooling Options: Close Coupled Cooling Ways to Cool – Where they are Used <ul><li>Close Coupled Cooling – Closed Loop </li></ul><ul><ul><li>Provides cooling regardless of room conditions </li></ul></ul><ul><ul><li>Performs all the functions of the CRAC unit, but brings the cooling directly to the cabinet itself </li></ul></ul><ul><ul><li>Creates a microclimate that exists only inside the rack </li></ul></ul>
    16. 16. Cooling Options: Chip-Level Ways to Cool – Where they are Used <ul><li>Chip + Enclosure cooling </li></ul><ul><ul><li>Deployed in ultra dense environments, using customized servers </li></ul></ul><ul><ul><ul><li>Typically not for commercial deployment </li></ul></ul></ul><ul><ul><ul><li>Aimed at research environments that can benefit from ultra dense clusters </li></ul></ul></ul>
    17. 17. Cooling Options: Visuals <ul><li>Open System </li></ul><ul><li>(supplemental cooling) </li></ul><ul><li>Closed System (less than 10% Heat to the room) </li></ul><ul><li>CPU (Chip) Cooling </li></ul><ul><li>Combined Solution </li></ul>
    18. 18. Determining an Approach <ul><li>Hot/Cold Aisle Configuration? </li></ul><ul><ul><li>Is it possible in your data center? </li></ul></ul><ul><ul><ul><li>If not, rear door solutions may be ideal </li></ul></ul></ul><ul><li>What is the Current Density of your Rack? </li></ul><ul><ul><li>< 15kW - Supplemental System </li></ul></ul><ul><ul><ul><li>Supplemental systems can provide cooling where needed on a hotspot basis </li></ul></ul></ul><ul><ul><li>> 15kW – Close Coupled System </li></ul></ul><ul><ul><ul><li>Can remove all heat being produced by the system, leaving nothing to chance in the ambient environment </li></ul></ul></ul>
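The decision rule on this slide can be sketched as a few lines of code. The 15 kW threshold and the rear-door fallback come from the slide itself; the function name and return labels are illustrative only, not a real product API:

```python
# Map a rack's density (and whether the room layout can support hot/cold
# aisles) to the cooling approach suggested on the slide.

def recommend_cooling(rack_kw: float, hot_cold_aisle_possible: bool = True) -> str:
    if not hot_cold_aisle_possible:
        return "rear door"      # layout can't change: supplement cooling in place
    if rack_kw < 15:
        return "supplemental"   # spot-cool hotspots alongside existing CRAC units
    return "close coupled"      # remove all heat at the rack itself

print(recommend_cooling(4))    # typical legacy rack
print(recommend_cooling(28))   # dense blade cabinet
```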
    19. 19. The Concept: LCP Plus Touch screen
    20. 20. Differentiating Close Coupled Cooling (diagram, top view: LCP + server cabinet = high-performance solution; HEX = heat exchanger)
    21. 21. Differentiating Close Coupled Cooling (diagram, top view: HEX fan alternative 2)
    22. 22. Cooling: Redundancy Considerations (diagram: LCP units, modules M1–M3)
    23. 23. Cooling: LCP Extend For pre-deployed racks
    24. 24. Cooling: Supplemental Solutions (diagram: cold aisle arrangement)
    25. 25. TCO: Real Estate Cost Analysis Real Estate Savings $129,617 $195,946
    26. 26. TCO: Real Estate Cost Analysis (diagram: ten rack spaces of 4 kW cabinets deliver 40 kW today; with LCP, 40 kW fits in 3 rack spaces at 20 kW per cabinet, and 40+ kW fits in 2 rack spaces at 30–40 kW per cabinet)
    27. 27. TCO: Energy Cost Analysis Energy Savings (by chiller plant condition): Excellent $69K, Average $128K, Poor $302K
    28. 28. Cooling: Total Cost of Ownership <ul><li>Allowing for warmer water temperatures, while still maintaining high density loads, could save a great deal of money using conventional chillers </li></ul><ul><li>Depending on the operational condition of the chiller plant, these numbers could be realized annually on an installation of fewer than 70 close coupled cabinets! </li></ul>
    29. 29. Cooling: Total Cost of Ownership <ul><li>Real Estate savings </li></ul><ul><ul><li>Having the ability to cool dense loads allows for server consolidation into a single rack, or adoption of denser server technologies such as blades </li></ul></ul><ul><li>Running costs </li></ul><ul><ul><li>Close coupled cooling allows for multiple savings in regards to energy </li></ul></ul><ul><ul><ul><li>Lower fan energy costs </li></ul></ul></ul><ul><ul><ul><li>Lower lighting cost for a smaller, denser data center </li></ul></ul></ul><ul><ul><ul><li>Lower chilled water plant costs </li></ul></ul></ul><ul><ul><ul><li>Evaporative and dry free coolers </li></ul></ul></ul>
    30. 30. Cooling: Total Cost of Ownership (diagram, chiller options: free cooler, low-noise chiller, pump, heat recovery, pump station, buffer tank, emergency water, building air conditioning)
    31. 31. Evaporative or Dry Coolers vs. Chiller Plant <ul><li>Typical chiller plants provide chilled water at <45F, which is then passed to the CRAC units in the data center </li></ul><ul><li>Raising this water temperature lowers operating cost, but the cooling solution must be able to handle the warmer water while providing the same amount of cooling </li></ul><ul><li>Close coupled cooling allows for high density installations while using water as warm as 70F! </li></ul><ul><li>This warm water temperature allows for more hours each year in which evaporative or dry air-side economizers / coolers can be used to lower operating cost through reduced electricity usage </li></ul>
    32. 32. Chiller Example – California <ul><li>With a load of 28kW per cabinet, and with an ASHRAE allowable intake temperature to the servers of 77F, we can use water as warm as 70F, which can be realized whenever the Wet Bulb temperature is below 63F. </li></ul><ul><li>In Oakland this is the case over 97% of the year! </li></ul>
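As a sketch of how a site could run this check for itself: given a year of hourly wet-bulb readings, count the hours below the 63F threshold the slide cites for 70F water. The sample data here is invented for demonstration; a real assessment would use measured weather data for the site in question:

```python
# Estimate the fraction of the year an evaporative free cooler could supply
# 70F water, i.e. hours with wet-bulb temperature below 63F (per the slide).

WET_BULB_LIMIT_F = 63.0

def free_cooling_fraction(hourly_wet_bulb_f):
    """Fraction of hours cool enough for free cooling."""
    ok_hours = sum(1 for t in hourly_wet_bulb_f if t < WET_BULB_LIMIT_F)
    return ok_hours / len(hourly_wet_bulb_f)

# Hypothetical mild-climate year (8760 hours): mostly cool, a few warm spells.
sample = [55.0] * 8500 + [68.0] * 260
print(f"Free cooling available {free_cooling_fraction(sample):.1%} of the year")
```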
    33. 33. Chiller Example: Major Metropolitan Areas
    34. 34. Comparison with Traditional Methods - PUE <ul><li>Typical data centers range from 3.0 to 1.6; the lower the number, the better </li></ul><ul><ul><li>PUE = (Total input Power to Data Center / Total IT Load) </li></ul></ul><ul><ul><ul><li>Example: </li></ul></ul></ul><ul><ul><ul><ul><li>Total Mechanical Load including chillers, UPS units, etc. ~ 216 kW </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Total IT Load at 108 kW </li></ul></ul></ul></ul><ul><ul><ul><ul><li>PUE = 216 kW / 108 kW = 2.0 PUE </li></ul></ul></ul></ul><ul><li>Even small LCP+ systems, without economizers, result in a PUE of 1.54! </li></ul>
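The slide's PUE formula is simple enough to verify directly. A minimal sketch using the slide's own example numbers (216 kW total input power, 108 kW IT load):

```python
# PUE = total power entering the data center / power delivered to the IT load.

def pue(total_input_kw: float, it_load_kw: float) -> float:
    return total_input_kw / it_load_kw

total_input_kw = 216.0  # IT load plus chillers, UPS losses, fans, etc.
it_load_kw = 108.0
print(pue(total_input_kw, it_load_kw))  # 2.0, matching the slide's example
```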
    35. 35. The Green Impact The EPA estimates that at the current data center growth rate, we will need 10 new power plants before 2012 just to support IT growth! The use of close coupled cooling results in less energy lost in moving air through the data center, saving fan energy, and reduces the cost to produce chilled water. This can result in a significant reduction in the energy used to cool the IT load – the place where the biggest impact can be seen today.
    36. 36. A Look Back… Now Then
    37. 37. Drawbacks of Close Coupled <ul><li>If you are using water-cooled CRAH units, then it's already there! </li></ul><ul><li>The technology has been around for more than 50 years. </li></ul><ul><li>As densities rose in the 70s with TTL logic, water was required; with the advent of CMOS it went away, but our densities are back where water is needed. </li></ul><ul><li>Bringing the cooling to the rack offers major advantages! </li></ul>Mainframes in these data centers were cooled via water, direct to the chips!
    38. 38. 7 Top Cooling Considerations <ul><li>Focus on complete data center cooling designs </li></ul><ul><li>Provide complete cooling redundancy </li></ul><ul><ul><li>Include a comprehensive monitoring and alarm system at all component levels </li></ul></ul><ul><li>Understand the complete facility </li></ul><ul><ul><li>Hot/cold aisle – common aisle for exhaust </li></ul></ul><ul><ul><li>Placement of vented or cutout floor tiles </li></ul></ul><ul><ul><li>Air flow paths </li></ul></ul><ul><li>Consider alternate cabinet configurations </li></ul><ul><ul><li>Supplemental cooling systems </li></ul></ul><ul><ul><li>Close coupled cooling </li></ul></ul><ul><ul><li>Energy-efficient, cost-effective approaches </li></ul></ul><ul><li>Group common products together </li></ul><ul><li>Develop component installation standards </li></ul><ul><li>Plan for new cooling solutions </li></ul>
    39. 40. Q&A To Receive a Copy of Today’s Presentation: [email_address]