2007 Data Center Cooling Study: Comparing Conventional Raised Floors with Close Coupled Cooling Technology
November 14, 2007
Presented By:
Speakers and Sponsor
Daniel Kennedy, Solution Center Coordinator for Rittal Corp.
Degreed electrical engineer with a concentration in power systems
Over 4 years of experience with the design and implementation of data center facilities
Project engineer for Rittal's close-coupled liquid cooling systems in North America
Jennifer Osborn, Data Center Infrastructure Consultant for DirectNET
5 years of experience in data center planning and management
Professional portfolio includes over 250 clients, including Fortune 100 organizations
Successfully managed over 25 enterprise implementations, including deployment, training, and support
Agenda
Power: The Change Agent of Today's Data Center
A Review of Your Cooling Options
An In-Depth Assessment of Close Coupled Cooling: 2007 Data Center Cooling Study
7 Things to Consider before Making a Cooling Investment
Q&A
Data Center Development Trends (Today → Future)
Trend drivers: automation, asset management, energy efficiency, virtualization, physical/logical security, adaptive infrastructure, remote management, fuel cell technology, wireless sensors, cooling/power; a shift from components to solutions
Today: cost-intensive infrastructure, non-adaptive, high energy consumption, people-intensive, island character, locally managed, lights-on, wired
Future: cost-effective infrastructure, adaptive, low energy consumption, automated, shared IT, remotely managed, lights-off, wireless
Inside the Data Center
Floor systems
Cable plant: structured wiring, cable routing, ladder rack, cable tray
Climate control: CRAC, CRAH
Security: access control, video & biometrics
Power: direct, UPS, EDGs
Outside service: OC-XXX, telco
Enclosures
The Issue
Optimized air circulation cannot be achieved without significant facility changes.
The high costs of energy and climate management increase TCO.
New methods and technologies must be defined to meet the increased climate control requirements.
Cooling Trends: Where Is It Headed?
Space Comparison: White Space vs. Mechanical Space
Floor area required in the data center
Rising load density of IT areas
Solutions needed for high-density areas: cooling solutions with air/water heat exchangers
Today: mechanical space 40%, white space 60%
Tomorrow: mechanical space 60%, white space 40%
Source: internal estimation, Rittal 2007
Cooling Trends: The Way Forward
Cooling Options: A Tutorial
(Chart: cooling options positioned along a 0–40 kW per-rack density scale: air cooled, rear door, active air, supplemental in-row solutions, close coupled cooling, chip + enclosure cooling)
Cooling Options: Air Cooled
Ways to cool – where they are used
Air-cooled cabinet in a traditional hot aisle / cold aisle arrangement
(Density scale graphic, 0–40 kW)
Cooling Options: Rear Door
Ways to cool – where they are used
Rear door units are a lower-capacity supplemental cooling system
They allow the user to bring air return temperatures from dense cabinets down to inlet temperatures (70°F)
Can be deployed alongside CRAC units to bring hot aisle temperatures down for struggling units
(Density scale graphic, 0–40 kW)
Cooling Options: Active Air
Ways to cool – where they are used
Active air solutions rely on existing cooling capacity to handle the load, but return it to the CRAC units more effectively
They provide no actual cooling capacity and utilize the drop ceiling, if present, as a return path to the CRAC unit
(Density scale graphic, 0–40 kW)
Cooling Options: Supplemental In Row
Ways to cool – where they are used
Supplemental in-row / above-row solutions provide cold air and heat removal in a typical hot aisle / cold aisle arrangement
They supplement the CRAC unit much like rear door units, typically, but not always, at greater capacity
(Density scale graphic, 0–40 kW)
Cooling Options: Close Coupled Cooling
Ways to cool – where they are used
Close coupled cooling (closed loop) provides cooling regardless of room conditions
It performs all the functions of the CRAC unit, but brings the cooling directly to the cabinet itself
Creates a microclimate that exists only inside the rack
(Density scale graphic, 0–40 kW)
Cooling Options: Chip-Level
Ways to cool – where they are used
Chip + enclosure cooling is deployed in ultra-dense environments using customized servers
Typically not for commercial deployment; aimed at research environments that can benefit from ultra-dense clusters
(Density scale graphic, 0–40 kW)
Cooling Options: Visuals
Open system (supplemental cooling)
Closed system (less than 10% of heat released to the room)
CPU (chip) cooling
Combined solution
Determining an Approach
Hot/cold aisle configuration: is it possible in your data center? If not, rear door solutions may be ideal.
What is the current density of your rack?
< 15 kW – supplemental system: supplemental systems can provide cooling where needed, on a hotspot basis
> 15 kW – close coupled system: can remove all heat being produced by the rack, leaving nothing to chance in the ambient environment
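As a rough illustration of the density guidance on this slide, here is a minimal sketch in Python (the 15 kW threshold and the option names come from the slide; the function and its arguments are hypothetical and not a sizing tool):

```python
def recommend_cooling(rack_kw, hot_cold_aisle_possible=True):
    """First-pass suggestion using the 15 kW rule of thumb from the slide."""
    if rack_kw > 15:
        # Dense racks: remove all heat at the rack, independent of room conditions.
        return "close coupled (closed loop) cooling"
    if not hot_cold_aisle_possible:
        # No hot/cold aisle layout available: rear door units may be the better fit.
        return "rear door supplemental cooling"
    # Lower densities: treat hotspots with supplemental systems.
    return "supplemental (in-row / rear door) cooling"

for kw in (4, 12, 20, 28):
    print(f"{kw} kW per rack -> {recommend_cooling(kw)}")
```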
The Concept: LCP Plus  Touch screen
Differentiating Close Coupled Cooling
(Diagram: LCP + server cabinet = high-performance solution; top view showing the heat exchanger, HEX)
Differentiating Close Coupled Cooling
(Diagram: top view showing HEX and fan, alternative 2)
Cooling: Redundancy Considerations
(Diagram: multiple LCP units serving enclosures M1–M3 to provide redundancy)
Cooling: LCP Extend  For pre-deployed racks
Cooling: Supplemental Solutions
(Diagram: supplemental units serving two cold aisles)
TCO: Real Estate Cost Analysis
Real estate savings: $129,617 / $195,946
TCO: Real Estate Cost Analysis
Common today: 40 kW spread across 10 rack spaces at roughly 4 kW each
With LCP: 40 kW available in 3 rack spaces (two 20 kW racks plus LCP)
With LCP: 40+ kW available in 2 rack spaces (30 – 40 kW racks plus LCPs)
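A back-of-envelope sketch of the rack-space arithmetic behind this slide (the 4 kW, 20 kW, and 40 kW figures come from the slide; the assumption that the LCP occupies roughly one additional rack space is ours):

```python
# Rack spaces needed to host a 40 kW IT load at different per-rack densities.
load_kw = 40

traditional_spaces = load_kw / 4       # ~4 kW per air-cooled rack -> 10 rack spaces
lcp_racks = load_kw / 20               # ~20 kW per close coupled rack -> 2 racks
lcp_spaces = lcp_racks + 1             # plus roughly one rack space for the LCP units

print(f"Traditional air cooling: {traditional_spaces:.0f} rack spaces")
print(f"Close coupled (LCP):     {lcp_spaces:.0f} rack spaces")
```

Run as written, this reproduces the 10-space versus 3-space comparison shown above.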
TCO: Energy Cost Analysis
Energy savings:
Excellent – $69K
Average – $128K
Poor – $302K
Cooling: Total Cost of Ownership
Allowing for warmer water temperatures while still maintaining high-density loads can save a great deal of money, even with conventional chillers.
Depending on the operational condition of the chiller plant, the energy savings figures shown could be realized annually on an installation of fewer than 70 close coupled cabinets!
Cooling: Total Cost of Ownership
Real estate savings: the ability to cool dense loads allows server consolidation into a single rack, or adoption of denser server technologies such as blades.
Running costs: close coupled cooling allows for multiple energy savings:
Lower fan energy costs
Lower lighting costs for a smaller, denser data center
Lower chilled water plant costs (evaporative and dry free coolers)
Cooling: Total Cost of Ownership
(Diagram, chiller options: free cooler, low-noise chiller, pumps, heat recovery, pump station, buffer tank, emergency water, building air conditioning)
Evaporative or Dry Coolers vs. Chiller Plant
Typical chiller plants provide chilled water at <45°F, which is then passed to the CRAC units in the data center.
Raising this water temperature lowers operating cost, but the cooling solution must be able to handle the warmer water while providing the same amount of cooling.
Close coupled cooling allows for high-density installations while using water as warm as 70°F!
This warmer water temperature allows more hours each year in which evaporative or dry air-side economizers / coolers can be used, lowering operating cost through reduced electricity usage.
Chiller Example – California
With a load of 28 kW per cabinet and an ASHRAE allowable intake temperature to the servers of 77°F, we can use water as warm as 70°F, which can be realized whenever the wet bulb temperature is below 63°F. In Oakland this is the case over 97% of the year!
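A minimal sketch of how a figure like "over 97% of the year" could be estimated from hourly weather data (the 63°F wet bulb threshold comes from the slide; the file name and column name are hypothetical placeholders for whatever hourly dataset, e.g. TMY data, is actually used):

```python
import csv

WET_BULB_LIMIT_F = 63.0  # per the slide: 70°F water is achievable below this wet bulb temperature

total_hours = 0
economizer_hours = 0

# Hypothetical hourly weather file with a wet-bulb column in degrees Fahrenheit.
with open("oakland_hourly_weather.csv", newline="") as f:
    for row in csv.DictReader(f):
        total_hours += 1
        if float(row["wet_bulb_f"]) < WET_BULB_LIMIT_F:
            economizer_hours += 1

print(f"Hours suitable for free cooling: {100 * economizer_hours / total_hours:.1f}% of the year")
```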
Chiller Example: Major Metropolitan Areas
Comparison with Traditional Methods – PUE
Typical data centers range from a PUE of 1.6 to 3.0; the lower the number, the better.
PUE = Total input power to the data center / Total IT load
Example: total facility input power, including chillers, UPS units, etc. ≈ 216 kW; total IT load = 108 kW; PUE = 216 kW / 108 kW = 2.0
Even small LCP+ systems, without economizers, result in a PUE of 1.54!
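A tiny sketch of the PUE arithmetic shown above (the 216 kW, 108 kW, and 1.54 values come from the slide; the function name is ours):

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility input power divided by IT load."""
    return total_facility_kw / it_load_kw

print(pue(216, 108))         # 2.0, the example on the slide
print(round(108 * 1.54, 1))  # ~166.3 kW total input power implied by a 1.54 PUE at the same IT load
```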
The Green Impact
The EPA estimates that at the current data center growth rate we will need 10 new power plants before 2012 just to support IT growth! http://www.energystar.gov/index.cfm?c=prod_development.server_efficiency_study
Close coupled cooling results in less energy lost moving air through the data center, saving fan energy, and reduces the cost of producing chilled water. This can yield a significant reduction in the energy used to cool the IT load, which is where the biggest impact can be seen today.
A Look Back… (images: now vs. then)
Drawbacks of Close Coupled
If you are using water-cooled CRAH units, the water is already there! The technology has been around for more than 50 years.
As densities rose in the 70s with TTL logic, water cooling was required; with the advent of CMOS it went away, but our densities are back to the point where water is needed.
Bringing the cooling to the rack offers major advantages!
Mainframes in those data centers were cooled with water, delivered directly to the chips!
7 Top Cooling Considerations
Focus on complete data center cooling designs
Provide complete cooling redundancy
Include a comprehensive monitoring and alarm system at all component levels
Understand the complete facility: hot/cold aisle (common aisle for exhaust), placement of vented or cutout floor tiles, air flow paths
Consider alternate cabinet configurations: supplemental cooling systems, close coupled cooling (an energy-efficient, cost-effective approach)
Group common products together
Develop component installation standards
Plan for new cooling solutions
 
Q&A  To Receive a Copy of Today’s Presentation:  [email_address]



Editor's Notes

• #2 Ladies and Gentlemen: Thanks for standing by and welcome to today's session in the DirectNET Web Seminar Series. Today's presentation is entitled: "2007 Data Center Cooling Study: Comparing Conventional Raised Floors with Liquid Packages." During the presentation, all participants will be in a listen-only mode. However, we encourage your questions or comments at any time through the "chat" feature located at the lower left of your screen. These questions will be addressed as time allows. As a reminder, this Web Seminar is being recorded today, November 14th, 2007, and a recording will be sent to all attendees within 48 hours.
• #3 <Jen> Before we get started today, I'd like to introduce our speaker: Joining us from Rittal Corporation is Daniel Kennedy, Rimatrix Solution Center Coordinator (IT Group). A degreed electrical engineer with a concentration in power systems, Daniel brings engineering support to the IT market, having been involved in all aspects of system design, installation, and support of high density IT deployments. With Rittal for almost 4 years, Daniel brings fresh ideas for the design and implementation of data center facilities. He has provided training to end users, installation contractors, and design consultants in conferences, classrooms, and in the field. Daniel has shared this knowledge with many different clients in financial services, Fortune 500 companies, and manufacturing organizations, as well as educational institutions and government agencies. Daniel has been the lead project engineer supporting Rittal's efforts in bringing close-coupled liquid cooling systems to North America. Moderating today's conference is Jennifer Osborn, Data Center Infrastructure Consultant for DirectNET. In her 5 years at DirectNET, Jennifer has focused on server management and data center management solution design and implementation, including consultative needs analysis, project engineering, and support for enterprise applications. In that time she has attained several certifications from some of the largest vendors within the industry. Jennifer obtained her Bachelor of Science in Interior Design from Michigan State University. Prior to joining the team at DirectNET, Inc., Jennifer had an initial career focus in the commercial interior design industry with an emphasis on IT spatial analysis and data center design. This developed into project management in the architectural and design industry. <LISA> Jennifer, I'll turn the conference over to you.
• #4 <Jen> Thanks Lisa. Before we get started today, let's quickly go over today's agenda. First, to set the stage for our discussion around liquid cooling, we're going to review how power is changing the landscape of our data centers. As part of that introduction we'll share a few very disturbing statistics, which, unfortunately, AREN'T just stats for many of you joining today's discussion. But, as we all know, the best way to counter the effects of increasing power and heat load is to employ a bullet-proof cooling solution. But with so many cooling options out there, how do you know which is right for your organization? Well, we're going to walk you through a quick tutorial of your options and discuss the pros and cons of each and where each is best applied, based on density loads. After that tutorial of all the solutions available, we're going to dive down into liquid cooling – a solution that has a lot of buzz right now but just as many questions. What exactly is the architecture? How does it compare to conventional cooling solutions? What cost savings can you expect? We'll answer all of those questions. And as you'll learn, liquid cooling definitely changes the landscape by which you've conventionally measured cost and savings. We'll also give a brief introduction to Rittal's Liquid Cooling Package. Finally, we'll go over 7 considerations each data center should make before investing in a cooling solution. As a reminder, everyone attending today's conference will receive a copy of the Liquid Cooling Study – sections of which we'll discuss today. TRANSITION: <Jen> Daniel, before we start talking about liquid cooling, let's start off by acknowledging some of the very extensive changes that are taking place in the data center environment. Specifically, we're hearing a lot of buzz around the word integration, integration, integration. What exactly does that mean and why is it critical to today's data center?
• #5 <Daniel>….. <Jen> Then, when you talk about data center integration you're talking about the integration of several components – systems, infrastructure, cross-functional teams. There's a people, process, and technology component, correct? <Daniel>…. NO TRANSITION
• #6 <Daniel>…. When you think about the data center ecosystem, there are many parts that have to work together, including…… TRANSITION <Jen> Yes, I would say that this layout is very typical for many of the data centers that I encounter. Which, conventionally speaking, has worked very well. But, more and more, I'm finding that these data centers are having problems with climate control.
• #7 <Daniel> Exactly, Jen. …. TRANSITION: <Daniel> Let me share a few interesting, if not concerning, stats around the issue of climate control.
• #8 <Daniel>…….. In its simplest form, what's happening is that servers are denser and drawing more power than ever before. And that trend is growing. NO TRANSITION
• #9 <Daniel>: Likewise, we're seeing a significant increase in the amount of mechanical space. <Jen>: This is an extremely important fact, Daniel. Basically, what you're seeing is that more space has to be allocated to the equipment that supports mission-critical applications – meaning you have to cut back on the IT hardware that can actually DRIVE business, right? <Daniel>…. TRANSITION <Jen>: Those are some rather grim statistics, Daniel – but I wouldn't be surprised if many of the folks joining us today have already felt the pressure of these changes. So, how do they remedy the situation?
• #11 <Daniel>: There are several options that can be employed to manage your cooling situation. Because there's not really a one-size-fits-all approach, I think we should spend a little time going through each of these.
• #18 <Daniel>:… <Jen>: Those are a lot of options, Daniel. If we have someone attending today who needs to know the best approach, how do you recommend they determine the solution that best meets their environment?
• #19 <Daniel>:… <Jen>: Those are a lot of options, Daniel. If we have someone attending today who needs to know the best approach, how do you recommend they determine the solution that best meets their environment? TRANSITION <Jen>: Because close coupled is probably a new term for folks in our audience, can you show us what you mean by that?
• #20 <DANIEL>: "Imagine if we only had to cool the rack, and not the entire room – that's close coupled cooling."
• #22 TRANSITION <Jen>: It's clear how these compare to CRAC units – based on the design – and how they can help limit the actual real estate needed. But one of the questions I frequently get about CRAC units is the issue of redundancy – so, what happens if the unit fails?
• #23 <DANIEL> "Failure depends on how you design it."
• #24 <Daniel>:…. TRANSITION <Jen>: This all makes sense, but I know the question on everyone's mind is cost. What does a solution like this cost?
• #25 TRANSITION <Jen>: This all makes sense. And I think we would all agree that cooling at the server level is far better than cooling at the room level. But I know the question on everyone's mind is cost. What does a solution like this cost?
• #27 TRANSITION <Jen>: The real estate savings make intuitive sense – if you're cooling at the rack level. But what about energy? Energy cost is probably the largest cost driver for conventional cooling. Is this cost lowered with close coupled cooling, or should we just expect the same types of spending?
• #30 <Daniel> In summary, we are talking about two distinct areas of savings – physical real estate and energy. <Daniel> But, Jen, these are just some of the expected cost savings areas. Close coupled cooling also provides a very interesting savings opportunity: FREE energy. TRANSITION <Jen> FREE energy? I've never heard of such a thing. I think you're going to have to show us what you mean by that one!
• #33 TRANSITION <Jen> That sounds great for California, but many of my customers are in the southeastern United States. What results can they expect?
• #34 TRANSITION: <Jen> So, close coupled cooling offers not only savings in real estate and energy but, depending on outdoor temperatures, also an opportunity for free cooling. I think the next logical question for some folks joining us today is how to put this in terms our audience will understand. What I mean by that is: how do they know how energy efficient their data center currently is?
• #35 TRANSITION <Jen>: For many of my customers, getting below 2.0 would be wonderful. If this is a way to get there, and at a cost savings, it seems like a very viable solution. But these customers are also telling me that any new solution needs to be environmentally friendly.
• #36 TRANSITION <Jen>: Another item of concern is the 'newness' of the technology. I think many folks may be apprehensive about being the first to adopt a new technology, particularly one that requires putting water so close to their servers.
• #37 TRANSITION <Jen>: That's a really important point. Liquid cooling isn't really the 'uncharted territory' we may think it is. You've overcome some misconceptions, but with any solution there have to be drawbacks. What do you consider some of the drawbacks of close coupled cooling?
• #38 TRANSITION <Jen> I think you've given us some great information around close coupled cooling – obtaining free cooling, savings areas we may not have originally considered, and the fact that this is NOT a new technology. For those listening today who are evaluating solutions, what are some of the things you would recommend considering before purchasing a solution?