IT@Intel Brief
Intel Information Technology
Computer Manufacturing
Energy Efficiency
August 2008

Reducing Data Center Cost with an Air Economizer

Profile: Air Economizer PoC
• 900 heavily utilized production servers in a high-density data center
• 100% air exchange at up to 90 degrees F, with no humidity control and minimal air filtration
• 67% estimated power savings using economizer 91% of the time—an estimated annual savings of approximately USD 2.87 million in a 10-MW data center

To challenge established industry assumptions regarding data center cooling, Intel IT conducted a proof of concept (PoC) test that used an air economizer to cool production servers with 100 percent outside air at temperatures of up to 90 degrees Fahrenheit (F). With this approach, we could use an economizer to provide nearly all data center cooling, substantially reducing power consumption. This could potentially reduce annual operating costs by up to USD 2.87 million for a 10-megawatt (MW) data center.

We ran the PoC in a dry, temperate climate over 10 months using about 900 production blade servers, divided equally between two side-by-side compartments, as shown in Figure 1. One used standard air conditioning, the other an air economizer.

Servers in the economizer compartment were subjected to considerable variation in temperature and humidity as well as poor air quality; however, there was no significant increase in server failures. If subsequent investigation confirms these promising results, we anticipate using this approach in future high-density data centers.


Figure 1. Proof of concept (PoC) data center environment. One compartment of server racks uses a traditional air conditioning unit (hot air is cooled and recirculated); the other uses an air economizer (hot air is flushed outside, and outside air is drawn in). Temperature and humidity sensors monitor the hot and cold air in each compartment.


Background
Data center power consumption is soaring, driven by increasing demand for computing capacity. In a typical data center, 60 to 70 percent of data center power may be used for facilities power and data center cooling.

At Intel, our data centers need to support the rapid growth in computing capacity required to design increasingly complex semiconductors. At the same time, we are trying to minimize data center power consumption and operating costs.

Our strategy is based on high-performance, high-density data centers containing thousands of blade servers. These blades deliver considerable computing capacity, but they also generate substantial heat. We supply cooling air to the blades at 68 degrees F; as the air passes over the blades, the air temperature rises by 58 degrees F, resulting in an exit temperature of 126 degrees F. This means that we need to cool the air by 58 degrees F before recirculating it. The air conditioning units required to do this consume a considerable amount of electricity.

Air economizers represent one potential way to reduce data center power consumption and cooling cost. Instead of cooling and recirculating the hot air from the servers, air economizers simply expel the hot air outdoors and draw in outside air to cool the IT equipment.

The current industry assumption is that the usefulness of air economizers is limited by the need to supply cooling air at a relatively low temperature. The implication is that air economizers can only be used at times when the outside air is relatively cool. There is also concern about variation in humidity, because the humidity of outside air can change rapidly. A third area of concern is particulate counts in outside air.

We decided to challenge these assumptions by employing an economizer to cool a high-density production environment using a much wider range of outside air temperatures—up to 90 degrees F. We reasoned that this might be feasible because server manufacturers specify that their products can operate in temperatures as high as 98 degrees F. We also wanted to push the accepted limits of humidity and air quality.

If we were successful, we would be able to use air economizers for most of the year in dry, temperate climates. This could drastically reduce power consumption and cooling costs while improving Intel’s environmental footprint.

Proof of Concept
We conducted a large PoC test using approximately 900 production design servers at a data center located in a temperate desert climate with generally low relative humidity. We began the PoC in October 2007, and continued the test for 10 months until August 2008.

We set up the PoC in a 1,000-square-foot (SF) trailer that was originally installed to provide temporary additional computing capacity and divided the trailer into two compartments of approximately 500 SF each. To minimize the cost of the PoC, we used low-cost, warehouse-grade direct expansion (DX) air conditioning equipment. Temperature and humidity sensors were installed to monitor the conditions in each compartment.

We cooled one compartment with a traditional approach, using a DX unit to recirculate hot air and provide cooling at all times.

For the other compartment, we used essentially the same air-conditioning equipment, but with modifications that enabled it to operate as an economizer by expelling hot air to the outdoors and drawing in 100 percent outside air for cooling.

Because one of our goals was to test the acceptable limits of operating temperature, we configured the cooling equipment in the economizer compartment to supply air at temperatures ranging from 65 degrees F to 90 degrees F. We designed the system to use only the economizer until the supply air exceeded the 90 degree F maximum, at which point we began using the chiller to cool the air to 90 degrees F. If the temperature dropped below 65 degrees F, we warmed the supply air by mixing it with hot return air from the servers.

We made no attempt to control humidity. We also wanted to test the limits of air quality, so we applied minimal filtering to incoming air, using a standard household air filter that removed only large particles from the incoming air but permitted fine dust to pass through.
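As an illustration only (not code from the PoC), the supply-air scheme described above can be sketched as a simple control loop. The setpoints are those stated in this brief; the function name and returned fields are hypothetical.

    # Sketch of the economizer-compartment supply-air control described above.
    # Setpoints come from the brief; the interface is a hypothetical placeholder.
    SUPPLY_MIN_F = 65.0   # below this, mix in hot return air from the servers
    SUPPLY_MAX_F = 90.0   # above this, trim with mechanical (DX) cooling to 90 degrees F

    def control_step(outside_air_f: float) -> dict:
        """Return one control decision for the economizer compartment."""
        if outside_air_f > SUPPLY_MAX_F:
            # Economizer alone is not enough: use mechanical cooling,
            # but only down to the 90 degree F ceiling, not to 68 degrees F.
            return {"economizer": True, "dx_cooling": True,
                    "mix_return_air": False, "supply_air_f": SUPPLY_MAX_F}
        if outside_air_f < SUPPLY_MIN_F:
            # Too cold: warm the supply air by mixing in hot return air.
            return {"economizer": True, "dx_cooling": False,
                    "mix_return_air": True, "supply_air_f": SUPPLY_MIN_F}
        # Within the 65 to 90 degree F band: 100 percent outside air, no DX
        # cooling, no humidity control, only coarse filtering upstream.
        return {"economizer": True, "dx_cooling": False,
                "mix_return_air": False, "supply_air_f": outside_air_f}

For example, control_step(95.0) calls for DX trim cooling down to 90 degrees F, while control_step(72.0) passes outside air straight through with no mechanical cooling.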
Each room contained eight racks. Each rack contained four blade server chassis with 14 blades each, for a total of 448 blades per compartment. This represented a power density of more than 200 watts per square foot (WPSF).
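As a rough consistency check (our derived estimate, not a measurement from the brief), the stated power density and the approximately 500 SF compartment size imply an IT load on the order of 100 kilowatts per compartment:

    # Rough consistency check using only the figures quoted above.
    compartment_area_sf = 500        # approximate compartment size, square feet
    power_density_wpsf = 200         # stated minimum power density, watts per square foot
    blades_per_compartment = 448     # 8 racks x 4 chassis x 14 blades

    it_load_w = compartment_area_sf * power_density_wpsf
    watts_per_blade = it_load_w / blades_per_compartment

    print(f"Implied IT load per compartment: ~{it_load_w / 1000:.0f} kW")
    print(f"Implied average draw per blade:  ~{watts_per_blade:.0f} W")
    # Roughly 100 kW per compartment and about 220 W per blade.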
During the PoC, we used the servers to run large production batch silicon design workloads, resulting in very high server utilization rates of about 90 percent.

We measured server failure rates in each compartment and compared them with the failure rates that we experienced during the same period within our main data center at the same location.

Results
During the PoC, the servers in the economizer compartment were subjected to wide variation in environmental conditions, as shown in Figure 2.

Figure 2. Temperature and humidity variation in the air economizer compartment from January through May 2008.

• The temperature of the supply air varied from 64 degrees F to more than 92 degrees F. This variation slightly exceeded our set points, partly due to the slow response of our low-cost air conditioning units.
• Humidity varied from 4 percent to more than 90 percent and changed rapidly at times.
• The servers and the interior of the compartment became covered in a layer of dust.

Table 1. Average Monthly High and Low Temperatures at the Proof of Concept (PoC) Test Location

Month        Average High (°F)   Average Low (°F)
January             48                 24
February            55                 28
March               62                 33
April               70                 41
May                 80                 49
June                90                 59
July                92                 65
August              88                 63
September           83                 56
October             71                 44
November            57                 32
December            48                 24

Power Consumption
Total power consumption of the trailer was approximately 500 kilowatts (KW) when using air conditioning in both compartments. When using the economizer, the DX cooling load in the economizer compartment was reduced from 111.78 KW to 28.6 KW, representing a 74 percent reduction in energy consumption.

Server Failure Rates
Despite the dust and variation in humidity and temperature, there was only a minimal difference between the 4.46 percent failure rate in the economizer compartment and the 3.83 percent failure rate in our main data center over the same period. The failure rate in the trailer compartment with DX cooling was 2.45 percent, actually lower than in the main data center.


Analysis
We estimated the average annual power savings we could achieve in
a data center that uses an economizer. To do this, we used historical
weather data for the data center location, summarized in Table 1. Analysis
of the data indicated that during an average year, the temperature is
below our 90 degrees F maximum 91 percent of the time.
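A minimal sketch of how such a fraction can be derived from hourly weather records follows; the file name and column name are placeholders, not the actual data set we used:

    # Hypothetical sketch: estimate how often outside air stays below the
    # 90 degree F ceiling, given a year of hourly temperature readings.
    # 'hourly_weather.csv' and its 'temp_f' column are placeholders.
    import pandas as pd

    hourly = pd.read_csv("hourly_weather.csv")          # one row per hour
    economizer_hours = (hourly["temp_f"] <= 90).sum()   # hours economizer alone suffices
    fraction = economizer_hours / len(hourly)

    print(f"Economizer-only operation possible {fraction:.0%} of the year")
    # The brief reports this fraction as roughly 91 percent for the PoC location.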

Based on our 74 percent measured decrease in power consumption
when using the economizer during the PoC, and assuming that we could
rely on the economizer 91 percent of the year, we could potentially save
approximately 67 percent of the total power used annually for cooling
compared with a traditional data center cooling approach. This translates
into approximately 3,500 kilowatt hours (KWH) per KW of overall data
center power consumption, based on an assumption that 60 percent of
data center power typically is used for mechanical cooling systems.

This would result in an estimated annual cost reduction of approximately USD 143,000 for a small 500-KW data center, based on electricity costs of USD 0.08 per KWH. In a larger 10-MW data center, the estimated annual cost reduction would be approximately USD 2.87 million.
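These estimates can be approximately reproduced from the inputs stated in this brief. The sketch below reflects our reading of those assumptions (a 74 percent cooling-power reduction, economizer use 91 percent of the year, cooling at 60 percent of data center power, and USD 0.08 per KWH); it is an illustrative calculation, not a published Intel model:

    # Sketch of the savings arithmetic, using the brief's stated inputs.
    HOURS_PER_YEAR = 8760

    dx_load_before_kw = 111.78      # DX cooling load, traditional operation
    dx_load_after_kw = 28.6         # DX cooling load, economizer operation
    cooling_reduction = 1 - dx_load_after_kw / dx_load_before_kw      # ~0.74

    economizer_fraction = 0.91      # fraction of the year below 90 degrees F
    cooling_share = 0.60            # share of data center power used for cooling
    price_per_kwh_usd = 0.08

    annual_cooling_savings = cooling_reduction * economizer_fraction  # ~0.67 of cooling power

    # kWh saved per year, per kW of overall data center power (~3,500)
    kwh_saved_per_kw = HOURS_PER_YEAR * cooling_share * annual_cooling_savings

    for dc_kw in (500, 10_000):     # a small 500-KW site and a 10-MW site
        savings_usd = dc_kw * kwh_saved_per_kw * price_per_kwh_usd
        print(f"{dc_kw:>6} kW data center: ~USD {savings_usd:,.0f} per year")
    # Roughly USD 142,000 for 500 KW and USD 2.8 million for 10 MW, in line with
    # the brief's estimates of USD 143,000 and USD 2.87 million.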
In addition, we could avoid certain capital expenditures in new data centers because we would require less cooling equipment. Even when the outside air temperature exceeded our maximum supply air temperature, we would only have to cool the air to our specified maximum rather than to the 68 degrees F used in our traditional data center approach. Reducing the complexity and cost of the cooling system also reduces the number of failure modes, increasing overall resiliency.

Intel IT’s Eight-year Data Center Efficiency Strategy
This proof of concept is part of Intel IT’s enterprise-wide, eight-year data center efficiency strategy, begun in 2007. The goals of the strategy are to transform our global data center environment, increasing efficiency and business responsiveness while substantially reducing cost.
Key focus areas of the strategy include:
• Standardizing processes and design specifications by using a strategic, long-range planning approach.
• Accelerating server refresh, while implementing grid computing and server virtualization to increase utilization.
• Consolidating data centers, concentrating computing resources into fewer large, highly efficient hub data centers.
• Using green computing with power-efficient servers and energy-saving data center design, as well as developing our commitment to industry initiatives.

Conclusion
We observed no consistent increase in server failure rates as a result of the greater variation in temperature and humidity, and the decrease in air quality, in the trailer. This suggests that existing assumptions about the need to closely regulate these factors bear further scrutiny.

Air economizers seem particularly suited to temperate climates with
low humidity. A data center equipped with an air economizer could
substantially reduce Intel’s environmental footprint by reducing
consumption of both power and water. In dry climates, traditional air-conditioned data centers typically include evaporative cooling, with cooling towers used as a pre-cooling stage. With an economizer, this stage would not typically be needed, potentially saving up to 76 million gallons of water
annually in a 10-MW data center.

We plan to further test for possible hardware degradation using a server aging analysis that compares systems used in the economizer compartment, in the air-conditioned compartment, and in our main data center.

If subsequent investigation confirms our promising PoC results, we expect to include air economizers in future data center designs. A possible next step would be a 1-MW demonstration data center using the equipment designed for the PoC.

Authors
Don Atwood is a regional data center manager with Intel Information Technology.
John G. Miner is a senior systems engineer with Intel Information Technology.




This paper is for informational purposes only. THIS DOCUMENT IS PROVIDED “AS IS” WITH NO WARRANTIES WHATSOEVER, INCLUDING ANY WARRANTY OF MERCHANTABILITY, NONINFRINGEMENT, FITNESS FOR ANY PARTICULAR PURPOSE, OR ANY WARRANTY OTHERWISE ARISING OUT OF ANY PROPOSAL, SPECIFICATION OR SAMPLE. Intel disclaims all liability, including liability for infringement of any proprietary rights, relating to use of information in this specification. No license, express or implied, by estoppel or otherwise, to any intellectual property rights is granted herein.

Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and other countries.
*Other names and brands may be claimed as the property of others.

Copyright © 2008 Intel Corporation. All rights reserved.
Printed in USA    Please Recycle
0808/KAR/KC/PDF    320074-001US

More Related Content

What's hot

Hot Aisle & Cold Aisle Containment Solutions & Case Studies
Hot Aisle & Cold Aisle Containment Solutions & Case StudiesHot Aisle & Cold Aisle Containment Solutions & Case Studies
Hot Aisle & Cold Aisle Containment Solutions & Case Studies42U Data Center Solutions
 
Lowering operating costs through cooling system design
Lowering operating costs through cooling system designLowering operating costs through cooling system design
Lowering operating costs through cooling system designAFCOM
 
District Cooling Jenny Dahlberg, Fortum
District Cooling   Jenny Dahlberg, FortumDistrict Cooling   Jenny Dahlberg, Fortum
District Cooling Jenny Dahlberg, FortumNiklas Johnsson
 
Radiant cooling for residential and commercial applications (Messana Radiant ...
Radiant cooling for residential and commercial applications (Messana Radiant ...Radiant cooling for residential and commercial applications (Messana Radiant ...
Radiant cooling for residential and commercial applications (Messana Radiant ...Alessandro Arnulfo
 
Proceedings nordic wind power conference 2007 part 2
Proceedings nordic wind power conference 2007 part 2Proceedings nordic wind power conference 2007 part 2
Proceedings nordic wind power conference 2007 part 2Dickdick Maulana
 
Water Efficiency in Thermal power Plant
Water Efficiency in Thermal power PlantWater Efficiency in Thermal power Plant
Water Efficiency in Thermal power PlantAtanu Maity
 
Energy Benchmarking - David Unger
Energy Benchmarking - David UngerEnergy Benchmarking - David Unger
Energy Benchmarking - David UngerRyan Slack
 
Impact Of High Density Colo Hot Aisles on IT Personnel Work Conditions
Impact Of High Density Colo Hot Aisles on IT Personnel Work ConditionsImpact Of High Density Colo Hot Aisles on IT Personnel Work Conditions
Impact Of High Density Colo Hot Aisles on IT Personnel Work ConditionsScott_Blake
 
Saving energy with cooling towers
Saving energy with cooling towersSaving energy with cooling towers
Saving energy with cooling towersAbdelrahim Ramadan
 
Auditac carbon and cooling in uk office environments dunn knight
Auditac carbon and cooling in uk office environments dunn knightAuditac carbon and cooling in uk office environments dunn knight
Auditac carbon and cooling in uk office environments dunn knightRoyal Mail
 
Ppt4 exp leeds - alan real and jon summers ( university of leeds ) experien...
Ppt4   exp leeds - alan real and jon summers ( university of leeds ) experien...Ppt4   exp leeds - alan real and jon summers ( university of leeds ) experien...
Ppt4 exp leeds - alan real and jon summers ( university of leeds ) experien...JISC's Green ICT Programme
 
zHome Signs 1
zHome Signs 1zHome Signs 1
zHome Signs 1zHome
 
Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)IJERD Editor
 
Combi Cooler Tech Detail Gb 8733
Combi Cooler Tech Detail Gb 8733Combi Cooler Tech Detail Gb 8733
Combi Cooler Tech Detail Gb 8733Bartosz Pijawski
 
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NRELBits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NRELinside-BigData.com
 

What's hot (18)

Hot Aisle & Cold Aisle Containment Solutions & Case Studies
Hot Aisle & Cold Aisle Containment Solutions & Case StudiesHot Aisle & Cold Aisle Containment Solutions & Case Studies
Hot Aisle & Cold Aisle Containment Solutions & Case Studies
 
Lowering operating costs through cooling system design
Lowering operating costs through cooling system designLowering operating costs through cooling system design
Lowering operating costs through cooling system design
 
District Cooling Jenny Dahlberg, Fortum
District Cooling   Jenny Dahlberg, FortumDistrict Cooling   Jenny Dahlberg, Fortum
District Cooling Jenny Dahlberg, Fortum
 
Radiant cooling for residential and commercial applications (Messana Radiant ...
Radiant cooling for residential and commercial applications (Messana Radiant ...Radiant cooling for residential and commercial applications (Messana Radiant ...
Radiant cooling for residential and commercial applications (Messana Radiant ...
 
Proceedings nordic wind power conference 2007 part 2
Proceedings nordic wind power conference 2007 part 2Proceedings nordic wind power conference 2007 part 2
Proceedings nordic wind power conference 2007 part 2
 
Water Efficiency in Thermal power Plant
Water Efficiency in Thermal power PlantWater Efficiency in Thermal power Plant
Water Efficiency in Thermal power Plant
 
Energy Benchmarking - David Unger
Energy Benchmarking - David UngerEnergy Benchmarking - David Unger
Energy Benchmarking - David Unger
 
Impact Of High Density Colo Hot Aisles on IT Personnel Work Conditions
Impact Of High Density Colo Hot Aisles on IT Personnel Work ConditionsImpact Of High Density Colo Hot Aisles on IT Personnel Work Conditions
Impact Of High Density Colo Hot Aisles on IT Personnel Work Conditions
 
Saving energy with cooling towers
Saving energy with cooling towersSaving energy with cooling towers
Saving energy with cooling towers
 
Auditac carbon and cooling in uk office environments dunn knight
Auditac carbon and cooling in uk office environments dunn knightAuditac carbon and cooling in uk office environments dunn knight
Auditac carbon and cooling in uk office environments dunn knight
 
Ppt4 exp leeds - alan real and jon summers ( university of leeds ) experien...
Ppt4   exp leeds - alan real and jon summers ( university of leeds ) experien...Ppt4   exp leeds - alan real and jon summers ( university of leeds ) experien...
Ppt4 exp leeds - alan real and jon summers ( university of leeds ) experien...
 
Hc2512901294
Hc2512901294Hc2512901294
Hc2512901294
 
zHome Signs 1
zHome Signs 1zHome Signs 1
zHome Signs 1
 
Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)
 
Ailr solar ac
Ailr solar acAilr solar ac
Ailr solar ac
 
Combi Cooler Tech Detail Gb 8733
Combi Cooler Tech Detail Gb 8733Combi Cooler Tech Detail Gb 8733
Combi Cooler Tech Detail Gb 8733
 
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NRELBits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
Bits, Bytes and BTUs: Warm Water Liquid Cooling at NREL
 
Uponor radiant cooling gv
Uponor radiant cooling gvUponor radiant cooling gv
Uponor radiant cooling gv
 

Viewers also liked

Cla Rii O Nfor V Mware
Cla Rii O Nfor V MwareCla Rii O Nfor V Mware
Cla Rii O Nfor V MwareNick Turunov
 
Odi Suite – готовая интеграционная платформа на базе COA
Odi Suite – готовая интеграционная платформа на базе COAOdi Suite – готовая интеграционная платформа на базе COA
Odi Suite – готовая интеграционная платформа на базе COANick Turunov
 
Saa s microsoft spla_kalachova
Saa s microsoft spla_kalachovaSaa s microsoft spla_kalachova
Saa s microsoft spla_kalachovaNick Turunov
 
Dell – Стратегия Роста
Dell – Стратегия РостаDell – Стратегия Роста
Dell – Стратегия РостаNick Turunov
 
автор комплексное решение компании «автор» в сфере эквайринга на базе системы...
автор комплексное решение компании «автор» в сфере эквайринга на базе системы...автор комплексное решение компании «автор» в сфере эквайринга на базе системы...
автор комплексное решение компании «автор» в сфере эквайринга на базе системы...Nick Turunov
 
Процессор Intel Xeon
Процессор Intel Xeon Процессор Intel Xeon
Процессор Intel Xeon Nick Turunov
 
3 Com новое знакомство с 3 Com и новые возможности
3 Com новое знакомство с 3 Com и новые возможности3 Com новое знакомство с 3 Com и новые возможности
3 Com новое знакомство с 3 Com и новые возможностиNick Turunov
 
Emc выбор оптимальной информационной инфраструктуры для систем автоматизации ...
Emc выбор оптимальной информационной инфраструктуры для систем автоматизации ...Emc выбор оптимальной информационной инфраструктуры для систем автоматизации ...
Emc выбор оптимальной информационной инфраструктуры для систем автоматизации ...Nick Turunov
 
джет Dlp или Dайте Lюдям работать
джет Dlp или Dайте Lюдям работатьджет Dlp или Dайте Lюдям работать
джет Dlp или Dайте Lюдям работатьNick Turunov
 
Kingston обработка, хранение и перенос конфиденциальных данных – аппаратные...
Kingston   обработка, хранение и перенос конфиденциальных данных – аппаратные...Kingston   обработка, хранение и перенос конфиденциальных данных – аппаратные...
Kingston обработка, хранение и перенос конфиденциальных данных – аппаратные...Nick Turunov
 
Megatrade артем вижуткин академия цод
Megatrade артем вижуткин академия цодMegatrade артем вижуткин академия цод
Megatrade артем вижуткин академия цодNick Turunov
 
Решения Oracle в области управления идентификацией и доступом пользователей к...
Решения Oracle в области управления идентификацией и доступом пользователей к...Решения Oracle в области управления идентификацией и доступом пользователей к...
Решения Oracle в области управления идентификацией и доступом пользователей к...Nick Turunov
 
Проектный опыт АОЗТ «Атлас»
Проектный опыт АОЗТ «Атлас»Проектный опыт АОЗТ «Атлас»
Проектный опыт АОЗТ «Атлас»Nick Turunov
 
Tonbeller Ag наилучшие методы осуществления решений, направленных на борьбу с...
Tonbeller Ag наилучшие методы осуществления решений, направленных на борьбу с...Tonbeller Ag наилучшие методы осуществления решений, направленных на борьбу с...
Tonbeller Ag наилучшие методы осуществления решений, направленных на борьбу с...Nick Turunov
 
Stulz datacentre cooling
Stulz datacentre coolingStulz datacentre cooling
Stulz datacentre coolingNick Turunov
 
блинов 2010 aten lcd_kvm_and over ip
блинов 2010 aten lcd_kvm_and over ipблинов 2010 aten lcd_kvm_and over ip
блинов 2010 aten lcd_kvm_and over ipNick Turunov
 

Viewers also liked (17)

Cla Rii O Nfor V Mware
Cla Rii O Nfor V MwareCla Rii O Nfor V Mware
Cla Rii O Nfor V Mware
 
Odi Suite – готовая интеграционная платформа на базе COA
Odi Suite – готовая интеграционная платформа на базе COAOdi Suite – готовая интеграционная платформа на базе COA
Odi Suite – готовая интеграционная платформа на базе COA
 
Saa s microsoft spla_kalachova
Saa s microsoft spla_kalachovaSaa s microsoft spla_kalachova
Saa s microsoft spla_kalachova
 
Dell – Стратегия Роста
Dell – Стратегия РостаDell – Стратегия Роста
Dell – Стратегия Роста
 
автор комплексное решение компании «автор» в сфере эквайринга на базе системы...
автор комплексное решение компании «автор» в сфере эквайринга на базе системы...автор комплексное решение компании «автор» в сфере эквайринга на базе системы...
автор комплексное решение компании «автор» в сфере эквайринга на базе системы...
 
Процессор Intel Xeon
Процессор Intel Xeon Процессор Intel Xeon
Процессор Intel Xeon
 
3 Com новое знакомство с 3 Com и новые возможности
3 Com новое знакомство с 3 Com и новые возможности3 Com новое знакомство с 3 Com и новые возможности
3 Com новое знакомство с 3 Com и новые возможности
 
Emc выбор оптимальной информационной инфраструктуры для систем автоматизации ...
Emc выбор оптимальной информационной инфраструктуры для систем автоматизации ...Emc выбор оптимальной информационной инфраструктуры для систем автоматизации ...
Emc выбор оптимальной информационной инфраструктуры для систем автоматизации ...
 
джет Dlp или Dайте Lюдям работать
джет Dlp или Dайте Lюдям работатьджет Dlp или Dайте Lюдям работать
джет Dlp или Dайте Lюдям работать
 
Kwf Slideshow
Kwf SlideshowKwf Slideshow
Kwf Slideshow
 
Kingston обработка, хранение и перенос конфиденциальных данных – аппаратные...
Kingston   обработка, хранение и перенос конфиденциальных данных – аппаратные...Kingston   обработка, хранение и перенос конфиденциальных данных – аппаратные...
Kingston обработка, хранение и перенос конфиденциальных данных – аппаратные...
 
Megatrade артем вижуткин академия цод
Megatrade артем вижуткин академия цодMegatrade артем вижуткин академия цод
Megatrade артем вижуткин академия цод
 
Решения Oracle в области управления идентификацией и доступом пользователей к...
Решения Oracle в области управления идентификацией и доступом пользователей к...Решения Oracle в области управления идентификацией и доступом пользователей к...
Решения Oracle в области управления идентификацией и доступом пользователей к...
 
Проектный опыт АОЗТ «Атлас»
Проектный опыт АОЗТ «Атлас»Проектный опыт АОЗТ «Атлас»
Проектный опыт АОЗТ «Атлас»
 
Tonbeller Ag наилучшие методы осуществления решений, направленных на борьбу с...
Tonbeller Ag наилучшие методы осуществления решений, направленных на борьбу с...Tonbeller Ag наилучшие методы осуществления решений, направленных на борьбу с...
Tonbeller Ag наилучшие методы осуществления решений, направленных на борьбу с...
 
Stulz datacentre cooling
Stulz datacentre coolingStulz datacentre cooling
Stulz datacentre cooling
 
блинов 2010 aten lcd_kvm_and over ip
блинов 2010 aten lcd_kvm_and over ipблинов 2010 aten lcd_kvm_and over ip
блинов 2010 aten lcd_kvm_and over ip
 

Similar to !08 0609b air-economizer_final

CPD Presentation Evaporative cooling in data centres
CPD Presentation   Evaporative cooling in data centresCPD Presentation   Evaporative cooling in data centres
CPD Presentation Evaporative cooling in data centresColt UK
 
Green cloud computing
Green cloud computingGreen cloud computing
Green cloud computingJauwadSyed
 
Green cloud computing India
Green cloud computing IndiaGreen cloud computing India
Green cloud computing Indiaakashlaldas
 
High Efficiency Indirect Air Economizer Based Cooling for Data Centers
High Efficiency Indirect Air Economizer Based Cooling for Data CentersHigh Efficiency Indirect Air Economizer Based Cooling for Data Centers
High Efficiency Indirect Air Economizer Based Cooling for Data CentersSchneider Electric
 
Aurora hpc energy efficiency
Aurora hpc energy efficiencyAurora hpc energy efficiency
Aurora hpc energy efficiencyEurotech Aurora
 
75F Outside Air Optimization and Economizer Control
75F Outside Air Optimization and Economizer Control75F Outside Air Optimization and Economizer Control
75F Outside Air Optimization and Economizer ControlBrendonMartin3
 
white_paper_Liquid-Cooling-Solutions.pdf
white_paper_Liquid-Cooling-Solutions.pdfwhite_paper_Liquid-Cooling-Solutions.pdf
white_paper_Liquid-Cooling-Solutions.pdfwberring
 
Energy Efficient Air Conditioning System
Energy Efficient Air Conditioning SystemEnergy Efficient Air Conditioning System
Energy Efficient Air Conditioning SystemTantish QS, UTM
 
Slides: The Top 3 North America Data Center Trends for Cooling
Slides: The Top 3 North America Data Center Trends for CoolingSlides: The Top 3 North America Data Center Trends for Cooling
Slides: The Top 3 North America Data Center Trends for CoolingGraybar
 
Dell H2C™ Technology: Hybrid Cooling for Overclocked CPUs
	 Dell H2C™ Technology: Hybrid Cooling for Overclocked CPUs	 Dell H2C™ Technology: Hybrid Cooling for Overclocked CPUs
Dell H2C™ Technology: Hybrid Cooling for Overclocked CPUsWayne Caswell
 
Econet Marketing Brochure 8692 Gb 2008 08
Econet   Marketing Brochure 8692 Gb 2008 08Econet   Marketing Brochure 8692 Gb 2008 08
Econet Marketing Brochure 8692 Gb 2008 08Bartosz Pijawski
 
A compressed air & gas institute q&a session heat recovery from industrial co...
A compressed air & gas institute q&a session heat recovery from industrial co...A compressed air & gas institute q&a session heat recovery from industrial co...
A compressed air & gas institute q&a session heat recovery from industrial co...Singh Ashwani
 
Green IT @ STRATO - Rene Weinholtz
Green IT @ STRATO - Rene WeinholtzGreen IT @ STRATO - Rene Weinholtz
Green IT @ STRATO - Rene Weinholtzcatherinewall
 
Ppt3 london - sophia ( operation intelligence ) what is the eu code of conduct
Ppt3   london - sophia ( operation intelligence ) what is the eu code of conductPpt3   london - sophia ( operation intelligence ) what is the eu code of conduct
Ppt3 london - sophia ( operation intelligence ) what is the eu code of conductJISC's Green ICT Programme
 
The Economics of Green HPC
The Economics of Green HPCThe Economics of Green HPC
The Economics of Green HPCIntel IT Center
 
Apc cooling solutions
Apc cooling solutionsApc cooling solutions
Apc cooling solutionspeperoca
 
Hybrid Ventilation using CFD Simulation in the Cloud
Hybrid Ventilation using CFD Simulation in the CloudHybrid Ventilation using CFD Simulation in the Cloud
Hybrid Ventilation using CFD Simulation in the CloudSimScale
 

Similar to !08 0609b air-economizer_final (20)

CPD Presentation Evaporative cooling in data centres
CPD Presentation   Evaporative cooling in data centresCPD Presentation   Evaporative cooling in data centres
CPD Presentation Evaporative cooling in data centres
 
Green cloud computing
Green cloud computingGreen cloud computing
Green cloud computing
 
Green cloud computing India
Green cloud computing IndiaGreen cloud computing India
Green cloud computing India
 
High Efficiency Indirect Air Economizer Based Cooling for Data Centers
High Efficiency Indirect Air Economizer Based Cooling for Data CentersHigh Efficiency Indirect Air Economizer Based Cooling for Data Centers
High Efficiency Indirect Air Economizer Based Cooling for Data Centers
 
Aurora hpc energy efficiency
Aurora hpc energy efficiencyAurora hpc energy efficiency
Aurora hpc energy efficiency
 
75F Outside Air Optimization and Economizer Control
75F Outside Air Optimization and Economizer Control75F Outside Air Optimization and Economizer Control
75F Outside Air Optimization and Economizer Control
 
white_paper_Liquid-Cooling-Solutions.pdf
white_paper_Liquid-Cooling-Solutions.pdfwhite_paper_Liquid-Cooling-Solutions.pdf
white_paper_Liquid-Cooling-Solutions.pdf
 
Energy Efficient Air Conditioning System
Energy Efficient Air Conditioning SystemEnergy Efficient Air Conditioning System
Energy Efficient Air Conditioning System
 
Slides: The Top 3 North America Data Center Trends for Cooling
Slides: The Top 3 North America Data Center Trends for CoolingSlides: The Top 3 North America Data Center Trends for Cooling
Slides: The Top 3 North America Data Center Trends for Cooling
 
Dell H2C™ Technology: Hybrid Cooling for Overclocked CPUs
	 Dell H2C™ Technology: Hybrid Cooling for Overclocked CPUs	 Dell H2C™ Technology: Hybrid Cooling for Overclocked CPUs
Dell H2C™ Technology: Hybrid Cooling for Overclocked CPUs
 
Green computing
Green computingGreen computing
Green computing
 
Econet Marketing Brochure 8692 Gb 2008 08
Econet   Marketing Brochure 8692 Gb 2008 08Econet   Marketing Brochure 8692 Gb 2008 08
Econet Marketing Brochure 8692 Gb 2008 08
 
Data center m&e
Data center  m&eData center  m&e
Data center m&e
 
A compressed air & gas institute q&a session heat recovery from industrial co...
A compressed air & gas institute q&a session heat recovery from industrial co...A compressed air & gas institute q&a session heat recovery from industrial co...
A compressed air & gas institute q&a session heat recovery from industrial co...
 
Green IT @ STRATO - Rene Weinholtz
Green IT @ STRATO - Rene WeinholtzGreen IT @ STRATO - Rene Weinholtz
Green IT @ STRATO - Rene Weinholtz
 
Deep Freeze - Design
Deep Freeze - DesignDeep Freeze - Design
Deep Freeze - Design
 
Ppt3 london - sophia ( operation intelligence ) what is the eu code of conduct
Ppt3   london - sophia ( operation intelligence ) what is the eu code of conductPpt3   london - sophia ( operation intelligence ) what is the eu code of conduct
Ppt3 london - sophia ( operation intelligence ) what is the eu code of conduct
 
The Economics of Green HPC
The Economics of Green HPCThe Economics of Green HPC
The Economics of Green HPC
 
Apc cooling solutions
Apc cooling solutionsApc cooling solutions
Apc cooling solutions
 
Hybrid Ventilation using CFD Simulation in the Cloud
Hybrid Ventilation using CFD Simulation in the CloudHybrid Ventilation using CFD Simulation in the Cloud
Hybrid Ventilation using CFD Simulation in the Cloud
 

More from Nick Turunov

Kharkov conference 2011 q4 odrov s.
Kharkov conference 2011 q4 odrov s.Kharkov conference 2011 q4 odrov s.
Kharkov conference 2011 q4 odrov s.Nick Turunov
 
Huawei smart grid rus
Huawei smart grid rusHuawei smart grid rus
Huawei smart grid rusNick Turunov
 
Huawei smart grid rus
Huawei smart grid rusHuawei smart grid rus
Huawei smart grid rusNick Turunov
 
2011 ukraine channel sales business report
2011 ukraine channel sales business report2011 ukraine channel sales business report
2011 ukraine channel sales business reportNick Turunov
 
Kharkov conference 2011 q4 odrov s.
Kharkov conference 2011 q4 odrov s.Kharkov conference 2011 q4 odrov s.
Kharkov conference 2011 q4 odrov s.Nick Turunov
 
Megatrade артем вижуткин академия цод
Megatrade артем вижуткин академия цодMegatrade артем вижуткин академия цод
Megatrade артем вижуткин академия цодNick Turunov
 
Aflex distribution
Aflex distributionAflex distribution
Aflex distributionNick Turunov
 
решения Eaton вадим харитонов
решения Eaton вадим харитоноврешения Eaton вадим харитонов
решения Eaton вадим харитоновNick Turunov
 
лаборатория касперского киберпреступность
лаборатория касперского киберпреступностьлаборатория касперского киберпреступность
лаборатория касперского киберпреступностьNick Turunov
 
комплексные решения по гарантированному электропитанию харитонов мегатрейд
комплексные решения по гарантированному электропитанию харитонов мегатрейдкомплексные решения по гарантированному электропитанию харитонов мегатрейд
комплексные решения по гарантированному электропитанию харитонов мегатрейдNick Turunov
 
генераторные установки нового поколения Ipt в.дюбанов
генераторные установки нового поколения Ipt в.дюбановгенераторные установки нового поколения Ipt в.дюбанов
генераторные установки нового поколения Ipt в.дюбановNick Turunov
 
Stulz mission energy
Stulz mission energyStulz mission energy
Stulz mission energyNick Turunov
 
Oracle exa2 biz_summit
Oracle exa2 biz_summitOracle exa2 biz_summit
Oracle exa2 biz_summitNick Turunov
 
Ibm megatrade шиндак xiv v3.0
Ibm megatrade шиндак xiv v3.0Ibm megatrade шиндак xiv v3.0
Ibm megatrade шиндак xiv v3.0Nick Turunov
 
Conteg юрий шульга
Conteg юрий шульгаConteg юрий шульга
Conteg юрий шульгаNick Turunov
 
Cisco solutions for dc whiteboarding
Cisco solutions for dc whiteboardingCisco solutions for dc whiteboarding
Cisco solutions for dc whiteboardingNick Turunov
 
Bas consulting for_it_conference_(by_po)
Bas consulting for_it_conference_(by_po)Bas consulting for_it_conference_(by_po)
Bas consulting for_it_conference_(by_po)Nick Turunov
 
Alcatel lucent решения для бизнеса
Alcatel lucent решения для бизнесаAlcatel lucent решения для бизнеса
Alcatel lucent решения для бизнесаNick Turunov
 
Alcatel lucent 10 gbit
Alcatel lucent 10 gbitAlcatel lucent 10 gbit
Alcatel lucent 10 gbitNick Turunov
 
Aflex distribution
Aflex distributionAflex distribution
Aflex distributionNick Turunov
 

More from Nick Turunov (20)

Kharkov conference 2011 q4 odrov s.
Kharkov conference 2011 q4 odrov s.Kharkov conference 2011 q4 odrov s.
Kharkov conference 2011 q4 odrov s.
 
Huawei smart grid rus
Huawei smart grid rusHuawei smart grid rus
Huawei smart grid rus
 
Huawei smart grid rus
Huawei smart grid rusHuawei smart grid rus
Huawei smart grid rus
 
2011 ukraine channel sales business report
2011 ukraine channel sales business report2011 ukraine channel sales business report
2011 ukraine channel sales business report
 
Kharkov conference 2011 q4 odrov s.
Kharkov conference 2011 q4 odrov s.Kharkov conference 2011 q4 odrov s.
Kharkov conference 2011 q4 odrov s.
 
Megatrade артем вижуткин академия цод
Megatrade артем вижуткин академия цодMegatrade артем вижуткин академия цод
Megatrade артем вижуткин академия цод
 
Aflex distribution
Aflex distributionAflex distribution
Aflex distribution
 
решения Eaton вадим харитонов
решения Eaton вадим харитоноврешения Eaton вадим харитонов
решения Eaton вадим харитонов
 
лаборатория касперского киберпреступность
лаборатория касперского киберпреступностьлаборатория касперского киберпреступность
лаборатория касперского киберпреступность
 
комплексные решения по гарантированному электропитанию харитонов мегатрейд
комплексные решения по гарантированному электропитанию харитонов мегатрейдкомплексные решения по гарантированному электропитанию харитонов мегатрейд
комплексные решения по гарантированному электропитанию харитонов мегатрейд
 
генераторные установки нового поколения Ipt в.дюбанов
генераторные установки нового поколения Ipt в.дюбановгенераторные установки нового поколения Ipt в.дюбанов
генераторные установки нового поколения Ipt в.дюбанов
 
Stulz mission energy
Stulz mission energyStulz mission energy
Stulz mission energy
 
Oracle exa2 biz_summit
Oracle exa2 biz_summitOracle exa2 biz_summit
Oracle exa2 biz_summit
 
Ibm megatrade шиндак xiv v3.0
Ibm megatrade шиндак xiv v3.0Ibm megatrade шиндак xiv v3.0
Ibm megatrade шиндак xiv v3.0
 
Conteg юрий шульга
Conteg юрий шульгаConteg юрий шульга
Conteg юрий шульга
 
Cisco solutions for dc whiteboarding
Cisco solutions for dc whiteboardingCisco solutions for dc whiteboarding
Cisco solutions for dc whiteboarding
 
Bas consulting for_it_conference_(by_po)
Bas consulting for_it_conference_(by_po)Bas consulting for_it_conference_(by_po)
Bas consulting for_it_conference_(by_po)
 
Alcatel lucent решения для бизнеса
Alcatel lucent решения для бизнесаAlcatel lucent решения для бизнеса
Alcatel lucent решения для бизнеса
 
Alcatel lucent 10 gbit
Alcatel lucent 10 gbitAlcatel lucent 10 gbit
Alcatel lucent 10 gbit
 
Aflex distribution
Aflex distributionAflex distribution
Aflex distribution
 

!08 0609b air-economizer_final

  • 1. IT@Intel Brief Intel Information Technology Reducing Data Center Cost Computer Manufacturing Energy Efficiency with an Air Economizer To challenge established industry assumptions regarding Profile: Air Economizer PoC August 2008 data center cooling, Intel IT conducted a proof of concept • 900 heavily utilized production (PoC) test that used an air economizer to cool production servers in a high-density data center servers with 100 percent outside air at temperatures of up to 90 degrees Fahrenheit (F). With this approach, we • 100% air exchange at up to 90 degrees F, with no humidity control could use an economizer to provide nearly all data center and minimal air filtration cooling, substantially reducing power consumption. This could potentially reduce annual operating costs by up to • 67% estimated power savings using USD 2.87 million for a 10-megawatt (MW) data center. economizer 91% of the time—an estimated annual savings of We ran the PoC in a dry, temperate climate over 10 months approximately USD 2.87 million in using about 900 production blade servers, divided equally a 10-MW data center between two side-by-side compartments, as shown in Figure 1. One used standard air conditioning, the other an air economizer. Servers in the economizer compartment were subjected to considerable variation in temperature and humidity as well as poor air quality; however, there was no significant increase in server failures. If subsequent investigation confirms these promising results, we anticipate using this approach in future high-density data centers. Air Conditioning Unit Traditional Air-Conditioned Compartment Server Racks Hot air is cooled and recirculated Air Economizer Compartment Server Racks Hot air is flushed outside, and outside air is drawn in Air Economizer Temperature and Humidity Sensor Hot Air Cold Air Figure 1. Proof of concept (PoC) data center environment. IT@Intel
  • 2. Background If we were successful, we would be able to use air economizers for most of the year in dry, Data center power consumption is soaring, driven temperate climates. This could drastically reduce by increasing demand for computing capacity. In power consumption and cooling costs while a typical data center, 60 to 70 percent of data improving Intel’s environmental footprint. center power may be used for facilities power and data center cooling. Proof of Concept At Intel, our data centers need to support the We conducted a large PoC test using rapid growth in computing capacity required to approximately 900 production design servers design increasingly complex semiconductors. At at a data center located in a temperate desert the same time, we are trying to minimize data climate with generally low relative humidity. We center power consumption and operating costs. began the PoC in October 2007, and continued Our strategy is based on high-performance, high- the test for 10 months until August 2008. density data centers containing thousands of We set up the PoC in a 1,000-square-foot (SF) blade servers. These blades deliver considerable trailer that was originally installed to provide computing capacity, but they also generate temporary additional computing capacity and substantial heat. We supply cooling air to the divided the trailer into two compartments of blades at 68 degrees F; as the air passes over approximately 500 SF each. To minimize the cost the blades, the air temperature rises by 58 of the PoC, we used low-cost, warehouse-grade degrees F, resulting in an exit temperature of direct expansion (DX) air conditioning equipment. 126 degrees F. This means that we need to Temperature and humidity sensors were installed cool the air by 58 degrees F before recirculating to monitor the conditions in each compartment. it. The air conditioning units required to do this consume a considerable amount of electricity. We cooled one compartment with a traditional approach, using a DX unit to recirculate hot air Air economizers represent one potential way and provide cooling at all times. to reduce data center power consumption and cooling cost. Instead of cooling and recirculating For the other compartment, we used essentially the hot air from the servers, air economizers the same air-conditioning equipment, but with simply expel the hot air outdoors and draw in modifications that enabled it to operate as an outside air to cool the IT equipment. economizer by expelling hot air to the outdoors and drawing in 100 percent outside air for cooling. The current industry assumption is that the usefulness of air economizers is limited by Because one of our goals was to test the the need to supply cooling air at a relatively acceptable limits of operating temperature, low temperature. The implication is that air we configured the cooling equipment in the economizers can only be used at times when the economizer compartment to supply air at outside air is relatively cool. There is also concern temperatures ranging from 65 degrees F to 90 about variation in humidity, because the humidity degrees F. We designed the system to use only of outside air can change rapidly. A third area of the economizer until the supply air exceeded concern is particulate counts in outside air. the 90 degree F maximum, at which point we began using the chiller to cool the air to 90 We decided to challenge these assumptions degrees F. 
If the temperature dropped below by employing an economizer to cool a high- 65 degrees F, we warmed the supply air by density production environment using a much mixing it with hot return air from the servers. wider range of outside air temperatures—up to 90 degrees F. We reasoned that this might be We made no attempt to control humidity. We also feasible because server manufacturers specify wanted to test the limits of air quality, so we that their products can operate in temperatures applied minimal filtering to incoming air, using a as high as 98 degrees F. We also wanted to push standard household air filter that removed only the accepted limits of humidity and air quality. large particles from the incoming air but permitted fine dust to pass through.
  • 3. Each room contained eight racks. Each rack contained four 100º F 35% Temperature blade servers with 14 blades each, for a total of 448 blades per Humidity Temperature in Degrees Fahrenheit (° F) 90 30 compartment. This represented a power density of more than 80 200 watts per square foot (WPSF). 25 70 Percent Humidity During the PoC, we used the servers to run large production batch 60 20 silicon design workloads, resulting in very high server utilization rates 50 of about 90 percent. 40 15 We measured server failure rates in each compartment and compared 30 10 them with the failure rates that we experienced during the same period 20 5 within our main data center at the same location. 10 0 0 Results January February March April May During the PoC, the servers in the economizer compartment were Figure 2. Temperature and humidity variation in the air subjected to wide variation in environmental conditions, as shown in economizer compartment from January through May 2008. Figure 2. • The temperature of the supply air varied from 64 degrees F to more than 92 degrees F. This variation slightly exceeded our set points partly due to the slow response of our low-cost air conditioning units. Table 1. Average Annual Temperatures at the Proof of Concept (PoC) Test Location • Humidity varied from 4 percent to more than 90 percent and changed rapidly at times. Month Average High Average Low January 48 Degrees Fahrenheit (° F) 24º F • The servers and the interior of the compartment became covered February 55º F 28º F in a layer of dust. March 62º F 33º F Power Consumption April 70º F 41º F Total power consumption of the trailer was approximately 500 kilowatts May 80º F 49º F (KW) when using air conditioning in both compartments. When using June 90º F 59º F the economizer, the DX cooling load in the economizer compartment July 92º F 65º F was reduced from 111.78 KW to 28.6 KW, representing a 74 percent August 88º F 63º F reduction in energy consumption. September 83º F 56º F October 71º F 44º F Server Failure Rates November 57º F 32º F Despite the dust and variation in humidity and temperature, there was December 48º F 24º F only a minimal difference between the 4.46 percent failure rate in the economizer compartment and the 3.83 percent failure rate in our main data center over the same period. The failure rate in the trailer compartment with DX cooling was 2.45 percent, actually lower than in the main data center. Analysis We estimated the average annual power savings we could achieve in a data center that uses an economizer. To do this, we used historical weather data for the data center location, summarized in Table 1. Analysis of the data indicated that during an average year, the temperature is below our 90 degrees F maximum 91 percent of the time. Based on our 74 percent measured decrease in power consumption when using the economizer during the PoC, and assuming that we could rely on the economizer 91 percent of the year, we could potentially save
This savings translates into approximately 3,500 kilowatt hours (KWH) per KW of overall data center power consumption, based on an assumption that 60 percent of data center power typically is used for mechanical cooling systems.

This would result in an estimated annual cost reduction of approximately USD 143,000 for a small 500-KW data center, based on electricity costs of USD 0.08 per KWH. In a larger 10-MW data center, the estimated annual cost reduction would be approximately USD 2.87 million.

In addition, we could avoid certain capital expenditures in new data centers because we would require less cooling equipment. Even when the outside air temperature exceeded our maximum supply air temperature, we would only have to cool the air to our specified maximum rather than to the 68 degrees F used in our traditional data center approach. Reducing the complexity and cost of the cooling system also reduces the number of failure modes, increasing overall resiliency.

Conclusion
We observed no consistent increase in server failure rates as a result of the greater variation in temperature and humidity, and the decrease in air quality, in the trailer. This suggests that existing assumptions about the need to closely regulate these factors bear further scrutiny.

Air economizers seem particularly suited to temperate climates with low humidity. A data center equipped with an air economizer could substantially reduce Intel’s environmental footprint by reducing consumption of both power and water. In dry climates, traditional air-conditioned data centers typically include evaporative cooling using water towers as a pre-cooling stage. With an economizer, this would not typically be used, potentially saving up to 76 million gallons of water annually in a 10-MW data center.

We plan to further test for possible hardware degradation using a server aging analysis that compares systems used in the economizer compartment, in the air-conditioned compartment, and in our main data center.

If subsequent investigation confirms our promising PoC results, we expect to include air economizers in future data center designs. A possible next step would be a 1-MW demonstration data center using the equipment designed for the PoC.

Intel IT’s Eight-Year Data Center Efficiency Strategy
This proof of concept is part of Intel IT’s enterprise-wide, eight-year data center efficiency strategy, begun in 2007. The goals of the strategy are to transform our global data center environment, increasing efficiency and business responsiveness while substantially reducing cost.
Key focus areas of the strategy include:
• Standardizing processes and design specifications by using a strategic, long-range planning approach.
• Accelerating server refresh, while implementing grid computing and server virtualization to increase utilization.
• Consolidating data centers, concentrating computing resources into fewer large, highly efficient hub data centers.
• Using green computing with power-efficient servers and energy-saving data center design, as well as developing our commitment to industry initiatives.

Authors
Don Atwood is a regional data center manager with Intel Information Technology.
John G. Miner is a senior systems engineer with Intel Information Technology.

This paper is for informational purposes only.
THIS DOCUMENT IS PROVIDED “AS IS” WITH NO WARRANTIES WHATSOEVER, INCLUDING ANY WARRANTY OF MERCHANTABILITY, NONINFRINGEMENT, FITNESS FOR ANY PARTICULAR PURPOSE, OR ANY WARRANTY OTHERWISE ARISING OUT OF ANY PROPOSAL, SPECIFICATION OR SAMPLE. Intel disclaims all liability, including liability for infringement of any proprietary rights, relating to use of information in this specification. No license, express or implied, by estoppel or otherwise, to any intellectual property rights is granted herein.

Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and other countries.
*Other names and brands may be claimed as the property of others.
Copyright © 2008 Intel Corporation. All rights reserved.
Printed in USA    Please Recycle
0808/KAR/KC/PDF    320074-001US