This presentation discusses data center cooling technologies. It provides a brief history of data centers and outlines ASHRAE thermal guidelines for operating envelopes and temperature change requirements. The presentation then reviews common cooling system types including computer room air conditioners, computer room air handlers, and water-side economizers. It also examines heat rejection options and trends toward higher supply air/water temperatures to improve efficiency.
1. Data Center Cooling
Ahmed Abdel-Salam
Ph.D., P.Eng., CEM, BEMP, HBDP, LEED GA
ASHRAE Saskatoon Chapter Meeting
Image Source: https://www.greenbiz.com/
This presentation is copyrighted by Ahmed Hamdi Abdel-Salam
7. Why Data Centers?!
Top-left Image Source: https://d3w2mpp70f6o8z.cloudfront.net/media/images/Google_DC1.original.jpg . Other images are cited in previous slides.
[Diagram: Applications → Data Servers → Data Centers]
10. Typical Layout for a Data Center
Image Source: AH Khalaj, SK Halgamuge, 2017. A review on efficient thermal management of air-and liquid-cooled data centers: From chip to the cooling system. Applied Energy 2015, 1165-1188.
Agenda:
1. Layout, Power and ITE Loads
2. ASHRAE Thermal Guidelines
3. Cooling Systems
4. Performance Metrics
5. Future Trends
6. Mega Projects
7. ASHRAE TC 9.9
8. Summary
11. Traditional Data Center Design Power Allocation
▪ Approximately 1.5-2% of all electricity used in the US
▪ Historical trends show EXPONENTIAL growth
▪ Cooling (including fans) accounts for the majority of non-IT electrical energy consumption
Power allocation: IT Equipment 52%, Cooling 30%, Fans 8%, UPS 5%, Other 3%, PDU 1%, Lighting 1%
12. ITE Heat Loads
Historical Trends and Future Projections
▪ All heat loads are generated by ITE
▪ Historical trends confirm EXPONENTIAL growth
▪ Future predictions show EXPONENTIAL growth
▪ Cooling technologies have to ADAPT
Source: D Beaty, D Quirk, J Jaworski, 2018. ASHRAE IT Power Trends. ASHRAE Journal, July 2018.
14. ITE Metrics for Data Center
Operating Envelope
▪ ASHRAE Thermal Guidelines enable users to select the operating envelopes that meet their technical and business needs
▪ The right operating envelope provides an optimum balance between ITE performance
and cooling system performance
Source: ASHRAE Thermal Guidelines. A presentation by W Tschudi.
15. Operating Envelope
▪ Facilities should be designed to operate within the Recommended operating envelope under normal conditions
▪ IT manufacturers test ITE to ensure it will be functional under the Allowable operating
envelope
▪ The table below shows the 2011 edition. Lower humidity levels were revised in 2015
Source: Thermal Guidelines for Data Processing Environments. ASHRAE TC9.9.
16. Operating Envelope
▪ Recommended: Under normal
operating conditions
▪ Class A1: Enterprise servers and
storage products
▪ Classes A2, A3, and A4: Volume servers, storage products, PCs, and workstations
Source: Thermal Guidelines for Data Processing Environments. ASHRAE TC9.9.
17. Operating Envelope
2011 Thermal Guidelines → 2015 Thermal Guidelines
▪ ASHRAE TC9.9 keeps up to date with the fast-growing industry
▪ An ASHRAE research project (RP) on electrostatic discharge enabled the lower humidity limits
▪ A current ASHRAE RP is exploring the possibility of expanding the upper humidity limits
Source: Thermal Guidelines for Data Processing Environments. ASHRAE TC9.9.
18. Temperature Change Requirement
▪ Tape-based ITE: 5°C (9°F) in an hour
▪ ITE (except tape): 20°C (36°F) in an hour
▪ ITE manufacturers specify the allowable temperature change requirement
▪ HVAC designers should ensure the cooling systems can maintain the rate of temperature change required by the ITE
Source: Thermal Guidelines for Data Processing Environments. ASHRAE TC9.9.
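The rate-of-change limits above can be checked against logged inlet temperatures. A minimal sketch, assuming 15-minute samples and a hypothetical monitoring helper (not from the presentation):

```python
# Hypothetical helper: check logged server-inlet temperatures against the
# ASHRAE rate-of-change limits (5°C/h for tape-based ITE, 20°C/h otherwise).
def max_hourly_change(samples, interval_min=15):
    """samples: inlet temperatures (°C) at a fixed interval.
    Returns the worst peak-to-peak swing over any one-hour window."""
    per_window = 60 // interval_min + 1   # samples spanning one hour
    worst = 0.0
    for i in range(max(1, len(samples) - per_window + 1)):
        window = samples[i:i + per_window]
        worst = max(worst, max(window) - min(window))
    return worst

temps = [22.0, 22.5, 24.0, 27.5, 28.0, 28.2]   # 15-minute samples
swing = max_hourly_change(temps)
print(f"worst one-hour swing: {swing:.1f} °C")
print("tape-based ITE OK:", swing <= 5.0, "| other ITE OK:", swing <= 20.0)
```

Here the 6.0°C hourly swing would violate the tape-based limit but satisfy the general ITE limit.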
19. Server Reliability (X-Factor)
▪ The X-Factor represents a relative failure rate with respect to the baseline at 20°C (68°F)
▪ The lower and upper bounds account for all components within the volume server package
▪ Continuous (24 x 7 x 365) operation with a fixed dry-bulb temperature (DBT) at the server inlet is assumed to derive the X-Factors. This is not a realistic approach, as the majority of operating hours will be at cooler temperatures; a time-weighted average X-Factor is a more realistic approach
Source: Thermal Guidelines for Data Processing Environments. ASHRAE TC9.9.
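The time-weighted average mentioned above can be sketched in a few lines. The per-bin X-Factors and hours fractions below are illustrative assumptions, not the published ASHRAE table:

```python
# Sketch of a time-weighted average X-Factor. The per-bin X-Factors and the
# annual hours fractions below are illustrative assumptions, not the
# published ASHRAE table.
x_factor       = {15: 0.72, 20: 1.00, 25: 1.24, 30: 1.48, 35: 1.75}
hours_fraction = {15: 0.30, 20: 0.40, 25: 0.20, 30: 0.08, 35: 0.02}

# Weight each temperature bin's relative failure rate by the fraction of
# annual operating hours spent in that bin.
weighted = sum(x_factor[t] * hours_fraction[t] for t in x_factor)
print(f"time-weighted X-Factor: {weighted:.2f}")
```

A result near (or below) 1.0 means the annual failure rate is comparable to (or better than) continuous operation at the 20°C baseline, even though some hours run warmer.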
20. Server Power vs Ambient Temperature
[Chart: Server Power Increase (1.00-1.25×) vs Ambient Temperature (10-40°C)]
Source: Thermal Guidelines for Data Processing Environments. ASHRAE TC9.9.
▪ Required airflow rate increases at higher inlet air temperatures
▪ This increases the server power requirements
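The chart's trend can be applied by interpolating a power multiplier from inlet temperature. The curve points below are assumed illustrative readings, not exact chart data:

```python
import bisect

# Interpolate a server power multiplier from inlet air temperature.
# The curve points are assumed illustrative readings, not exact chart data.
TEMPS = [10, 15, 20, 25, 30, 35, 40]                 # °C
MULT  = [1.00, 1.00, 1.01, 1.04, 1.08, 1.14, 1.22]   # power increase factor

def power_multiplier(t_c):
    """Piecewise-linear interpolation; clamps outside the chart range."""
    if t_c <= TEMPS[0]:
        return MULT[0]
    if t_c >= TEMPS[-1]:
        return MULT[-1]
    i = bisect.bisect_right(TEMPS, t_c)
    frac = (t_c - TEMPS[i - 1]) / (TEMPS[i] - TEMPS[i - 1])
    return MULT[i - 1] + frac * (MULT[i] - MULT[i - 1])

print(f"multiplier at 27 °C: {power_multiplier(27):.3f}")
```

This kind of curve is what makes the trade-off real: raising supply temperature saves cooling energy but raises the ITE's own fan power.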
23. Outside Air vs Recirculated Air
▪ The majority of Data Centers are operated using 100% Recirculated Air
▪ Data Centers have high air-filtration requirements (up to MERV 15)
▪ Generally, there are significant energy savings from using Air-Side Economizers, especially when higher supply air temperatures are allowed
▪ There are currently no standards or guidelines on the use of Outside Air (Air-Side Economization) in the Data Center industry
▪ Outside air can bring many problems to Data Centers:
▪ Moisture control
▪ Dust
▪ Pollutants
▪ Fire-smoke
▪ Reliability
▪ Availability
Source: http://www.aztec-cooling.com/. The website referred to an article entitled “Free cooling maps for data center design and operation”. White Paper #46 by the Green Grid.
24. Computer Room Air Conditioner (CRAC)
Source: https://www.enviromon.net/wp-content/uploads/cold-hot-aisles.png
▪ The evaporator is located inside the Data Hall to cool hot air from ITE
▪ Condenser heat can be rejected in various ways
▪ Cold glycol from a dry cooler
▪ Cold water from a cooling tower
▪ Air-cooled condenser
25. Computer Room Air Conditioner (CRAC)
Air-Cooled Condenser
Glycol-Cooled Condenser using Dry-Cooler
Source: The different technologies for cooling Data Centers. White Paper 59. Tony Evans. APC by Schneider Electric
26. Computer Room Air Handler (CRAH)
Source: https://www.enviromon.net/wp-content/uploads/cold-hot-aisles.png
▪ A chilled water (CHW) coil is located inside the Data Hall to cool hot air from the ITE
▪ CHW is generated by a chiller located outside the Data Hall
CRAH
27. Computer Room Air Handler (CRAH)
Source: The different technologies for cooling Data Centers. White Paper 59. Tony Evans. APC by Schneider Electric
▪ Water-cooled chillers have a higher COP than air-cooled chillers
▪ A cooling tower and proper water treatment are essential for water-cooled chillers
28. CRAC vs CRAH
Source: http://www.dchuddle.com/wp-content/uploads/2011/08/sidebyside_2.jpg
30. Water-Side Economizer
Source: https://aeroventic.uk/photos/articles-contents/w770/480.jpg
▪ Sequence of operation: Dry Mode → Wet Mode → Top-Up (Chiller-Trim) Mode
▪ WATER Savings due to Dry Mode
▪ Energy Savings due to Dry & Wet Modes
▪ Reduced Peak Power
▪ Improved Life Cycle Costs
▪ The majority of new data centers use this approach to improve water and energy efficiency
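The Dry → Wet → Top-Up (Chiller-Trim) sequence above amounts to a simple selection rule. The supply-water setpoint and approach temperatures below are illustrative assumptions, not design values:

```python
# Sketch of the Dry → Wet → Top-Up (Chiller-Trim) sequence. The supply-water
# setpoint and approach temperatures are illustrative assumptions only.
def economizer_mode(drybulb_c, wetbulb_c, supply_water_c=24.0, approach_c=4.0):
    """Pick the water-side economizer operating mode for ambient conditions."""
    if drybulb_c + approach_c <= supply_water_c:
        return "dry"           # dry cooler alone meets the setpoint (saves water)
    if wetbulb_c + approach_c <= supply_water_c:
        return "wet"           # evaporative (wet) operation meets the setpoint
    return "chiller-trim"      # economizer pre-cools; the chiller tops up

for db, wb in [(15, 12), (25, 18), (35, 26)]:
    print(f"{db}°C DB / {wb}°C WB -> {economizer_mode(db, wb)}")
```

Dry mode keys off dry-bulb temperature (no water use), wet mode off wet-bulb, and the chiller only trims the remainder on the hottest hours.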
31. Water-Side Economizer
Source: D Beaty, D Quirk, F Morrison, 2019. Data Centers: Designing Data Center Waterside Economizers. ASHRAE Journal, vol. 61.
32. Heat Rejection Options for CRAC
Source: AH Khalaj, SK Halgamuge, 2017. A review on efficient thermal management of air-and liquid-cooled data centers: From chip to the cooling system. Applied Energy 2015, 1165-1188.
Glycol Dry Cooler
Cooling Tower
Cooling Tower &
Water-Side
Economizer
33. Heat Rejection Options for CRAH
Source: AH Khalaj, SK Halgamuge, 2017. A review on efficient thermal management of air-and liquid-cooled data centers: From chip to the cooling system. Applied Energy 2015, 1165-1188.
Air-Cooled Chiller
Air-Cooled Chiller &
Water-Side Economizer
Water-Cooled Chiller
Water-Cooled Chiller &
Water-Side Economizer
34. Past, Present & Future of Cooling Equipment
                        Past          Present                        Future
Air-Cooled ITE          100%          99%                            50%??
Water-Cooled ITE        0%            1% (Research)                  50%??
ITE Heat Density        Low           Medium                         High
Supply Air/Water Temp.  Low (18-24)   Medium (24-30)                 High (28-35??)
Cooling Equipment       VCS           VCS; Evaporative + VCS trim;   Evaporative; Economizer
                                      Economizer
Energy Efficiency       Low           High                           Highest
Water Efficiency        Low           Medium                         High
Environmental Impact    High          Medium                         Low
LCC                     High          Medium                         Low
35. Hot Spots at Rack Inlet
Paul Lin. How to Fix Hot Spots in the Data Center. APC by Schneider Electric. Retrieved on Feb 24, 2019 from https://www.apc.com/salestools/VAVR-9GNNGR/VAVR-9GNNGR_R0_EN.pdf
36. Hot Spots Monitoring
▪ Hot spots usually occur in the top section of the rack inlet
▪ They result from inadequate use of cooling systems, not insufficient cooling capacity
▪ The main cause of hot spots is inefficient airflow management
▪ Hot spots can be identified using measurements and/or CFD modelling
Source: A Radmehr, J Fitzpatrick, K Karki, 2018. Optimizing Cooling Performance of a Data Center. ASHRAE Journal, July 2018.
37. Best Practices for Fixing Hot Spots
Hot-Aisle Containment | Cold-Aisle Containment
1. Manage airflow in the rack
2. Manage airflow in the room
3. Relocate problem loads
4. Change the location of the air temperature sensor
5. Allow DCIM to control airflow of the cooling units
http://wireraven.com/wp-content/uploads/2017/09/HAC-hot-aisle-cold-aisle-containment.jpg https://www.sourceups.co.uk/wp-content/uploads/2015/12/Thermal-Containment-2-1024x595.jpg
39. Performance Metrics
▪ Several performance metrics can be used to evaluate the performance of a Data Center in terms of availability, reliability, and efficiency
▪ Energy and water efficiency of a Data Center are commonly evaluated using:

PUE (Power Usage Effectiveness) = Total Facility Energy Consumption / IT Energy Consumption
pPUE (Partial Power Usage Effectiveness) = Specific Equipment Energy Consumption / IT Energy Consumption
WUE (Water Usage Effectiveness) = Total Facility Water Consumption / IT Energy Consumption
Source: The Green Grid
40. Performance Metrics
Total Facility Energy Usage      kWh      1,000
IT Energy Consumption            kWh        520
Cooling Equipment Energy Usage   kWh        380
Total Facility Water Usage       L          600
PUE                              kWh/kWh    1.9
pPUE                             kWh/kWh    1.7
WUE                              L/kWh      1.2
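The figures in the worked example follow directly from the metric definitions; a minimal check in Python (pPUE is assumed here to be scoped to the cooling subsystem):

```python
# The worked example above, computed directly from the metric definitions.
# pPUE is assumed scoped to the cooling subsystem (IT + cooling energy over IT).
facility_kwh = 1000.0
it_kwh       = 520.0
cooling_kwh  = 380.0
water_l      = 600.0

pue  = facility_kwh / it_kwh            # Power Usage Effectiveness
ppue = (it_kwh + cooling_kwh) / it_kwh  # partial PUE for the cooling subsystem
wue  = water_l / it_kwh                 # Water Usage Effectiveness, L per IT kWh

print(f"PUE={pue:.2f}, pPUE={ppue:.2f}, WUE={wue:.2f} L/kWh")
```

Rounded to one decimal place these reproduce the tabulated 1.9, 1.7, and 1.2.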
41. DCIM
The following is a breakdown of the general components commonly found in DCIM (Data Center Infrastructure Management) solutions:
- Capacity Management
- IT Rack Utilization
- Power Management
- Cooling/Temperature Management
- Asset Management
- Change Management
43. Liquid-Cooled ITE
44. Liquid-Cooled ITE
Air-Cooled ITE | Hybrid Air/Water-Cooled ITE
▪ Larger heat densities than Air-Cooled ITE
▪ Higher entering water temperatures (maybe no chiller is needed?!)
▪ Potential for improved CapEx, OpEx and TCO
▪ Still under R&D
Source: V Sorell, B Carter, R Zeighami, S Smith, R Steinbrecher, 2015. Liquid-Cooled IT Equipment in Data Centers. ASHRAE Journal, December 2015
45. Waste Heat Recovery
Source: http://indiannexus.com/
46. Prefabricated Modular Data Centers
Source: https://carrier.huawei.com/en/products/network-energy/data-center-energy/prefabricated-modular-datacenter
• Stable operation from -40°C to +55°C
• 25 years of service life
• Magnitude-9 earthquake resistance
• Fire-resistant for 120 minutes
• In-row variable-frequency air conditioners
• Hot/cold-aisle technology
• Supports free cooling devices
• Reduces TCO by at least 10%
49. Google’s Data Center in Europe
https://renewablesnow.com/news/google-plans-renewables-powered-danish-data-centre-633943/
50. Microsoft’s Data Center in the US
https://acibuilds.com/project/microsoft-data-center/
51. Facebook’s Data Center in Europe
https://code.fb.com/data-center-engineering/facebook-in-ireland-our-newest-data-center/
53. ASHRAE Technical Committee 9.9
Source: ASHRAE TC9.9
54. ASHRAE TC9.9 Datacom Book Series
Source: ASHRAE TC9.9
55. Energy Standard for Data Centers
Source: ASHRAE TC9.9
56. Energy Standard for Data Centers
Source: ASHRAE TC9.9
58. Summary
Source: ASHRAE TC9.9
1. Digital civilization → rapid growth in the IT and Data Center industries
2. ASHRAE Thermal Guidelines → the most reliable source for data center operating envelopes
3. Various factors determine the operating envelope → reliability, noise, ITE/cooling power, etc.
4. Mostly operated using 100% recirculated air → no air-side economization
5. The past relied on VCS → the present relies on evaporative cooling and water-side economization
6. Improper airflow management → hot spots
7. Hot-aisle and cold-aisle containment → among the best practices to prevent hot spots
8. Future trends → water-cooled ITE, modular and underwater data centers, and waste heat recovery
9. ASHRAE TC 9.9 → the largest and most active TC, with reliable and up-to-date publications
59. Thank you :)
Ahmed Abdel-Salam
Ph.D., P.Eng., CEM, BEMP, HBDP, LEED GA
Email: ahmed.abdel-salam@usask.com
Cell: +1 (306) 850-1527
Image Source: https://www.greenbiz.com/