Energy Efficient Server Rooms at the University of Cambridge

• Electricity Incentivisation Scheme (EIS) at the University of Cambridge
• Design of Engineering’s Data Centre cooling system
• Energy use from 2010 onwards
• Next steps

Energy Efficient Server Rooms at the University of Cambridge

  1. Energy Efficient Server Rooms at the University of Cambridge
     David Green, dsg1000@eng.cam.ac.uk, Department of Engineering
  2. Presentation Overview
     • Electricity Incentivisation Scheme (EIS) at the University of Cambridge
     • Design of Engineering’s Data Centre cooling system
     • Energy use from 2010 onwards
     • Next steps
  3. The Electricity Incentivisation Scheme (EIS)
     • Financial incentives to use electricity more efficiently
     • Annual allowances at departmental level
     • Financial reward if usage is below the allowance
     • Financial penalty if the allowance is exceeded
     • Implemented 1 August 2008
     • Energy & Carbon Reduction Project
     In 2010/11 electricity usage was 4.4% below target, saving:
     • £0.51 million
     • 4,950 MWh
     • 2,678 tonnes CO2
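The reward/penalty mechanism above lends itself to a simple illustration. The sketch below is a hypothetical calculation: the allowance and the incentive rate per MWh are made-up figures for demonstration, not published EIS parameters.

```python
# Hypothetical illustration of the EIS reward/penalty mechanism.
# The allowance and incentive rate are illustrative assumptions only;
# the real scheme parameters are set by the University.

def eis_settlement(allowance_mwh: float, usage_mwh: float,
                   incentive_gbp_per_mwh: float = 100.0) -> float:
    """Return a positive reward (GBP) if usage is below the allowance,
    or a negative penalty if the allowance is exceeded."""
    saved_mwh = allowance_mwh - usage_mwh
    return saved_mwh * incentive_gbp_per_mwh

if __name__ == "__main__":
    # A department using 4.4% less than an (assumed) 10,000 MWh allowance.
    allowance = 10_000.0
    usage = allowance * (1 - 0.044)
    print(f"Settlement: £{eis_settlement(allowance, usage):,.0f}")
```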
  4. Department of Engineering Overview
     • Accounts for around 10% of the University
     • Activities based in 7 buildings
     • Around 600 members of staff
     • Four-year M.Eng course, around 1,200 students
     • Postgraduate student numbers: 2011 (792), 2012 (830)
  5. Server Room Cooling Project - Introduction
     The Problem
     • Increase cooling capacity to support future purchases
     • Minimise all aspects of running costs and carbon footprint
     The Solution
     • Review cooling arrangements, expand and consider options
     • Alternative approach to cooling
     The Results
     • PUE of 1.1
  6. The Problem
     Background
     • Initially a distributed arrangement
     • Centralised computing resources in two computer rooms (34 racks, 12 racks)
     Pre-2010 Cooling Arrangement
     • Refrigerant-based CRAC system, full recirculation via under-floor plenum
     • 63kW plug-load
     Key Project Drivers
     • The University’s Electricity Incentivisation Scheme (EIS)
     • Further server purchases planned
     • IT electricity consumption is a significant part of the Department’s energy base load
     Approach
     • KJ Tait feasibility study
     • Support from the University’s Estate Management
     • Computing staff
     • Salix funding
  7. The Problem - Server Room Cooling Project
     • Drive to reduce energy costs and carbon footprint
     • Consolidation of server rooms
     • Power management
     • Existing DX cooling equipment could not cope with future plans
     • To implement a solution in a live data centre
  8. The Problem - Server Room Cooling Project
  9. The Solution – Options Considered
     • Cold aisle containment
     • Increasing existing CRAC capacity
     • In-rack cooling with chilled water
     • Evaporative cooling
     [Chart: ambient air temperature and relative humidity]
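Evaporative cooling is viable here because even warm UK air rarely carries enough moisture to push the wet-bulb temperature above a usable supply temperature. As a rough illustration only, the sketch below estimates the supply temperature a direct evaporative cooler could deliver from ambient dry-bulb and RH, using Stull's (2011) empirical wet-bulb approximation and an assumed pad effectiveness of 0.9; this is not a calculation taken from the presentation.

```python
import math

def wet_bulb_c(t_dry_c: float, rh_pct: float) -> float:
    """Approximate wet-bulb temperature (Stull, 2011 empirical fit),
    valid roughly for 5-99% RH and -20 to 50 C dry-bulb."""
    return (t_dry_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_dry_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

def evap_supply_temp_c(t_dry_c: float, rh_pct: float,
                       effectiveness: float = 0.9) -> float:
    """Direct evaporative cooler: supply air is pulled from the dry-bulb
    towards the wet-bulb by the (assumed) pad effectiveness."""
    t_wet = wet_bulb_c(t_dry_c, rh_pct)
    return t_dry_c - effectiveness * (t_dry_c - t_wet)

if __name__ == "__main__":
    # A warm UK summer day: 30 C at 40% RH.
    print(f"Estimated supply air: {evap_supply_temp_c(30.0, 40.0):.1f} C")
```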
  10. The Solution - Air Flow
  11. The Solution – Temperature
      • Maximum temperature 24C
  12. The Solution – Temperature and Flow
      • Constant flow and temperature
      • Evaporative cooling
      [Diagram labels: data rack, damper]
  13. The Solution – Temperature
      • Ventilation plus attemperation
      • Evaporative cooling plus attemperation
      • Evaporative cooling
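Slides 12 and 13 describe holding a constant supply flow and temperature by blending warm rack exhaust back into the incoming air (attemperation) and wetting the pads only when fresh air alone is too warm. The sketch below is a hypothetical, simplified control step along those lines; the setpoint and the mode logic are assumptions, not the installed EcoCooling controls.

```python
# Simplified, hypothetical control step for the arrangement the slides
# describe: a recirculation damper attemperates cool ambient air with rack
# exhaust, and evaporative cooling takes over when ambient air is too warm.
# Setpoints and logic are illustrative assumptions, not the vendor algorithm.

SETPOINT_C = 21.0     # assumed cold-aisle supply target
MAX_SUPPLY_C = 24.0   # slide 11: maximum temperature 24C

def control_step(ambient_c: float, rack_exhaust_c: float,
                 evap_supply_c: float) -> dict:
    """Pick an operating mode and a recirculation damper fraction."""
    if ambient_c >= SETPOINT_C:
        # Fresh air alone is too warm: wet the pads, no recirculation.
        supply = min(evap_supply_c, MAX_SUPPLY_C)
        return {"mode": "evaporative", "recirc": 0.0, "supply_c": round(supply, 1)}
    # Cool ambient: ventilate and blend in rack exhaust to hold the setpoint.
    recirc = (SETPOINT_C - ambient_c) / (rack_exhaust_c - ambient_c)
    recirc = max(0.0, min(1.0, recirc))
    supply = ambient_c + recirc * (rack_exhaust_c - ambient_c)
    return {"mode": "ventilation + attemperation",
            "recirc": round(recirc, 2), "supply_c": round(supply, 1)}

if __name__ == "__main__":
    print(control_step(ambient_c=8.0, rack_exhaust_c=32.0, evap_supply_c=7.5))
    print(control_step(ambient_c=29.0, rack_exhaust_c=35.0, evap_supply_c=22.5))
```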
  14. The Solution – Design Running Cost
  15. The Solution - Installation
      • Cold aisle containment
      • 6 EcoCooling CRECs giving 150kW N+1
      • EC extract fans
      • Ambient air through louvre
      • Self-contained plant room
      • No raised floor
  16. The Solution - Installation
  17. The Solution - Installation
      • Mechanical cooling plant
      • Data Centre (34 racks, 150kW)
      • Electrical supply distribution and metering
  18. The Results – Key Points
      • System has been operational since December 2010
      • IT load has risen from 63kW to 95kW
      • Mix of low and medium density servers
      • Update of air filtration and humidity control
      • Ambient conditions exceeded 30C with high RH
      • Cold aisle did not exceed 25C
      • Max RH 70%
      • PUE 1.1 over 2½ years
      • Annual savings of 200 tonnes of carbon and ~£40K
      • Some fan and equipment failures
      • Some visible dust
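The headline savings are consistent with a simple back-of-envelope estimate. The sketch below reconstructs them from the IT load and PUE figures on these slides; the electricity tariff and grid carbon factor are assumptions typical of the period, not numbers from the presentation.

```python
# Back-of-envelope reconstruction of the "~£40K and ~200 tonnes a year" claim.
# Tariff and grid carbon factor are assumed values, not presentation figures.

HOURS_PER_YEAR = 8760
TARIFF_GBP_PER_KWH = 0.09        # assumed electricity price
CARBON_KG_PER_KWH = 0.45         # assumed UK grid factor, circa 2011

it_load_kw = 95.0                # slide 18: IT load risen to 95kW
pue_old, pue_new = 1.65, 1.10    # slide 20

overhead_old_kw = (pue_old - 1.0) * it_load_kw   # cooling overhead with DX CRACs
overhead_new_kw = (pue_new - 1.0) * it_load_kw   # overhead with evaporative cooling
saved_kwh = (overhead_old_kw - overhead_new_kw) * HOURS_PER_YEAR

print(f"Energy saved : {saved_kwh / 1000:,.0f} MWh/year")
print(f"Cost saved   : £{saved_kwh * TARIFF_GBP_PER_KWH:,.0f}/year")
print(f"Carbon saved : {saved_kwh * CARBON_KG_PER_KWH / 1000:,.0f} tonnes CO2/year")
```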
  19. The Results – Energy Use 2010
      [Chart: June 2010, kWh used per day per consumer unit; series: air-con units kWh used, rack units kWh used]
      • IT load ~63kW
      • Cooling and lighting ~35kW
  20. The Results – Energy Use 2011
      [Charts: June 2010 and June 2011, kWh used per day per consumer unit; series: air-con units kWh used, rack units kWh used]
      • June 2010: PUE of 1.65
      • June 2011: PUE of 1.1
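PUE is total facility energy divided by IT (rack) energy, so it falls straight out of the two meter channels plotted on these slides. The sketch below shows that calculation with illustrative daily readings of the same order of magnitude as the charts (not transcribed values); the slides' exact figures will also reflect other facility overheads such as lighting.

```python
# PUE from the two daily meter channels shown on slides 19-20.
# Readings below are illustrative, roughly matching the chart magnitudes.

def pue(rack_kwh: float, cooling_kwh: float) -> float:
    """PUE = total facility energy / IT equipment (rack) energy."""
    return (rack_kwh + cooling_kwh) / rack_kwh

june_2010 = {"racks": 1500.0, "aircon": 850.0}   # pre-project, DX CRAC cooling
june_2011 = {"racks": 2250.0, "aircon": 120.0}   # evaporative cooling in service

print(f"June 2010 PUE ~ {pue(june_2010['racks'], june_2010['aircon']):.2f}")
print(f"June 2011 PUE ~ {pue(june_2011['racks'], june_2011['aircon']):.2f}")
```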
  21. Design Development 2012 onwards – temperature, humidity & air quality monitoring
      • Enhanced filtration and air quality monitoring
      • Humidity limiting control algorithm and web interface
      • Fan updates and flow dampers
      • Low levels of equipment failure
      • Hosting from other university departments
      • Fire suppression
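Slide 21 mentions a humidity limiting control algorithm but does not describe it. Below is a hypothetical sketch of how such a limit might work: back off the evaporative pads as cold-aisle RH approaches a ceiling, falling back to ventilation-only. The 70% ceiling comes from slide 18; the throttle threshold and linear ramp are assumptions, not the actual algorithm.

```python
# Hypothetical humidity-limiting rule: reduce evaporative pad wetting as
# cold-aisle RH approaches a ceiling, reverting to ventilation-only beyond it.
# Slide 18 reports a 70% maximum RH; the other thresholds are assumptions.

RH_LIMIT = 70.0       # % RH ceiling in the cold aisle (slide 18)
RH_THROTTLE = 60.0    # assumed % RH at which to start reducing wetting

def evap_duty(cold_aisle_rh: float) -> float:
    """Return pad wetting duty in [0, 1] given the measured cold-aisle RH."""
    if cold_aisle_rh >= RH_LIMIT:
        return 0.0                               # ventilation-only
    if cold_aisle_rh <= RH_THROTTLE:
        return 1.0                               # full evaporative cooling
    # Linear ramp-down between the throttle point and the hard limit.
    return (RH_LIMIT - cold_aisle_rh) / (RH_LIMIT - RH_THROTTLE)

if __name__ == "__main__":
    for rh in (45.0, 62.0, 68.0, 72.0):
        print(f"RH {rh:>4.0f}% -> pad duty {evap_duty(rh):.2f}")
```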
  22. The Results – 2013 energy use
  23. The Results – 2013 temperature & humidity logs
  24. Initial Results – Contamination and Server Failure
      • Initially limited filtration, now extensive and multi-staged
      • Some visible dust and black particulates
      • Basic analysis showed the particulates to consist of dust, possibly pollen particles, and diesel engine exhaust particulates
      • There has been a small number of fan failures on servers, but this is difficult to attribute directly to the cooling system
  25. The Results – Reliability and Maintenance
      • Initially, maintenance was not comprehensively scheduled
      • Location: a surprising amount of large fibres caught by the insect screen in spring
      • 3-monthly maintenance of the equipment is required
      • Routine ‘deep’ cleaning of the facility to ISO 7
      • With an internal installation, room cleanliness needs to be maintained
  26. Visibility of Building Performance - Energy Dashboard
      • Visibility of actual building performance
      • Digital signage
      • Encourage individuals to ‘own’ and take responsibility
      • ‘Buy-in’ now apparent in some equipment purchases
      • Individual racks are metered
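Slide 26 notes that individual racks are metered and fed to a digital-signage dashboard. The sketch below shows one plausible aggregation for such a dashboard tile: total IT load, share per research group, and a live PUE when the cooling meter is included. The rack names, groupings, and readings are invented for illustration and are not from the presentation.

```python
# One plausible aggregation of per-rack meter readings for a dashboard tile.
# Rack names, group ownership, and readings are invented for illustration.

from collections import defaultdict

rack_kw = {"rack-A1": 3.2, "rack-A2": 4.1, "rack-B1": 2.7, "rack-B2": 5.0}
rack_owner = {"rack-A1": "Fluids", "rack-A2": "Fluids",
              "rack-B1": "Electrical", "rack-B2": "Electrical"}
cooling_kw = 1.4

# Sum rack power by owning group.
per_group = defaultdict(float)
for rack, kw in rack_kw.items():
    per_group[rack_owner[rack]] += kw

it_total = sum(rack_kw.values())
print(f"IT load: {it_total:.1f} kW, live PUE: {(it_total + cooling_kw) / it_total:.2f}")
for group, kw in sorted(per_group.items()):
    print(f"  {group:<12} {kw:.1f} kW ({kw / it_total:.0%} of IT load)")
```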
  27. Engineering’s Data Centre electrical loads
      • 300 MWh electrical base load
      • Pre-2010: 35% = server rooms
      • Now 2 x Data Centres and 23% of base load
      • Purchasing vs energy performance
  28. Summary
      • Evaporative cooling has resulted in significant energy and carbon savings
      • The second Data Centre in Engineering is now also based on this technology
      • Interest from academic and commercial sectors
      • Catalyst for good practice in terms of energy and carbon reduction
      • Option for hot air exhaust use in a natural ventilation strategy: purge/enhance stack ventilation
