B7 merlin

1,361 views

Published on

Published in: Technology
0 Comments
2 Likes
Statistics
Notes
  • Data centre modules are manufactured and transported, which limits the environmental impact of construction. Modules can be deployed and re-deployed anywhere with power and communications, so they are 100% reusable. Modules, not containers: the internal space is highly configurable. Careful choice of components makes them highly recyclable, and the design re-uses an existing building. Close to 100% fresh air cooling. A revolution on data: measurements in every corridor, aisle and rack feed advanced control that regulates air conditioning without wasting fan power. Hot and cold aisles cut energy loss to a minimum. Less than a third of a swimming pool of water is used per year. Modules have very low embedded carbon and use sustainable materials, with flywheels rather than batteries.
  • If every datacentre in the UK was as efficient as Merlin, we would save 2300 MW of power.
  • Tier 3 is described generically as 'concurrently maintainable': every single component of the datacentre can be changed without any impact on operational running. We join an exclusive club: only one existing constructed datacentre in Europe is certified as Tier 3 by the Uptime Institute (Fujitsu, North London), and one additional site has certified design documents (Telecommunications and Data Centre Services, UK). One in 10 globally.
  • Modules are buildings with a structural guarantee of 60 years and can be mounted outside without a building. Emphasise modules, NOT containers: containers and sea crates are built for high density and are not configurable or flexible, whereas the datacentre modules are flexible and can be configured for client needs (e.g. security). Modules can be sold to clients at the end of a deal: redeployable, reusable.
  • Capex is the largest element of the run cost, making up almost half. Data centre modules are sized to support the use of standard 'off the shelf' components, manufactured in large production runs at lowest cost. Standard parts also mean faster delivery, so we can deploy new modules in 22 weeks. Modularity allows us to scale quickly and avoid large bench costs. Some 'fixed' costs are converted to variable, e.g. flywheels versus UPS.

    1. Inéov, 7th of June
       Green IT & DC Consolidation
       Yannick Tricaud
    2. Agenda
       • Green IT: context & foundations
       • Green IT: strategy & outlook
       • CG datacenter: the world's most sustainable data center
       • Datacenter consolidation
    3. Green IT / Green Computing: a few numbers
       IT energy management is the analysis and management of energy demand within the information technology arena.
       • IT energy demand accounts for approximately 2% of global energy demand, roughly the same level as aviation. IT can account for 25% of a modern office building's energy cost.
       • The main sources of IT energy demand are PCs and monitors (39% of energy use), followed by data centers and servers (23%).
       • Information and communication infrastructure (ICT) consumes 13.5% of the electrical power in France and is responsible for 5% of the country's CO2 footprint.
       • PC electricity consumption increases by 5% every year.
       • Electrical power accounts for 10% of the CIO budget.
       • The cost of electrical power over a PC's lifecycle is now much higher than its purchase price.
       • Manufacturing a PC in China emits 24 times more CO2 than one year of usage in France.
       • Average server utilization rate: below 6% (even below 3% for 30% of the servers installed in a given DC). Roughly 5 to 10% for Wintel servers, 20% for Unix, 50 to 60% for large systems, i.e. around 20% on average for a large company (IDC). (A rough consolidation estimate follows after this slide.)
       • DC utilization rate: 56% of capacity.
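       A minimal back-of-the-envelope sketch of the consolidation implied by those utilization figures. Every number except the roughly 6% average utilization quoted on the slide is an invented, illustrative assumption, not a figure from the deck.

          # Rough consolidation estimate based on the utilization figures above.
          # All values except the ~6% average utilization are illustrative assumptions.
          import math

          n_servers = 200            # assumed size of an x86 estate
          avg_utilization = 0.06     # ~6% average server utilization (slide figure)
          target_utilization = 0.60  # assumed safe ceiling for a virtualized host
          headroom = 0.8             # assumed margin for peaks and failover

          consolidation_ratio = (target_utilization * headroom) / avg_utilization
          hosts_needed = math.ceil(n_servers / consolidation_ratio)
          print(f"~{consolidation_ratio:.0f}:1 consolidation, "
                f"{hosts_needed} hosts for {n_servers} servers")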
    4. Green IT / Green Computing: approaches
       Different ways to optimize energy consumption and ecological footprint:
       • Product longevity
         - The PC manufacturing process accounts for 70% of the natural resources used in the life cycle of a PC.
         - The biggest contribution to green computing is usually to prolong the equipment's lifetime.
         - Manufacturing a new PC has a far bigger ecological footprint than manufacturing a new RAM module to upgrade an existing machine, a common upgrade that saves the user from having to purchase a new computer.
       • Software and deployment optimization
         - Algorithmic efficiency: the efficiency of algorithms and programs affects the amount of computer resources required (e.g. the Harvard study on Google search; Green IT benchmarks). A small sketch follows after this slide.
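       A minimal illustration of the algorithmic-efficiency point: the sketch below performs the same duplicate-finding task two ways and counts the work done, showing how algorithm choice changes the resources required. The data set and sizes are arbitrary, not from the deck.

          # Two ways to find duplicate values, with rough operation counts.
          import random

          data = [random.randrange(5_000) for _ in range(2_000)]

          # Quadratic approach: compare every pair of elements.
          quadratic_ops = 0
          dupes_quadratic = set()
          for i in range(len(data)):
              for j in range(i + 1, len(data)):
                  quadratic_ops += 1
                  if data[i] == data[j]:
                      dupes_quadratic.add(data[i])

          # Linear approach: remember what has already been seen.
          linear_ops = 0
          seen, dupes_linear = set(), set()
          for x in data:
              linear_ops += 1
              if x in seen:
                  dupes_linear.add(x)
              seen.add(x)

          assert dupes_quadratic == dupes_linear
          print(f"pairwise comparisons: {quadratic_ops:,}  vs  single pass: {linear_ops:,}")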
    5. Green IT / Green Computing: approaches (continued)
       • Software and deployment optimization (continued)
         - Resource allocation: algorithms can also be used to route data to data centers where electricity is less expensive (e.g. recent MIT research suggesting around 40% savings). The same approach can favour green energy usage, or pick locations based on environmental factors: local availability of renewable energy, a climate that allows outside air to be used for cooling, or sites where the heat produced can be reused. (A sketch of price-aware placement follows after the next slide.)
         - Virtualization: the abstraction of primary resources (computer, disk, cloud, ...). Main benefits:
             - Reduced operating and capital costs
             - Improved utilization of computing resources (CUOD)
             - Improved security layers (user isolation, network segregation, ...)
             - Greater IT staff productivity (master deployment, ...)
    6. Green IT / Green Computing: power management
       • Terminal servers: users at a terminal connect to a central server; all of the actual computing is done on the server, while the end user experiences the operating system on the terminal. Combined with thin clients, which use up to 1/8 the energy of a normal workstation, this reduces energy costs and consumption. Benefits:
         - Lower power consumption
         - Better TCO (asset management, capitalization, ...)
         - Data security / data leak protection
       • Operating system support: Windows 2000 was the first NT-based operating system to include power management and Group Policy. It allows the OS to control power-saving features such as undervolting, Intel SpeedStep, and further modes (standby, hibernation, ...).
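       A hypothetical sketch of the resource-allocation idea from slide 5 (price-aware placement of work across data centers). The site names, tariffs, carbon intensities and capacities are invented for illustration; they are not from the deck.

          # Hypothetical price-aware workload placement; all figures are invented.
          from dataclasses import dataclass

          @dataclass
          class Site:
              name: str
              price_eur_per_kwh: float   # assumed electricity price
              kg_co2_per_kwh: float      # assumed grid carbon intensity
              free_capacity_kw: float    # assumed spare IT capacity

          def pick_site(sites, load_kw, carbon_weight=0.0):
              """Choose the feasible site with the lowest blended cost per kWh."""
              feasible = [s for s in sites if s.free_capacity_kw >= load_kw]
              if not feasible:
                  raise RuntimeError("no site can host this load")
              return min(feasible, key=lambda s: s.price_eur_per_kwh
                                                 + carbon_weight * s.kg_co2_per_kwh)

          sites = [Site("Paris", 0.11, 0.060, 80),
                   Site("Dublin", 0.14, 0.300, 200),
                   Site("Oslo", 0.08, 0.020, 120)]
          print(pick_site(sites, load_kw=70, carbon_weight=0.05).name)  # -> Oslo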
    7. Green IT / Green Computing: datacenter
       Focus is now clearly on facilities and power consumption!
    8. Capgemini Merlin DC: the world's most sustainable data center
    9. What we set out to achieve: a world-leading data centre
       • Sustainability: reduce power, reduce water, reduce contaminants
       • Energy efficiency: world-leading PUE
       • Resilient: Tier 3 certification
       • Secure: IL3 with flexibility and limited impact of change
       • Cost per square metre in the lowest quartile: equipment choice, asset depreciation
       • Modular: reuse, recycle, redeploy
       • Location: reuse an existing building, low carbon impact
    10. Sustainability
       • Recycling / re-use, reduction in embedded carbon
       • Existing building
       • Dust, noise and other pollution associated with site works is almost entirely eliminated
       • Modules 100% reusable
       • 95% is recyclable
       • Sustainable materials
       • No batteries: flywheel UPS
       • Merlin uses 30% less than traditional water-cooled data centre facilities
       • A centralised production facility reduces transport and vehicle movements and the associated CO2 emissions
    11. Energy efficiency
       • 100% fresh air cooling (subject to record weather and emergency conditions)
       • Hot and cold aisle containment
       • Flywheel UPS
       • Variable-speed fans / air flow velocity sensors
       • PIR lighting
    12. Power Usage Effectiveness (PUE)
       [Benchmark chart comparing PUE values on a scale from 1 to 3]
       • Merlin: 1.09 (factory tested)
       • HP Wynyard: 1.16
       • Yahoo, Quincy: 1.21
       • Microsoft, Chicago: 1.22
       • Google E: 1.12
       • Also shown: Fujitsu London, Toltec UK post upgrade, and industry average, best practice and state-of-the-art reference points
       Source: Data Centre Efficiency Measures; Google Efficient Computing [2009]
    13. Proven Power Usage Effectiveness (PUE)
       The PUE of 1.11 is a proven annual PUE and includes all ancillary energy usage, including:
       • Cooling
       • Ventilation
       • Humidification
       • Lighting
       • UPS losses
       • Transformer losses
       • Generator losses (crankcase heaters, water pump, battery charger)
       • Electrical distribution losses
       The BladeRoom in isolation achieves a PUE of 1.05.
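       PUE is simply total facility energy divided by IT equipment energy. The sketch below runs that calculation over an assumed annual breakdown of the overhead categories listed above; the kWh figures are invented for illustration and are not Merlin measurements.

          # PUE = total facility energy / IT equipment energy, over one year.
          # All kWh figures below are invented for illustration only.
          it_energy_kwh = 8_760_000          # assumed 1 MW average IT load for a year
          overheads_kwh = {
              "cooling and ventilation": 520_000,
              "humidification": 30_000,
              "lighting": 25_000,
              "UPS losses": 180_000,
              "transformer losses": 90_000,
              "generator ancillaries": 40_000,
              "electrical distribution losses": 80_000,
          }
          total_kwh = it_energy_kwh + sum(overheads_kwh.values())
          pue = total_kwh / it_energy_kwh
          print(f"annual PUE = {pue:.2f}")   # ~1.11 with these assumed numbers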
    14. Proven PUE: testing regime
       Climate emulator: temperature from -5°C to 48°C, humidity from 5% RH to 100% RH.
       • Development of a bespoke climate emulator
       • Provides any climate condition as a fresh-air source to the BladeRoom
       • Proven control functionality in all the different psychrometric control zones
       • Proven control system functionality at all the transition points between psychrometric zones
       • The final test heated the incoming air to 48°C, which was cooled to a supply air temperature of 24°C with no mechanical cooling
       Capgemini believes that Merlin is currently the only data centre in the world that has had its PUE fully proven under all operating and climate conditions.
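       One way to picture "psychrometric control zones" is a controller that picks a cooling mode from outdoor conditions. The sketch below is a hypothetical illustration only: the thresholds are invented and are not Merlin's or BladeRoom's actual control settings.

          # Hypothetical mode selection for a fresh-air / evaporative design.
          # Thresholds are invented for illustration, not actual control settings.
          def cooling_mode(outdoor_c: float, outdoor_rh: float,
                           supply_target_c: float = 24.0) -> str:
              if outdoor_c <= supply_target_c - 2:
                  return "free cooling (blend return air up to the supply target)"
              if outdoor_c <= 36 and outdoor_rh <= 80:
                  return "evaporative assist (adiabatic cooling of the fresh air)"
              return "fallback mode (recirculation / mechanical assistance)"

          for temp_c, rh in [(-5, 60), (18, 50), (32, 40), (48, 30)]:
              print(f"{temp_c:>3} degC, {rh}% RH -> {cooling_mode(temp_c, rh)}")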
    15. Resiliency with a world-leading PUE
       • Merlin is currently undergoing design certification with the Uptime Institute and is expected to be fully certified by March 2011
       • The PUE achieved by Merlin proves that, with the correct design, a Tier 3 topology can achieve an extremely high level of energy efficiency
       Capgemini UK, Merlin Data Centre, 1 March 2011
    16. Secure
       • Merlin has been built to IL3 level with the ability to upgrade to IL4
       • Upgrade to IL4 planned
       • Modules can be individually built to meet client-specific needs
       • The total security system incorporates:
         - Infra-red beams securing the boundary
         - Alarmed fencing
         - Security-managed entrance gate with a PAS 68 protection rating
         - Bristorm anti-vehicle barrier
         - Motion-detection CCTV system integrated with all intrusion alarm points
         - Number plate recognition camera system
         - Access control system with biometric readers
    17. Modular
       • Reduced bench time
       • Significantly reduced lead time
       • Recyclable, re-usable and re-deployable
    18. Location
       [Site selection map: areas marked flood risk, too damp, too cold, too warm, limited telco, and the preferred location]
       • Perfect for fresh air cooling all year round
       • Maximum temperature since records began: 31.5°C
       • The Merlin solution controls cooling up to 36°C without DX or chilled water systems
       • Excellent power availability
       • Excellent telco provision
       • Free from flood risk, air lanes and other hazards
       • Within synchronous network range of Toltec for active-active business continuity
       Ratified by Hurley Palmer Flatt, design consultants
    19. Data centre module components
       A data centre module consists of 6 sections (7 for a power density of 2,000 W/m²).
       [Diagram: view into the cold aisle, air optimiser, power distribution, 120 racks forming hot and cold aisles]
    20. Data centre module: normal air flow
    21. Merlin on film
    22. Datacenter consolidation
       Based on the previous elements, it is now clear that DC consolidation will become a strategic initiative for large companies.
       Four categories of business benefit from data center consolidation:
       • Reduced costs (staff, buildings and complexity)
       • Keeping up with business challenges
       • Improving service levels and availability
       • Minimizing the impact of external pressures
    23. In a nutshell
       • Lifecycle management for every piece of the infrastructure, to amortize grey (embodied) energy and reduce the impact of manufacturing and recycling
       • Leverage software capabilities to automatically switch off or stand by PCs and laptops, and use CUOD for large batch processing / BCRS (e.g. a PC left on 24x7 costs around €30 more every year; a rough estimate follows after this slide)
       • Virtualization and cloud computing to optimize the utilization rate of hardware and other equipment
       • Optimize power consumption for cooling and DC facilities
       • Reuse waste energy (e.g. hot air) for external needs (e.g. heating)
       • Encourage renewable energy for DC provisioning (e.g. Google initiatives and research)
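       To put a rough number on the always-on PC point above, the sketch below estimates the extra annual electricity cost of leaving a desktop on outside working hours. The idle wattage and tariff are assumptions, not figures from the deck.

          # Rough cost of leaving a PC on 24x7 instead of only during work hours.
          # Idle power and tariff are assumed values.
          idle_power_w = 40            # assumed idle draw of a desktop
          tariff_eur_per_kwh = 0.12    # assumed electricity price
          working_hours = 8 * 220      # assumed hours of actual use per year
          always_on_hours = 24 * 365

          extra_kwh = idle_power_w * (always_on_hours - working_hours) / 1000
          extra_cost = extra_kwh * tariff_eur_per_kwh
          print(f"extra energy: {extra_kwh:.0f} kWh/year, "
                f"extra cost: ~{extra_cost:.0f} EUR/year")

       With these assumed values the result lands in the same tens-of-euros range as the €30 per year quoted on the slide.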
    24. Summary, Q and A
    25. Backup slides
    26. Cost per m² in the lowest quartile
       • Asset depreciation is the largest portion of run costs
       • We have maximized the use of standard components to minimize costs
       [Cost breakdown chart: design company costs 7%, switchgear & cabling 7%, power generation 6%, cooling 5%, UPS 5%, room build 5%, fire systems 3%, office rooms 2%, CCTV, access & BMS 1%, landscaping 1%]
    27. Evaporative cooling: principles
       Evaporative coolers are energy efficient because the blower motor only has to move air, rather than compress a refrigerant gas as in an air conditioner, using 5 to 10 times less energy.
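       Direct evaporative cooling is usually reasoned about with the wet-bulb effectiveness relation: the supply temperature approaches the outdoor wet-bulb temperature. The sketch below applies that relation; the effectiveness value and the example conditions are assumed for illustration, not Merlin figures.

          # Direct evaporative cooling: T_supply = T_dry - eff * (T_dry - T_wet).
          # Effectiveness and example conditions are assumed values.
          def supply_temp_c(dry_bulb_c: float, wet_bulb_c: float,
                            effectiveness: float = 0.85) -> float:
              return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

          # A hot, fairly dry day: 36 degC dry bulb with a 22 degC wet bulb.
          print(f"supply air ~ {supply_temp_c(36.0, 22.0):.1f} degC")  # ~24.1 degC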
    28. Flywheel UPS: principles
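       For context on why flywheels can replace batteries for short ride-through (a general calculation, not a description of Merlin's specific units): a flywheel stores rotational energy E = 1/2 * I * omega^2. The rotor mass, radius, speed and load below are assumed figures.

          # Flywheel energy storage: E = 1/2 * I * omega^2 (solid-cylinder rotor).
          # Mass, radius, speed and load are assumed figures for illustration.
          import math

          mass_kg = 600.0       # assumed rotor mass
          radius_m = 0.25       # assumed rotor radius
          rpm = 7_700.0         # assumed rotational speed

          inertia = 0.5 * mass_kg * radius_m ** 2      # I of a solid cylinder
          omega = rpm * 2 * math.pi / 60               # angular speed in rad/s
          energy_j = 0.5 * inertia * omega ** 2

          load_kw = 250.0                              # assumed critical load
          usable_fraction = 0.5                        # assumed usable energy share
          ride_through_s = energy_j * usable_fraction / (load_kw * 1000)
          print(f"stored ~{energy_j / 1e6:.1f} MJ, "
                f"ride-through ~{ride_through_s:.0f} s at {load_kw:.0f} kW")

       A ride-through of seconds to tens of seconds is enough to bridge the gap until the standby generators accept the load, which is the role batteries would otherwise play.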
