Eurotech 30 01 2013
Eurotech 30 01 2013 Presentation Transcript

  • 1. Aurora Tigon: Green, Dense, Standard HPC. G. Tecchiolli, CTO. Cineca, Bologna, March 30th, 2013
  • 2. Eurotech: HPC R&D and Operations. USA, Japan, India, UK, France, Italy, Singapore.
  • 3. Aurora Tigon: Unleash the hybrid power
  • 4. From 1 node to large petascale systems: Node, Backplane, Chassis, Cooling System, Rack
  • 5. The Aurora Tigon: Unleash the hybrid power. Key features: High performance density: 256 CPUs, 256 accelerators, up to 350 TFlops in just 1.5 m². Energy efficiency: the Aurora direct cooling targets a datacenter PUE of 1.05, with no need for air conditioning and up to 50% less energy. Programmability and compatibility: based on a standard HPC cluster architecture, with 100% compatibility with existing applications. Flexible liquid cooling: all components are cooled by water, with coolant temperatures from 18°C to 52°C and variable flow rates. Reliability: 3 independent sensor networks, soldered memory, no moving parts, uniform cooling, quality controls.
  • 6. The Aurora Tigon node card: cooling plate, Infiniband QDR, 3D Torus, 2 x NVIDIA K20, 2 x Intel Xeon E5 sockets, local disk, soldered RAM
  • 7. The setup: energy efficiency measurements according to the Green500 guidelines. All measurements were made with a calibrated power meter while the system ran HPL. System: Eurora supercomputer, 64 nodes, 128 CPUs, 128 GPUs. Node card: Intel Xeon E5-2687W (150 W), 2 x NVIDIA Tesla K20, 1 x Infiniband QDR. Ambient temperature: 20°C ± 1°C. Coolant temperature: 19°C ± 1°C. Coolant water flow rate: 120 lph ± 7 lph per Eurora board.
  • 8. HPL benchmark results. [Charts: Performance (GFlops) vs CPU frequency, Power (W) vs CPU frequency, and Performance/Power (GFlops/W) vs CPU frequency, over roughly 1.0 to 4.0 GHz; peak efficiency 3.15 ± 0.12 GFlops/W.] A sketch of the efficiency metric follows the transcript.
  • 9. 30% more efficient than #1 in the Green500. Final results: 3150 MFlop/s per watt, 1700 sustained GFlop/s per node. What does it mean? Each Eurora node (server), about the size of a laptop, can perform 1,700,000,000,000 floating point operations per second, 30 times more than a desktop. The Eurora system is currently the most energy-efficient standard x86-based system in the world at 3150 MFlop/s per watt, 15 times more efficient than an average desktop computer. (A back-of-envelope check follows the transcript.)
  • 10. Tigon: an energy-aware design. GPUs. Optimized design: no unused components, no fans, soldered components, dense architecture (with integrated interconnect). Optimized power conversion chain: to enable system-level energy efficiency; to enable data-center-level energy efficiency. Liquid cooling: to enable system-level energy efficiency when cold water is used; to enable data-center-level energy efficiency when hot water is used.
  • 11. "Standard" power distribution conversion steps. Data from Intel.
  • 12. Moving towards DC reduces the number of steps in power conversion. Data from Intel.
  • 13. Aurora power distribution. [Diagram: 230 V → 48 Vdc → 10 V, with an optional UPS; stage conversion efficiencies of 97% and 98%.]
  • 14. Gain in DC/DC conversion efficiency: in the DC/DC conversion, a gain of over 2% in efficiency, from 95.5% to 98% (existing DC/DC conversion vs. new upgraded DC/DC conversion). A chain-efficiency sketch follows the transcript.
  • 15. Liquid cooling and efficiency at system level: 178 nodes, AMD Opteron 6128 HE CPUs (Magny Cours), 16 GB RAM. Measurements taken by LRZ.
  • 16. Why is liquid cooling better? Heat capacity (per unit volume): air 1, water roughly 3500. Control over coolant flow and heat exchange. Control over temperature. (A heat-removal sketch follows the transcript.)
  • 17. Ways of cooling car engines
  • 18. Free cooling and energy re-use: PUE < 1!! (A PUE sketch follows the transcript.)
  • 19. 11,000 tons of CO2 saved! 1,500 cars that do not circulate for 1 year. 11,500 trees saved. 15 km² of rain forest left untouched. "Green is the prime color of the world, and that from which its loveliness arises" (Pedro Calderón de la Barca)
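
Slides 7-8 report efficiency the way the Green500 list does: sustained HPL performance divided by the average power drawn during the run, swept over CPU frequency. Below is a minimal Python sketch of that metric; the frequency, performance, and power arrays are illustrative placeholders, not the measured Eurora data (the deck only quotes the peak of 3.15 ± 0.12 GFlops/W).

```python
# Green500-style metric: efficiency (GFlops/W) = sustained HPL GFlops / average power (W).
# The sample arrays below are hypothetical values for illustration only.

def gflops_per_watt(hpl_gflops, avg_power_w):
    """Energy efficiency as reported on the Green500 list."""
    return hpl_gflops / avg_power_w

# Hypothetical per-node sweep over CPU frequency (GHz).
freq_ghz    = [1.2, 1.6, 2.0, 2.4, 2.8, 3.1]
perf_gflops = [900.0, 1150.0, 1350.0, 1500.0, 1620.0, 1700.0]
power_w     = [480.0, 500.0, 520.0, 545.0, 580.0, 620.0]

# Performance keeps rising with frequency, but GFlops/W peaks at a "sweet spot",
# which is the shape of the Performance/Power curve on slide 8.
best = max(zip(freq_ghz, perf_gflops, power_w),
           key=lambda fpw: gflops_per_watt(fpw[1], fpw[2]))
f, p, w = best
print(f"best frequency: {f} GHz -> {gflops_per_watt(p, w):.2f} GFlops/W")
```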
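A back-of-envelope check of the slide 9 figures (3150 MFlop/s per watt, 1700 GFlop/s per node, 64 nodes). The derived quantities (watts per node and system totals) follow from those numbers but are not quoted in the deck.

```python
# Quick arithmetic from the stated figures; derived values are inferred, not quoted.
nodes              = 64
gflops_per_node    = 1700.0
efficiency_gflopsw = 3.15          # 3150 MFlop/s per watt

watts_per_node = gflops_per_node / efficiency_gflopsw
system_tflops  = nodes * gflops_per_node / 1000.0
system_kw      = nodes * watts_per_node / 1000.0

print(f"~{watts_per_node:.0f} W per node")          # ~540 W
print(f"~{system_tflops:.1f} TFlops sustained")     # ~108.8 TFlops
print(f"~{system_kw:.1f} kW for the whole system")  # ~34.5 kW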
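Slides 11-14 argue that fewer, better conversion stages save energy: end-to-end efficiency is the product of the per-stage efficiencies, so the 95.5% to 98% DC/DC upgrade on slide 14 compounds with the upstream stage. The sketch below assumes a simple two-stage chain and treats the two slide 13 figures (97% and 98%) as successive stages, which is an interpretation rather than something the deck states explicitly.

```python
# Cumulative efficiency of a power conversion chain = product of stage efficiencies.
from functools import reduce

def chain_efficiency(stages):
    """Overall efficiency of a chain of converters (each given as a fraction)."""
    return reduce(lambda a, b: a * b, stages, 1.0)

old_chain = chain_efficiency([0.97, 0.955])  # existing DC/DC stage (slide 14)
new_chain = chain_efficiency([0.97, 0.98])   # upgraded DC/DC stage

print(f"old chain: {old_chain:.1%}, new chain: {new_chain:.1%}")

# For an assumed ~34 kW IT load (see the slide 9 check above), the wall-power saving is:
it_load_kw = 34.0
print(f"input power saved: {it_load_kw / old_chain - it_load_kw / new_chain:.2f} kW")
```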
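Slide 16's heat-capacity argument in numbers: the heat a water loop carries away is Q = ṁ · c_p · ΔT. The 120 l/h flow rate comes from the setup slide; the 4 K temperature rise and the water properties are assumed, textbook values used only for illustration.

```python
# Heat removed by a water loop: Q = mass flow * specific heat * temperature rise.
C_P_WATER = 4186.0   # J/(kg*K), specific heat of water
RHO_WATER = 1.0      # kg/l, density of water (approx.)

def heat_removed_w(flow_lph, delta_t_k):
    """Heat carried away by a water loop, in watts."""
    mass_flow_kg_s = flow_lph * RHO_WATER / 3600.0
    return mass_flow_kg_s * C_P_WATER * delta_t_k

print(f"{heat_removed_w(120.0, 4.0):.0f} W removed at 120 l/h and a 4 K rise")
# ~560 W: roughly one node card's worth of heat from a very modest water flow,
# which is why water beats air despite the small temperature difference.
```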
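On slide 18, "PUE < 1" only makes sense once re-used waste heat is credited back, since PUE itself (total facility power over IT power) is at least 1 by definition; crediting re-used heat is what the ERE metric captures. A small sketch with assumed, illustrative numbers:

```python
# PUE vs ERE. All inputs below are assumed values for illustration.

def pue(it_kw, cooling_kw, other_kw):
    """Power Usage Effectiveness: total facility power over IT power (>= 1)."""
    return (it_kw + cooling_kw + other_kw) / it_kw

def ere(it_kw, cooling_kw, other_kw, reused_kw):
    """Energy Reuse Effectiveness: like PUE, but re-used waste heat is subtracted."""
    return (it_kw + cooling_kw + other_kw - reused_kw) / it_kw

it, cooling, other = 100.0, 5.0, 2.0      # hot-water cooling, no chillers
print(f"PUE = {pue(it, cooling, other):.2f}")                   # ~1.07
print(f"ERE = {ere(it, cooling, other, reused_kw=20.0):.2f}")   # drops below 1 once heat is re-used
```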