HPC Midlands - Supercomputing for Research and Industry (Hartree Centre presentation)

These are the slides for my talk on HPC Midlands at the Hartree Centre's "HPC As A Service for Industry" event at STFC Daresbury Labs in January 2013. For more information on the HPC Midlands project, please see http://hpc-midlands.ac.uk

Usage Rights

CC Attribution-ShareAlike License


Presentation Transcript

  • Martin Hamilton, Centre Manager – hpc-midlands.ac.uk
  • Contents
      • UK e-Infrastructure initiative
      • HPC Midlands
      • Industrial engagement
      • Lessons learned & opportunities
  • UK e-Infrastructure Programme
      • OSI e-Infrastructure Working Group
      • RCUK e-Infrastructure Advisory Group
      • £158m BIS e-Infrastructure investment
      • Tildesley Report: http://goo.gl/VSw4x
      • Regional HPC consortia funded via EPSRC
      • “Eight Great Technologies”, including £189m for Big Data & Energy Efficient Computing
  • What is HPC Midlands?
      • HPC on demand, delivered via JANET
      • Consortium of Loughborough University & University of Leicester
      • Managed service delivered by Bull
      • ISV support, e.g. ANSYS and CD-adapco
      • Funded by EPSRC/BIS e-Infrastructure initiative (£1m hardware grant + ‘recurrent’)
  • Hera, the HPC Midlands cluster
      • 3,000 cores (48 Teraflops)
      • 11 chassis (18 blades each)
      • 15TB RAM
      • 120TB Lustre storage
      • Non-blocking QDR InfiniBand
      • 188 compute node blades
          – 2 x 2.0GHz (8 core) Sandy Bridge
          – 140 with 64GB RAM (4GB/core)
          – 48 with 128GB RAM (8GB/core)
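As a quick sanity check, the headline figures on this slide can be rederived from the blade counts, as in the short Python sketch below. The figure of 8 double-precision FLOPs per core per cycle for AVX-capable Sandy Bridge is my assumption; it is not stated on the slide.

    # Cross-check of the Hera specs quoted on the slide.
    # Assumption (not on the slide): Sandy Bridge with AVX sustains
    # 8 double-precision FLOPs per core per clock cycle.

    BLADES = 188
    SOCKETS_PER_BLADE = 2
    CORES_PER_SOCKET = 8
    CLOCK_GHZ = 2.0
    FLOPS_PER_CYCLE = 8  # assumed AVX double-precision rate

    cores = BLADES * SOCKETS_PER_BLADE * CORES_PER_SOCKET
    peak_tflops = cores * CLOCK_GHZ * FLOPS_PER_CYCLE / 1000.0
    ram_tb = (140 * 64 + 48 * 128) / 1024.0  # GB -> TB (binary)

    print(f"cores: {cores}")                 # 3008, quoted as "3,000 cores"
    print(f"peak:  {peak_tflops:.1f} TF")    # 48.1, quoted as "48 Teraflops"
    print(f"RAM:   {ram_tb:.2f} TB")         # 14.75, quoted as "15TB RAM"

The computed values (3,008 cores, 48.1 TFlops, 14.75TB) round naturally to the figures quoted on the slide, so the specs are mutually consistent.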
  • HPC Midlands Applications
    Large Eddy Simulation of Premixed Combustion in Spark Ignition Engines
    Prof. W. Malalasekera, Dr. A. Clarke, C. Ranasinghe
  • HPC Midlands Applications
    Large Eddy Simulation of Crossflow Vortices on an Infinite Swept Wing
    V. I. Mistry, G. J. Page, J. J. McGuirk
    42nd AIAA Fluid Dynamics Conference
    DOI: 10.1016/j.paerosci.2011.12.002
  • HPC Midlands Applications
    CFD Based Study of Unconventional Aeroengine Exhaust Systems
    T. Coates, G. J. Page
    30th AIAA Applied Aerodynamics Conference
    DOI: 10.2514/6.2012-2775
  • Industrial Engagement
      • Strategic partnerships
      • Local / Regional firms & initiatives
      • Science Parks and Incubators
  • HPC Midlands Case Study – E.ON
    Key points:
      • Steady state, complex geometry, simple physics
      • Simulations with 24 to 128 cores
      • 6 different meshes used, from 4.5 to 80 million cells
      • Speed-ups from 30 to 145 times observed vs. in-house system
    [Image: ANSYS CFX simulation of leak in gas turbine enclosure]
  • HPC Midlands Case Study – E.ON
    Normalized timings for HPC runs – CPU core time per million cells per 100 iterations (s), values read from the bar chart:
      Grid 1 – 4.5M elements, 24 cores – 169.31
      Grid 2 – 8M elements, 48 cores – 93.18
      Grid 3 – 15M elements, 48 cores – 84.81
      Grid 4 – 25M elements, 60 cores – 67.62
      Grid 5 – 45M elements, 128 cores – 39.18
      Grid 6 – 80M elements, 128 cores – 36.81
    “Straightforward to use, secure and fast”
    “Biggest advantage is for jobs that:
      • Have large parameter spaces
      • Are time dependent
      • Have complex geometry
      • Have very complex physics
      • A combination of the above”
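To make the comparison concrete, the sketch below recomputes the relative per-cell cost across the six meshes. Two caveats: the pairing of chart values with grids is inferred from the chart layout (the bars descend from Grid 1 to Grid 6), and the implied wall-time formula is my reading of the axis label (core time = wall time × cores), not something spelled out on the slide.

    # Relative per-cell cost across the six E.ON meshes, using the
    # "CPU core time per million cells per 100 iterations" values
    # read from the bar chart. Value-to-grid pairing is inferred
    # from the chart layout.

    grids = [
        # (name, million cells, cores, core-s per Mcell per 100 iter)
        ("Grid 1", 4.5,  24,  169.31),
        ("Grid 2", 8.0,  48,   93.18),
        ("Grid 3", 15.0, 48,   84.81),
        ("Grid 4", 25.0, 60,   67.62),
        ("Grid 5", 45.0, 128,  39.18),
        ("Grid 6", 80.0, 128,  36.81),
    ]

    baseline = grids[0][3]
    for name, mcells, cores, cost in grids:
        # Implied wall time per 100 iterations, assuming
        # core time = wall time * cores (my assumption).
        wall = cost * mcells / cores
        print(f"{name}: {cost:6.2f} core-s/Mcell "
              f"({baseline / cost:.1f}x cheaper per cell than Grid 1, "
              f"~{wall:.0f} s wall per 100 iterations)")

On this reading, the largest mesh is roughly 4.6 times cheaper per cell than the smallest, which is consistent with the slide's point that the biggest gains come from large, complex jobs.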
  • Lessons Learned
      • Making the connection
      • Paperwork
          – Software licensing
          – IPR, NDA and other TLAs
      • Plumbing
          – Sneakernet
          – Connectivity
          – Moonshot
  • Paperwork – Software Licensing
    user@hpc-midlands.ac.uk
  • Plumbing – Connectivity (three diagram slides; no text content)
  • Plumbing – Project Moonshot
      • JANET(UK) initiative
      • Like eduroam, but for any protocol
      • Use your existing credentials everywhere
      • Securely tunnelled back to home RADIUS server
      • Applicable to academia and industry?
      • 18 month pilot, kickoff in April
      • https://www.ja.net/products-services/janet-futures/moonshot
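To illustrate the idea on this slide, here is a toy sketch of the realm-based routing that eduroam and Moonshot are built on: the service a user connects to never verifies the credential itself, it forwards the authentication to the user's home institution based on the realm part of the identity. This is a conceptual illustration only, not Moonshot's actual GSS-EAP/RADIUS implementation, and all server names here are hypothetical.

    # Toy model of realm-based federated authentication routing.
    # NOT the Moonshot protocol itself; names are hypothetical.

    HOME_SERVERS = {
        # hypothetical realm -> home RADIUS server map
        "lboro.ac.uk": "radius.lboro.ac.uk",
        "le.ac.uk": "radius.le.ac.uk",
    }

    def route_auth(identity: str) -> str:
        """Pick the home server that should verify this identity."""
        _user, _, realm = identity.partition("@")
        if realm not in HOME_SERVERS:
            raise ValueError(f"unknown realm: {realm!r}")
        # In the real protocol the credential travels inside a
        # securely tunnelled EAP exchange back to the home RADIUS
        # server; it is never exposed to the visited service.
        return HOME_SERVERS[realm]

    print(route_auth("user@lboro.ac.uk"))  # -> radius.lboro.ac.uk

The appeal for HPC Midlands is that an industrial or academic user could log in to the cluster with their existing institutional credentials, rather than being issued yet another account.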
  • Martin Hamilton, Centre Manager – hpc-midlands.ac.uk