
UK e-Infrastructure for Research - UK/USA HPC Workshop, Oxford, July 2015


A briefing on UK e-Infrastructure for research from Jisc and the UK research councils, presented at the UK/USA HPC workshop in July 2015, organized by HPC-SIG (UK) and CASC (USA).


  1. UK e-Infrastructure for Research – Michael Ball, BBSRC; Frances Collingborn, NERC; Martin Hamilton, Jisc; David de Roure, ESRC / University of Oxford. Photo credit: STFC
  2. UK e-Infrastructure for research – agenda:
     1. UK e-Infrastructure for research
        – Public funding for major science facilities and institutes
        – Support for translation from R&D into business
     2. e-Infrastructure survey
        – Build inventory of the e-Infrastructure
        – Operating systems and software environment
        – Funding and budgeting models
        – Training and support arrangements
        – Academic and industrial impact
     3. RCUK e-Infrastructure roadmap
        – Vision and aspirations
        – Investment plan
     Photo credit: EPCC / EPSRC
  3. 1. UK e-Infrastructure for research
  4. UK e-Infrastructure for research – Investments by BIS, the Research Councils and HEIs have put core elements of the national e-Infrastructure in place:
     » 2011-2012 – £160M invested in core HPC and networking infrastructure, plus the Moonshot authentication infrastructure (now known as Jisc Assent)
     » 2012-2013 – £189M
     » 2014-2015 – £257M
     2011-2012 HPC investments:
     | HPC Project | RC | Amount/£M |
     | National Service | EPSRC, NERC | 43 |
     | Hartree Centre | STFC | 30 |
     | DiRAC | STFC | 15 |
     | GridPP | STFC | 3 |
     | The Genome Analysis Centre (TGAC) | BBSRC | 8 |
     | Monsoon | NERC / Met Office | 1 |
     | JASMIN2 & CEMS | NERC, UKSA | 7.75 |
     | Regional centres: N8, SES5, MID+, HPC Midlands, ARCHIE-WeSt | EPSRC | 6.5 |
     | JANET network and Moonshot authentication | Jisc | 31 |
     | HPC data storage | EPSRC, STFC | 15 |
     | Total | | 160 |
  5. UK e-Infrastructure for research – From www.jasmin.ac.uk
  6. UK e-Infrastructure for research – 2012-2013 (£189M): Big Data projects were funded at this time using funds announced by the Government in December 2012. Major awards were made to 18 centres in the UK, 16 of which are HEIs. These awards emphasise the pre-eminent role of HEIs in managing and providing national and Large & Specialist data and compute services to UK academia.
     | Big Data Project | RC | Amount/£M |
     | Digital transformations in arts and humanities | AHRC | 8 |
     | E-infrastructure for biosciences | BBSRC | 13 |
     | Research data facility and software development | EPSRC | 8 |
     | Administrative data centres | ESRC | 36 |
     | Understanding populations | ESRC | 12 |
     | Business datasafe | ESRC | 14 |
     | Biomedical informatics | MRC | 55 |
     | Environmental virtual observatory | NERC | 13 |
     | Square Kilometre Array | STFC | 11 |
     | Energy-efficient computing, Hartree Centre | STFC | 19 |
     | Total | | 189 |
  7. UK e-Infrastructure for research – RCUK Big Data: 21st century raw material. [Diagram mapping data types – de-identified admin (including health) data, business data, open data (public sector), social media data, research data, longitudinal survey data, clinical data, environment data, archive data, securely held data – onto the supporting projects: Energy Efficient Computing Infrastructure (STFC), Business Datasafe (ESRC), Admin Data Research Centres (ESRC), High Performance Data Environment (NERC), Medical Bioinformatics (MRC), Understanding Populations (ESRC), Clinical Practice Datalink (MHRA, NIHR), 100,000 Genomes Project (NHS), Research Data Facility (EPSRC), European Bioinformatics Institute (EMBL), Bioscience E-Infrastructure (BBSRC), Square Kilometre Array (STFC), Digital Transformations (AHRC), Open Data Institute, commercial research.]
  8. UK e-Infrastructure for research – ESRC Big Data Network
  9. UK e-Infrastructure for research – Investments by BIS, the Research Councils and HEIs have put core elements of the national e-Infrastructure in place:
     » 2011-2012 – £160M
     » 2012-2013 – £189M
     » 2014-2015 – £257M
     Three major investments dominated this period:
     » Centre for Cognitive Computing at the Hartree Centre, funded at the £115M level with a further £230M from IBM
     » A 10 Pflop supercomputer for the Met Office (£100M)
     » Alan Turing Centre for Data Science (£42M)
     In addition, it was announced that a further £100M would be made available to the SKA project as part of the Big Data investments.
     Photo credit: EPSRC
  10. UK e-Infrastructure for research – April 2015 BBSRC bioscience big data infrastructure funding:
     » £1.79M to build a next-generation image repository, making available the original scientific image data that underpins life sciences research.
     » £2M for big data infrastructure for crop genomics, stimulating new opportunities in crop development to help improve some of the world's most important crops.
     » £1.9M to establish infrastructure for functional annotation of farmed animal genomes, helping to feed us in the future by providing an important framework for discovering genetic variation in domesticated animals and how it influences their characteristics.
     » £1.78M to create cyber-infrastructure for the plant sciences: the UK iPlant node, which will help spread expertise and best practice between the UK and US. A UK/US collaboration with the University of Arizona and the Texas Advanced Computing Center.
  11. UK e-Infrastructure for research – Context:
     › Reviews, e.g. Pearce, Diamond, Dowling, Shadbolt
       – Demonstrable efficiency, effectiveness and productivity
     › UK Government Industrial Strategy
       – 8 Great Technologies
       – Catapult Centres
     › Cultural shifts
       – Open Science
       – Open Access
       – Open Research Data
     bit.ly/dowlingreport | bit.ly/bis8great
  12. UK e-Infrastructure for research – Drivers:
     › Shared facilities and industry access
       – Finding them
       – Using them (kit & people)
     › Big push for translation and consolidation
       – New Catapult Centres
       – Farr Institute, Francis Crick Institute, Alan Turing Institute
     › Impact of Austerity 2.0
       – Comprehensive Spending Review, Autumn 2015
     bit.ly/hauserreport | bit.ly/jischpc
  13. 2. e-Infrastructure Survey
  14. e-Infrastructure Survey – What we did:
     › Build an inventory of UK research e-Infrastructure
       – Including interconnects, storage, accelerators etc.
       – Gathering data on use of cloud technologies
     › Itemize the operating environment
       – e.g. OS distributions, schedulers, filesystems, authentication & authorization
     › Funding and budgeting models
       – Power costs, PUE, CAPEX/OPEX split, location of scientific computing in the institution
     › Training and support arrangements
       – Where support effort is spent, role of women in HPC
     › Academic and industrial impact
       – Grants, papers, businesses using the facilities
     Photo credit: CC-BY HPC Midlands
  15. e-Infrastructure Survey – bit.ly/nei2013 | bit.ly/nei2014
  16. e-Infrastructure Survey – Top 9 Large & Specialist (by core count)
  18. e-Infrastructure Survey – Top 9 Large & Specialist (by size of storage)
  20. e-Infrastructure Survey – Top 9 Large & Specialist (by number of users). 1. Large and Specialist Services:
     | Organisation | System | Top three research areas | Cores | Storage (TB) | Registered users | Peak (Tflop/s) |
     | NERC (operated by STFC) | JASMIN | Climate science, Earth observation, environmental genomics | 4,500 | 25 | Over 10,000 | |
     | STFC Hartree Centre | Blue Wonder | Modelling & simulation (CFD, materials, computer-aided formulation) | 24,000 | 9,000 | 750 - 1,000 | 200 |
     | Norwich Bioscience Institutes (TGAC, JIC, IFR, TSL) | | Bioinformatics, mathematical modelling | 9,000 | 4,000 | 750 - 1,000 | |
     | DiRAC @ University of Cambridge (HPCS) | Darwin | Life sciences, atomic structure, computational fluid dynamics | 9,600 | 2,847 | 750 - 1,000 | 200 |
     | STFC Scientific Computing Division | UK e-Science Certification Authority | Supports all UK research; major users particle physics | | | 750 - 1,000 | |
     | STFC Scientific Computing Division | SCARF | Computational chemistry, plasma physics, satellite image processing; supports ISIS, CLF, RAPSP, DLS user communities | 7,000 | 320 | 500 - 750 | 165 |
     | STFC Hartree Centre | Blue Joule | Modelling & simulation (CFD, materials, computer-aided formulation) | 98,000 | 6,000 | 200 - 500 | 1,200 |
     | EMBL-EBI (European Bioinformatics Institute) | Embassy Cloud | Life science research | 31,000 | 3,200 | 200 - 500 | |
     | DiRAC @ EPCC | DiRAC BG/Q | QCD, soft matter physics | 98,304 | 1,000 | 200 - 500 | 1,258 |
  22. e-Infrastructure Survey – Regional centres (by total cores). 2. Regional Systems:
     | Organisation | System | Top three research areas | Cores | Storage (TB) | Registered users | Peak (Tflop/s) |
     | HPC Wales | Various (distributed system) | Advanced materials & manufacturing, life sciences, energy & environment | 16,816 | 702 | 2,000 - 5,000 | 319 |
     | N8 HPC | Polaris | | 5,312 | 175 | 200 - 500 | 138 |
     | ARCHIE-WeSt | ARCHIE | Molecular dynamics, CFD, plasma physics | 3,920 | 148 | 200 - 500 | 38 |
     | HPC Midlands | Hera | Advanced materials, energy-efficient transport | 3,008 | 120 | 100 - 200 | 48 |
  24. e-Infrastructure Survey – Top 8 HEIs (by total cores). 3. HEI Systems:
     | Organisation | System | Top three research areas | Cores | Storage (TB) | Registered users | Peak (Tflop/s) |
     | Imperial College London | cx1 | | 21,558 | 2,000 | 750 - 1,000 | |
     | University of Bristol | BlueCrystal | Chemistry, aerospace engineering, geographical sciences | 9,000 | 740 | 750 - 1,000 | 240 |
     | University College London | Legion | Chemistry, physics, biological sciences (according to REF categories) | 7,816 | 356 | 500 - 750 | 115 |
     | Imperial College London | cx2 | | 7,000 | 500 | 0 - 100 | 60 |
     | University of Manchester | Computational Shared Facility | Computational chemistry / MD, CFD, FEA | 6,288 | 750 | 750 - 1,000 | 111 |
     | Durham University | Hamilton | Condensed matter, molecular dynamics, fluid dynamics | 5,600 | 350 | 200 - 500 | 75 |
     | University of Oxford | Arcus-B | | 5,440 | 432 | 2,000 - 5,000 | 538 |
     | Lancaster University | HEC (High End Cluster) | High energy physics, condensed matter theory, CFD | 4,784 | 1,530 | 200 - 500 | |
  25. e-Infrastructure Survey
  26. e-Infrastructure Survey
  27. e-Infrastructure Survey
  28. e-Infrastructure Survey
  29. e-Infrastructure Survey
  30. 3. RCUK e-Infrastructure roadmap
  31. RCUK e-Infrastructure roadmap – from the roadmap document (bit.ly/eroadmap):
     "Our aspiration is for the UK to have an integrated e-infrastructure: one that is run and managed as a whole without silos or boundaries, where there are simple processes by which users can get access to the e-infrastructure they need across the eco-system, as appropriate for the type or stage of research they are doing. We need to consider how best to integrate:
     » Vertically, up and down the eco-system pyramid, so users have easy access to the most appropriate type of e-infrastructure they need;
     » Horizontally, across the different elements, as shown in the diagram;
     » Across the different research communities and the different stakeholders;
     » Internationally, across other national e-infrastructures, to deliver end-to-end services in the global environment of collaborative research."
  32. RCUK e-Infrastructure roadmap – bit.ly/eroadmap
  33. That’s all, folks… Except where otherwise noted, this work is licensed under CC-BY. Martin Hamilton, Futurist, Jisc, London – @martin_hamilton – martin.hamilton@jisc.ac.uk
