
National Scale Research Computing and Beyond: PEARC Panel, 2017


Panel at the PEARC 2017 event in New Orleans, July 11-13. Panelists were: Gregory Newby, Chief Technology Officer, Compute Canada; Florian Berberich, Member of the Board of Directors PRACE aisbl; Gergely Sipos, Customer and Technical Outreach Manager, EGI Foundation; and John Towns, Director of Collaborative eScience Programs, National Center for Supercomputing Applications.

Panel abstract: How might the international community of research computing users and stakeholders benefit from knowledge sharing among national- or international-scale research computing organizations and providers? It is common for large-scale investments in research computing systems, services and support to be guided and funded with government oversight and centralized planning. There are many commonalities, including stakeholder relations, outcomes reporting, long-range strategic planning, and governance. What trends exist currently, and how might information sharing and collaboration among resource providers be beneficial? Is there desire to form a partnership, or to build upon existing relationships? Participants in this panel will include personnel involved in US, Canadian and European research computing jurisdictions.



  1. National Scale Research Computing and Beyond - PEARC, New Orleans - Thursday, July 13, 2017
  2. Abstract: How might the international community of research computing users and stakeholders benefit from knowledge sharing among national- or international-scale research computing organizations and providers? It is common for large-scale investments in research computing systems, services and support to be guided and funded with government oversight and centralized planning. There are many commonalities, including stakeholder relations, outcomes reporting, long-range strategic planning, and governance. What trends exist currently, and how might information sharing and collaboration among resource providers be beneficial? Is there desire to form a partnership, or to build upon existing relationships? Participants in this panel will include personnel involved in US, Canadian and European research computing jurisdictions.
  3. Panelists: Gregory Newby, Chief Technology Officer, Compute Canada; Florian Berberich, Member of the Board of Directors, PRACE aisbl; Gergely Sipos, Customer and Technical Outreach Manager, EGI Foundation; John Towns, Director of Collaborative eScience Programs, National Center for Supercomputing Applications.
  4. Panel structure: opening remarks from each panelist (~5 minutes each); discussion roundtable on three topics (15-20 minutes each); audience questions and participation.
  5. Opening remarks: EGI - Gergely Sipos, Customer and Technical Outreach Manager, EGI Foundation.
  6. The EGI federated infrastructure: 300+ HTC providers; 23 cloud providers; 23 members (NGIs and CERN); 650k CPU cores; 285 PB online storage; 2.6 billion CPU hours/year; 240 Virtual Organisations; >48,000 users. EGI Foundation (Amsterdam). (A back-of-the-envelope reading of these figures follows below.)
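     A quick, hedged sanity check of what the headline numbers above imply, assuming the 650k CPU cores and the 2.6 billion CPU hours/year describe the same federated pool (the slide does not state this explicitly):

         # Back-of-the-envelope reading of the EGI headline figures, ASSUMING the
         # 650k cores and the 2.6 billion CPU hours/year refer to the same pool.
         cores = 650_000
         cpu_hours_per_year = 2.6e9
         hours_per_core = cpu_hours_per_year / cores      # ~4,000 CPU hours per core
         share_of_year = hours_per_core / (24 * 365)      # ~46% of wall-clock time
         print(f"{hours_per_core:,.0f} CPU hours per core per year "
               f"(~{share_of_year:.0%} of the year)")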
  7. EGI Service Catalogue - Compute: Cloud Compute (run virtual machines on demand with complete control over computing resources), Cloud Container Compute (run Docker containers in a lightweight virtualised environment), High-Throughput Compute (execute thousands of computational tasks to analyse large datasets; also known as "grid computing"). Storage and Data: Online Storage (store, share and access your files and their metadata on a global scale), Archive Storage (back up your data for the long term and future use in a secure environment), Data Transfer (transfer large sets of data from one place to another). Training: FitSM training (learn how to manage IT services with a pragmatic and lightweight standard), Training infrastructure (dedicated computing and storage for training and education). (A minimal sketch of the high-throughput pattern follows below.)
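     The High-Throughput Compute entry above is essentially a fan-out/fan-in pattern: many independent tasks over pieces of a large dataset, with the results gathered at the end. The local Python sketch below is purely illustrative and does not use any EGI API; on the real infrastructure each task would be a grid or cloud job rather than a local process.

         # Illustrative only: a local stand-in for the high-throughput pattern
         # described in the catalogue (many independent tasks over a dataset).
         from concurrent.futures import ProcessPoolExecutor

         def analyse(chunk):
             # placeholder analysis: count the records in one chunk
             return len(chunk)

         def run_high_throughput(chunks):
             # fan the independent tasks out, then gather the partial results
             with ProcessPoolExecutor() as pool:
                 return sum(pool.map(analyse, chunks))

         if __name__ == "__main__":
             dataset = [list(range(1_000)) for _ in range(100)]  # hypothetical input
             print(run_high_throughput(dataset))                 # -> 100000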
  8. EGI serves researchers and innovators of every group size, from multinational communities (e.g. H2020 projects) to the "long tail of science", with support provided by "Competence Centres". Research infrastructures: WLCG, ELI, CTA, ELIXIR, EPOS, EISCAT_3D, BBMRI, CLARIN, LOFAR, EMSO, LifeWatch, ICOS, CORBEL, ENVRIplus, and more. VRE projects: OpenDreamKit, WeNMR, DRIHM, VERCE, MuG, AgINFRA, CMMST, LSGC, SuperSites Exploitation, environmental sciences, neuGRID, and more. Long-tail examples: PeachNote, CEBA Galaxy eLab, semiconductor design, main-belt comets, quantum physics studies, virtual imaging (LS), bovine tuberculosis spread, convergent evolution in genomes, geography evolution, seafloor seismic waves, 3D liver maps with MRI, metabolic rate modelling, genome alignment, tapeworm infection in fish, and more. Industry and SMEs: Agroknow, CloudEO, CloudSME, Ecohydros, gnubila, Sinergise, SixSq, TEISS, Terradue, Ubercloud, and more.
  9. Allocating services and resources to communities (diagram): a negotiator mediates between the project or community representing the VO and the grid, cloud, storage and application providers; the community states its service requirements and conditions (type, number, size, cost, availability, etc.), which are captured in a Service Level Agreement with corresponding Operation Level Agreements for the providers; performance reports, support and training follow, with a satisfaction review every 3/6/12 months. (An illustrative sketch follows below.)
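     One way to read the allocation flow above is as a small data model. The sketch below is hypothetical (names and fields are illustrative, not an EGI schema); it simply encodes the entities mentioned on the slide: a community/VO with service requirements, the providers it is matched with, and the 3/6/12-month review cadence.

         # Hypothetical data model for the allocation entities named on the slide;
         # not an EGI schema, just an illustration of the relationships.
         from dataclasses import dataclass, field
         from typing import List

         @dataclass
         class ServiceRequirement:
             kind: str        # "grid", "cloud", "storage" or "application"
             size: str        # e.g. "100k CPU hours" or "50 TB" (illustrative)
             conditions: str  # availability, cost and similar conditions

         @dataclass
         class ServiceLevelAgreement:
             community: str                                   # the VO being served
             providers: List[str]                             # providers bound by OLAs
             requirements: List[ServiceRequirement] = field(default_factory=list)
             review_interval_months: int = 6                  # satisfaction review: 3, 6 or 12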
  10. Applications on Demand service, serving the long tail of science (http://access.egi.eu). Users register through the User Registration Portal (URP), which performs identity vetting and can approve or suspend users; applications (App. 1 ... App. N) are offered through several science gateways backed by cloud, HTC and storage sites; usage statistics feed the EGI Accounting system; support teams work directly with users. For researchers: use applications from the catalogue. For application developers: integrate applications into science gateways. For HTC/cloud/storage providers: integrate resources. For providers: receive credits via acknowledgements in scientific publications. For support teams: direct contact with the long tail.
  11. Next community event: https://www.digitalinfrastructures.eu
  12. Thank you! EGI Foundation, Science Park 140, 1098 XG Amsterdam, The Netherlands; +31 (0)20 89 32 007; egi.eu; support@egi.eu
  13. Opening remarks: XSEDE - John Towns, Director of Collaborative eScience Programs, National Center for Supercomputing Applications.
  14. XSEDE Intro, July 15, 2017 - John Towns, XSEDE Principal Investigator, jtowns@ncsa.illinois.edu
  15. Motivation for XSEDE: scientific advancement across multiple disciplines requires a variety of resources and services. XSEDE is about increased productivity of the community and providing expanded capabilities: it leads to more science, is sometimes the difference between a feasible project and an impractical one, and lowers barriers to adoption. XSEDE provides a comprehensive eScience infrastructure composed of expertly managed and evolving advanced heterogeneous digital resources and services, integrated into a general-purpose infrastructure.
  16. XSEDE factoids, high-order bits: a 5-year, US$110M project, pursuing additional funding via independent proposals (the initial 5-year award was $121M plus ~$4.6M in supplements), plus a $9M, 5-year Technology Investigation Service under a separate award from NSF. No funding for major hardware: the project provides coordination, support and the creation of a national/international cyberinfrastructure, and coordinates allocations, support, training and documentation for >$100M of concurrent project awards from NSF. ~90 FTEs / ~200 individuals funded across 19 partner institutions, which requires solid partnering!
  17. Total research funding supported by XSEDE to date: $2.86 billion in research supported by XSEDE, July 2011 to May 2017. Research funding only; XSEDE leverages and integrates additional infrastructure, some funded by NSF (e.g. Track 2 systems) and some not (e.g. Internet2). By agency: NSF $982.2M (34%); NIH $613.6M (21%); DOE $560.6M (20%); DOD $222.1M (8%); NASA $114.3M (4%); DOC $57.8M (2%); all others $313.6M (11%). (An arithmetic check follows below.)
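     The per-agency figures above can be checked against the $2.86 billion headline. The short script below (amounts in millions of USD, taken directly from the slide) reproduces both the total and the rounded percentage shares:

         # Check that the per-agency amounts add up to the $2.86B headline figure.
         funding_musd = {
             "NSF": 982.2, "NIH": 613.6, "DOE": 560.6, "DOD": 222.1,
             "NASA": 114.3, "DOC": 57.8, "All others": 313.6,
         }
         total = sum(funding_musd.values())
         print(f"total: ${total / 1000:.2f}B")                # -> total: $2.86B
         for agency, amount in funding_musd.items():
             print(f"{agency}: {100 * amount / total:.0f}%")  # matches the slide's shares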
  18. Field of science trends (chart of NUs used, in billions, by field): Molecular Biosciences, Materials Research, Astronomical Sciences, Physics, Chemistry, Chemical and Thermal Systems, Atmospheric Sciences. More details on domains with less usage will be discussed further in the ECSS and CEE presentations.
  19. XSEDE offers efficient and effective integrated access to a variety of resources: leading-edge distributed-memory systems; very large shared-memory systems; high-throughput systems, including the Open Science Grid (OSG); support for VMs, containers and HPC cloud; visualization engines; accelerators such as GPUs and Xeon Phis; and an extensive library of research applications. Many scientific problems have components that call for use of more than one platform.
  20. International collaborations of infrastructures. Some opportunities: science teams are international, calling for coordinated support; sharing of operational practices and policies; organizational benchmarking; coordination of resources. Some challenges: infrastructures have varying structures, missions, funding timelines, user communities, and resource and application types; infrastructures are typically not actually funded to support collaborations; and there are challenges in the "optics" of potential resource sharing.
  21. More information at: www.xsede.org
  22. Opening remarks: PRACE - Florian Berberich, Member of the Board of Directors, PRACE aisbl.
  23. Partnership for Advanced Computing in Europe (PRACE, www.prace-ri.eu): National Scale Research Computing and Beyond, PEARC, New Orleans, Thursday July 13 - Florian Berberich, PRACE Board of Directors.
  24. PRACE goals: open access to best-of-breed HPC systems for EU scientists; a variety of architectures to support the different scientific communities; high standards in computational science and engineering; peer review at European scale to foster scientific excellence; a robust and persistent funding scheme for HPC supported by the national governments and the EC; support for the development of IPR in Europe by working with public services and European industry; collaboration with European HPC industrial users and suppliers.
  25. PRACE Tier-0 systems: MareNostrum (IBM; BSC, Barcelona, Spain), JUQUEEN (IBM BlueGene/Q; GAUSS/FZJ, Jülich, Germany), CURIE (Bull Bullx; GENCI/CEA, Bruyères-le-Châtel, France), SuperMUC (IBM; GAUSS/LRZ, Garching, Germany), Hazel Hen (Cray; GAUSS/HLRS, Stuttgart, Germany), MARCONI (Lenovo; CINECA, Bologna, Italy), Piz Daint (Cray XC30; CSCS, Lugano, Switzerland). TOP500 ranks with peak (Linpack) performance shown on the slide: #3, 25.3 PF (19.6 PF); #13, 10.0 PF (6.2 PF); #14, 10.8 PF (6.2 PF); #17, 7.4 PF (5.6 PF); #21, 5.9 PF (5.0 PF); #40+41, 6.8 PF (5.7 PF); #85, 1.7 PF (1.4 PF).
  26. Supporting many scientific domains (research domain pie chart up to and including Call 14, % of total core hours awarded): Chemical Sciences and Materials 26%; Engineering 17%; Universe Sciences 16%; Biochemistry, Bioinformatics and Life Sciences 15%; Fundamental Constituents of Matter 15%; Earth System Sciences 7%; Mathematical and Computer Sciences 4%.
  27. PRACE achievements to date: more than 500 scientific projects enabled; over 14 billion core hours awarded since 2010, of which 63% are trans-national; R&D access for industrial users, with more than 50 companies supported; more than 10,000 people trained by the 6 PRACE Advanced Training Centers and other PRACE events; more than 60 Pflop/s of peak performance across 7 world-class systems; 24 PRACE members, including 5 Hosting Members (France, Germany, Italy, Spain and Switzerland).
  28. Trends in Europe: virtual research environments and information infrastructure (diagram by Augusto Burgueño Arjona).
  29. Trends in Europe: the European Open Science Cloud (EOSC) and the European Data Infrastructure (EDI). The diagram shows researchers reaching services through a user interface and a catalogue of services for research, with core service provision and brokerage of external services; funding from the EU and Member States; contributions from international research infrastructures (ESFRI, CERN, EMBL), commercial providers, and e-infrastructures such as EGI, EUDAT, OpenAIRE, RDA, PRACE and GÉANT; and governance covering rules of engagement, standard setting and agenda setting.
  30. Trends in Europe: EOSC and EDI (overview diagram).
  31. Opening remarks: Compute Canada - Gregory Newby, Chief Technology Officer, Compute Canada.
  32. Canada's only national provider of shared essential digital research infrastructure: Compute Canada (CC) leads Canada's national advanced research computing (ARC) platform, and there is no other major supplier in Canada. CC is a not-for-profit corporation whose membership includes 35 of Canada's major research institutions and hospitals. Funding is through a federal grant with matching funds from provincial and institutional partners (40% federal / 60% provinces and institutions), which is the basis of the federated Canadian model. CC provides shared services to over 11,000 researchers across Canada at no fee, with large requests handled through a merit-based access system.
  33. Exciting things are happening up north! Canada's largest research computing update ever, with CAD $125M over ~3 years, funded by CFI with matching funds from provinces and campuses: consolidating from 27 campus-based data centers to 5 national data centers with larger, more integrated systems. Stage 1: four sites; Stage 2: one additional site (TBA).
  34. Serving researchers in all disciplines.
  35. Continued growth in user base.
  36. International rankings, log scale (GF = gigaflop/s).
  37. Impact affirmation through bibliometrics: Field-Weighted Citation Impact (FWCI) of CC-enabled papers.
  38. Topic 1: How similar or different are your organization's USERS and APPLICATION AREAS to those of the other panelists' organizations? What are some key similarities or differences?
  39. Topic 2: Which of your organization's PRACTICES might be suitable for sharing or coordination with other organizations?
  40. Topic 3: What are the biggest challenges facing your organization today for which some aspects of INTERNATIONAL COLLABORATION or cooperation might be beneficial?
  41. Audience questions and participation: questions should be brief and focused; use the program feedback form to add longer observations, external links, etc.: https://pearc17.sched.com/event/Asod
  42. Panel conclusion: thanks to all panelists; thanks for audience participation and engagement; follow-up and discussions on next steps are welcome; use the program feedback form: https://pearc17.sched.com/event/Asod
