PLGrid
National Grid and Cloud Infrastructure for Science
ACK CYFRONET AGH
Competence Centre for Cloud and Grid Computing
Lukasz Dutka
Open Science Fair, Athens 2017
PLGrid consortium partners:
• Academic Computer Centre Cyfronet AGH, Krakow (Cyfronet, coordinator)
• Poznan Supercomputing and Networking Center, Institute of Bioorganic Chemistry, Polish Academy of Sciences, Poznan (PCSS)
• Wroclaw Centre for Networking and Supercomputing, Wroclaw (WCSS)
• Interdisciplinary Centre for Mathematical and Computational Modelling, Warszawa (ICM)
• Academic Computer Centre in Gdansk, Gdansk (TASK)
RI / e-Infrastructure Development
coordinated by Cyfronet
• PL-Grid (2009–2012)
  Outcome: common base infrastructure
• PLGrid PLUS (2011–2015)
  Outcome:
  • Focus on users (training, helpdesk, …)
  • Domain-specific solutions: 13
• PLGrid NG (2014–2015)
  Outcome:
  • Optimization of resource usage, training
  • Extension of domain-specific solutions by: 14
• PLGrid CORE (2014–2015)
  Outcome:
  • Competence Centre
  • Open Science paradigm (large workflow applications, data farming, mass computations, …)
  • End-user services
[Architecture diagram: domain grid applications run on top of PL-Grid middleware and services, which use compute clusters and data repositories interconnected by the Polish National Academic Network PIONIER]
Summary of Project Results
Research Infrastructures vs. e-Infrastructures
• Synergy between domain-specific researchers and IT experts; close collaboration
• Ecosystem for doing science, with the following pillars:
  • software: tools, environments, services
  • cloud: distributed data management
  • hardware: HPC, HTC, data-intensive computing
  • expertise (from communities): domain-specific and IT
  • community involvement in all activities: helpdesk (efficiency in operation), training, marketing
  • financial issues
• Domain grids: 27 solutions
Summary of Project Results
Active users and grants (as of 2017)
[Chart: number of all infrastructure users vs. number of active infrastructure users, 2012 and 2017; includes an "Others" category]
Onedata - Distributed Data Management for Closed and Open Data
[Diagram: Onedata access interfaces and key features]
• Access interfaces: POSIX virtual filesystem, REST API, CDMI API, HTTP long-polling API, web GUI; integration with external systems and legacy applications via other APIs; login through many IdPs
• Data distribution: distributed data, multiple replicas, migration between locations, data caching, policy management, storage heterogeneity, no data lock-in
• Organization and access: fine-grain access control, ad-hoc grouping, hierarchical groups synced with IdPs, metadata annotation, data discovery, decoupled collections, "My Data"
• Sharing: no-barrier data sharing, unified sharing of existing data sets, open data sharing, DOI/Handle minting
• Architecture: decentralization, multi-level scalability
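The POSIX virtual filesystem is the interface most users touch first: once a space is mounted locally (typically with the oneclient client), legacy applications read and write distributed data as ordinary files. Below is a minimal sketch of that usage pattern; the mount point and space name are hypothetical, and the oneclient invocation shown in the comment is only indicative:

```python
from pathlib import Path

# Hypothetical mount point; a space is first mounted with the oneclient
# command-line tool, e.g.:
#   oneclient -H <oneprovider-host> -t <access-token> ~/onedata
SPACE = Path.home() / "onedata" / "my_space"   # "my_space" is a made-up space name

# Write a result file exactly as on a local filesystem; replication and
# migration between providers are handled by Onedata behind the scenes.
result = SPACE / "results" / "run-001.txt"
result.parent.mkdir(parents=True, exist_ok=True)
result.write_text("energy=-1.234\n")

# Legacy applications need no changes: plain POSIX calls (open, read,
# list a directory) operate on the virtual filesystem.
for entry in sorted(p.name for p in (SPACE / "results").iterdir()):
    print(entry)
```

The same data remains reachable through the REST/CDMI APIs and the web GUI, so interactive users and external systems share one namespace with batch jobs.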
Opportunities and Ideas
• The working set of e-services should be reused
• User support plans and the helpdesk should be widened
• Comprehensive solutions for “payment” and “billing” that work in a scientific world based on grants
• Distributed data processing and preservation services built on e-services, including Onedata
• Support for Open Science collections via the OAI-PMH protocol, to integrate data collections with open science discovery tools (a harvesting sketch follows below)
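As a concrete illustration of that OAI-PMH integration point, the sketch below harvests Dublin Core records from a repository endpoint; the endpoint URL is hypothetical, and any OAI-PMH-compliant data provider answers the same verbs (Identify, ListRecords, GetRecord, …):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical OAI-PMH endpoint of a data collection.
BASE_URL = "https://repository.example.org/oai"

# Discovery tools harvest metadata with requests like this one.
url = BASE_URL + "?verb=ListRecords&metadataPrefix=oai_dc"
with urllib.request.urlopen(url) as response:
    tree = ET.parse(response)

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}
for record in tree.iterfind(".//oai:record", NS):
    identifier = record.findtext(".//oai:identifier", namespaces=NS)
    title = record.findtext(".//dc:title", namespaces=NS)
    print(identifier, "->", title)
```

Exposing a collection this way lets open science discovery tools index it by periodic harvesting, with no custom integration on the repository side.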
QUESTIONS?
