slides (PPT)
  • This figure shows the architecture schematic. WP2 and WP3 are our new services; we will need to define the interfaces between them.
  • Data gathering, processing and interpretation in health and environment is driven by modern technology. The geographical distribution of data generators (medical scanners, patient databases, environment actuators and monitors) requires an integrated approach to accessing and processing these data. With the advent of Grid-based technology, many new possibilities for data presentation and fast decision making come within reach. It is the goal of the CrossGrid project to explore, incorporate, adapt and validate this technology for those application areas that require interactive access to resources, be they databases, supercomputers, visualisation engines, medical scanners, or environmental data input devices.

Presentation Transcript

  • The CrossGrid Project Marcel Kunze, FZK representing the X#-Collaboration
  • Main Objectives
    • New category of Grid enabled applications
      • Computing and data intensive
      • Distributed
      • Interactive, near real time response (a person in a loop)
      • Layered
    • New programming tools
    • Grid more user friendly, secure and efficient
    • Interoperability with other Grids
    • Implementation of standards
  • CrossGrid Collaboration. Poland: Cyfronet & INP Cracow, PSNC Poznan, ICM & IPJ Warsaw; Portugal: LIP Lisbon; Spain: CSIC Santander, Valencia & RedIris, UAB Barcelona, USC Santiago & CESGA; Ireland: TCD Dublin; Italy: DATAMAT; Netherlands: UvA Amsterdam; Germany: FZK Karlsruhe, TUM Munich, USTU Stuttgart; Slovakia: II SAS Bratislava; Greece: Algosystems, Demo Athens, AuTh Thessaloniki; Cyprus: UCY Nikosia; Austria: U.Linz. 21 institutes, 11 countries.
  • IST Grid Project Space (diagram): Applications, Middleware & Tools, Underlying Infrastructures; links with European national efforts and with US projects (GriPhyN, PPDG, iVDGL, …). Projects spanning science and industry/business: GRIDLAB, GRIA, EGSO, DATATAG, CROSSGRID, DATAGRID, GRIP, EUROGRID, DAMIEN.
  • Collaboration with other # Projects
    • Objective – Exchange of
      • Information
      • Software components
    • Partners
      • DATAGRID
      • DATATAG
      • GRIDLAB
      • EUROGRID and GRIP
    • GRIDSTART
    • Participation in GGF
  • Project Phases
    • M1-3: requirements definition and merging
    • M4-12: first development phase: design, 1st prototypes, refinement of requirements
    • M13-24: second development phase: integration of components, 2nd prototypes
    • M25-32: third development phase: complete integration, final code versions
    • M33-36: final phase: demonstration and documentation
  • Structure Overview (layered diagram)
    • APPLICATIONS: interactive simulation and visualisation of a biomedical system; flooding crisis team support; distributed data analysis in High Energy Physics; weather forecast and air pollution modeling
    • TOOLS: DataGrid set of tools; Grid Visualization Kernel; benchmarks
    • GRID SERVICES: remote data access, optimization, monitoring, schedulers, roaming access, portals (GLOBUS TOOLKIT, Condor-G, …)
    • Local domain services: protocols, authentication, authorization, access policy, resource management, etc.
    • FABRIC INFRASTRUCTURE: network infrastructure, archivers, HPC/HPV systems, laboratory instruments, etc.
  • CrossGrid Architecture (layered diagram)
    • Applications and supporting tools: Biomedical Application Portal; HEP High-Level Trigger; Flood Application; HEP Interactive Distributed Data Access Application; HEP Data Mining on Grid Application; Weather Forecast Application
    • Applications development support: Performance Analysis; MPI Verification; Metrics and Benchmarks
    • Grid common services: Grid Visualisation Kernel; Data Mining on Grid; Interactive Distributed Data Access; Roaming Access; Grid Resource Management; Grid Monitoring; MPICH-G; Distributed Data Collection; User Interaction Service; Optimization of Data Access; DataGrid Replica Manager; DataGrid Job Manager; Globus Replica Manager
    • Globus layer: GRAM; GSI; Replica Catalog; GASS; MDS; GridFTP; Globus-IO
    • Local resources, each behind a resource manager: CPU; secondary and tertiary storage; scientific instruments (medical scanners, satellites, radars); detector local high-level trigger; VR systems (CAVEs, immersive desks); visualization tools
  • Layered Structure
    • Interactive and Data Intensive Applications (WP1)
      • Interactive simulation and visualization of a biomedical system
      • Flooding crisis team support
      • Distributed data analysis in HEP
      • Weather forecast and air pollution modeling
    • Grid Application Programming Environment (WP2)
      • MPI code debugging and verification
      • Metrics and benchmarks
      • Interactive and semiautomatic performance evaluation tools
    • New CrossGrid Services (WP3)
      • Portals and roaming access
      • Grid resource management
      • Grid monitoring
      • Optimization of data access
      • Grid Visualization Kernel; Data Mining; HLA services
    • Globus Middleware; DataGrid, GriPhyN, … services
    • Fabric Infrastructure (Testbed WP4)
  • Scope of Applications
    • Applications in health and environment
      • Data federation, processing and interpretation in geographically distributed locations
      • Fast, interactive decision making
    • Interactive access to distributed
      • Databases
      • Supercomputers and high-performance clusters
      • Visualisation engines
      • Medical scanners
      • Environmental data input devices
  • Application Requirements
    • High quality presentation
    • High frame rate
    • Intuitive interaction
    • Real-time response
    • Interactive algorithms
    • High performance computing and networking
    • Distributed resources and data
  • Role of Network Latency: for interactive, near real-time use, communication delay and rendering delay must be negligible
  • CrossGrid Application Development (WP1)
    • Interactive simulation and visualisation of a biomedical system
      • Grid-based system for pre-treatment planning in vascular interventional and surgical procedures through real-time interactive simulation of vascular structure and flow.
    • Flooding crisis team support
    • Distributed interactive data analysis in HEP
      • Focus on LHC experiments (ALICE, ATLAS, CMS and LHCb)
    • Weather forecast and air pollution modelling
      • Porting distributed/parallel codes on Grid
      • Coupled Ocean/Atmosphere Mesoscale Prediction System
      • STEM-II Air Pollution Code
  • Interactive Simulation and Visualisation of a Biomedical System
    • Grid-based prototype system for treatment planning in vascular interventional and surgical procedures through near real-time interactive simulation of vascular structure and flow.
    • The system will consist of a distributed near real-time simulation environment, in which a user interacts in Virtual Reality (VR) and other interactive display environments.
    • A 3D model of the arteries, derived using medical imaging techniques, will serve as input to a simulation environment for blood flow calculations.
    • The user will be allowed to change the structure of the arteries, thus mimicking an interventional or surgical procedure.
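The interaction loop sketched above (the user edits the vessel geometry, the simulation recomputes flow, the result is rendered) can be illustrated with a deliberately simplified model. Everything below is hypothetical: a real blood-flow solver is far more involved, but Poiseuille's law captures why small geometric edits matter, since flow scales with the fourth power of vessel radius.

```python
import math

def flow_rate(radius_mm, pressure_drop=1.0, viscosity=0.004, length_mm=50.0):
    """Poiseuille's law: volumetric flow through a cylindrical vessel
    scales with the fourth power of its radius."""
    return (math.pi * pressure_drop * radius_mm**4) / (8 * viscosity * length_mm)

def interactive_session(edits):
    """Toy steering loop: each user edit sets a new vessel radius,
    the 'simulation' recomputes flow, and the result is handed to a renderer."""
    results = []
    for radius in edits:
        q = flow_rate(radius)
        results.append((radius, q))  # in a real system: push to the VR display
    return results

# Hypothetical intervention: widening the effective radius from 2 mm to 3 mm
session = interactive_session([2.0, 3.0])
(r1, q1), (r2, q2) = session
print(round(q2 / q1, 2))  # prints 5.06, i.e. (3/2)^4
```

The point of the sketch is the loop structure, not the physics: each user action triggers a fresh simulation step whose output feeds the visualisation, which is exactly the near real-time cycle the slide describes.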
  • Current Situation: Observation → Diagnosis & Planning → Treatment
  • Experimental Setup
  • Simulation-Based Planning and Treatment (diagram of preop alternatives: Angio w/ Fem-Fem & Fem-Pop; AFB w/ E-S Prox. Anast.; Angio w/ Fem-Fem; AFB w/ E-E Prox. Anast.)
  • VR-Interaction
  • Flood Crisis Prevention
    • Support system for establishment and operation of Virtual Organization for Flood Forecasting associating a set of individuals and institutions involved in flood prevention and protection.
    • The system will employ a Grid technology to seamlessly connect together the experts, data and computing resources needed for quick and correct flood management decisions.
    • The main component of the system will be a highly automated early warning system based on hydro-meteorological (snowmelt) rainfall-runoff simulations.
    • The system will integrate advanced communication techniques, allowing the crisis management teams to consult various experts on their decisions. The experts will be able to run the simulations with changed parameters and analyze the impact.
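The rainfall-runoff simulations at the core of such an early-warning system can be caricatured as a single linear reservoir: basin storage rises with rainfall and drains in proportion to its level. This is a toy sketch with illustrative numbers, not the project's hydro-meteorological models:

```python
def linear_reservoir(rainfall, k=0.3, storage=0.0):
    """Toy rainfall-runoff model: at each time step, rainfall adds to
    basin storage, and a fraction k of storage leaves as discharge."""
    discharge = []
    for r in rainfall:
        storage += r
        q = k * storage
        storage -= q
        discharge.append(q)
    return discharge

# Illustrative storm: two rainfall pulses followed by dry steps
hydrograph = linear_reservoir([10.0, 20.0, 5.0, 0.0, 0.0, 0.0])
print([round(q, 2) for q in hydrograph])
```

The resulting hydrograph peaks just after the heaviest rainfall and then recedes, which is the shape a forecaster compares against flood-stage thresholds when deciding whether to issue a warning.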
  • Virtual Organization for Flood Forecasting (diagram)
    • Data sources: storage systems and databases; surface automatic meteorological and hydrological stations; systems for acquisition and processing of satellite information; meteorological radars
    • External sources of information
      • Global and regional GTS centers
      • EUMETSAT and NOAA
      • Hydrological services of other countries
    • Models on the HPC/HTC Grid infrastructure: meteorological, hydrological and hydraulic models
    • Flood crisis teams: meteorologists, hydrologists, hydraulic engineers
    • Users: river authorities, energy, insurance companies, navigation, media, public
  • Flood Crisis Prevention: Váh River Pilot Site
    • Water stages/discharges from hydrological stations operating in real time
    • Mapping of the flooded areas
    • Váh River catchment area: 19,700 km², 1/3 of Slovakia. Pilot site from Nosice (inflow point) to Strečno (outflow point), catchment area 2,500 km² (5,500 km² above Strečno)
  • Flood Simulation Results: flow + water depths
  • Distributed Analysis in High Energy Physics
    • Challenging points
      • Access to large distributed databases in the Grid
      • Development of distributed data-mining techniques
      • Definition of a layered application structure
      • Integration of user-friendly interactive access (based on PROOF)
    • Focus on LHC experiments (ALICE, ATLAS, CMS and LHCb)
  • PROOF (diagram): the local session ships the selection procedure (Proc.C) and its parameters to remote PROOF workers (CPUs), which scan partitions of the data (TagDB, RDB, DB1-DB6) in parallel; the partial results are merged back locally
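PROOF's execution model, as the diagram indicates, ships a selection procedure to workers that each scan a local data partition, with partial results merged back at the master. A rough standard-library analogy (hypothetical events and cut; real PROOF runs ROOT selector code over distributed files):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical event partitions, standing in for PROOF's DB1..DB6.
PARTITIONS = [
    [{"pt": 12.0}, {"pt": 48.5}, {"pt": 33.1}],
    [{"pt": 7.2}, {"pt": 91.0}],
    [{"pt": 25.4}, {"pt": 60.3}, {"pt": 18.9}],
]

def select(events, pt_cut=30.0):
    """The Proc.C analogue: scan one partition, keep events passing the cut."""
    return [e for e in events if e["pt"] > pt_cut]

def analyze(partitions):
    """Scatter the selection, one worker per partition, then merge the results."""
    with ThreadPoolExecutor(max_workers=len(partitions)) as pool:
        partial = pool.map(select, partitions)
        return [e for part in partial for e in part]

passed = analyze(PARTITIONS)
print(len(passed))  # 4 events survive the pt > 30 cut
```

The key property this mimics is that the code moves to the data: each partition is scanned where it lives, and only the (much smaller) selected set travels back, which is what makes interactive analysis of large distributed datasets feasible.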
  • Weather Forecast and Air Pollution Modeling
    • Integration of distributed databases into Grid
    • Migration of data mining algorithms to Grid
    • Porting distributed atmospheric & wave models to Grid
    • Porting parallel codes for air quality models to Grid
    • Integration, testing and demonstration of the application in the testbed environment
  • COAMPS Coupled Ocean/Atmosphere Mesoscale Prediction System: Atmospheric Components
    • Complex Data Quality Control
    • Analysis:
      • Multivariate Optimum Interpolation Analysis of Winds and Heights
      • Univariate Analyses of Temperature and Moisture
      • Optimum Interpolation Analysis of Sea Surface Temperature
    • Initialization:
      • Variational Hydrostatic Constraint on Analysis Increments
      • Digital Filter
    • Atmospheric Model:
      • Numerics: Nonhydrostatic, Scheme C, Nested Grids, Sigma-z
      • Physics: Convection, Explicit Moist Physics, Radiation, Surface Layer
    • Features:
      • Globally Relocatable (5 Map Projections)
      • User-Defined Grid Resolutions, Dimensions, and Number of Nested Grids
      • 6 or 12 Hour Incremental Data Assimilation Cycle
      • Can be Used for Idealized or Real-Time Applications
      • Single Configuration Managed System for All Applications
      • Operational at:
        • 7 Areas, Twice Daily, using 81/27/9 km or 81/27 km grids
        • Forecasts to 72 hours
      • Operational at all Navy Regional Centers (w/GUI Interface)
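The optimum interpolation analyses listed above blend a model background field with observations, each weighted by its error statistics. In the scalar case the analysis is background + K(obs - background) with gain K = var_b / (var_b + var_o). A minimal sketch with illustrative numbers (not COAMPS code):

```python
def optimum_interpolation(background, obs, var_b, var_o):
    """Scalar optimum interpolation:
    analysis = background + K * (obs - background),
    with gain K = var_b / (var_b + var_o)."""
    gain = var_b / (var_b + var_o)
    return background + gain * (obs - background)

# Illustrative: background SST 14.0 C (error variance 1.0),
# observation 15.0 C (error variance 1.0)
analysis = optimum_interpolation(14.0, 15.0, 1.0, 1.0)
print(analysis)  # 14.5: equal error variances split the difference
```

With a perfect observation (var_o = 0) the gain is 1 and the analysis equals the observation; with a perfect background (var_b = 0) the observation is ignored. The multivariate analyses in COAMPS generalize this same weighting to full covariance matrices.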
  • Status Quo … Quo Vadis?
    • Current state (briefly)
    • Simulation done on a single system or on local clusters
    • Visualisation on a single system, locally
    • What we are going to achieve
    • HPC, HTC, HPV in a geographically distributed environment
    • Improved interaction with the end user
    • Near real-time simulations
    • Different visualisation equipment (adaptive to end-user needs), such as
      • PDA
      • Workstations
      • VR studio (e.g. CAVE)
  • Grid Application Programming Environment (WP2)
    • MPI code debugging and verification
    • Metrics and benchmarks
    • Interactive and semiautomatic performance evaluation tools
    • Specify, develop, integrate, test tools for HPC and HTC applications on the Grid
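At its simplest, a metrics-and-benchmarks tool of the kind WP2 describes times a kernel repeatedly and reports summary statistics. A toy harness (the kernel and repeat count are illustrative; the project's actual Grid benchmarks measured far richer metrics):

```python
import statistics
import time

def benchmark(fn, repeats=5):
    """Run fn repeatedly and report wall-clock timing metrics in seconds."""
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn()
        times.append(time.perf_counter() - t0)
    return {"min": min(times), "median": statistics.median(times), "max": max(times)}

def kernel():
    # Stand-in compute kernel: sum of squares
    sum(i * i for i in range(100_000))

report = benchmark(kernel)
print(sorted(report))  # ['max', 'median', 'min']
```

Reporting min/median/max rather than a single number matters on shared Grid resources, where run-to-run variation is itself a property worth monitoring.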
  • New Grid Services and Tools (WP3)
    • Portals and roaming access
    • Grid resource management
    • Grid monitoring
    • Optimisation of data access
    • Objectives
      • To develop interactive compute- and data-intensive applications
      • To develop user-friendly Grid environments
      • To offer easy access to the applications and Grid
      • To have a reasonable trade-off between resource usage efficiency and application speedup
      • To support management issues while accessing resources
  • International Testbed Organisation (WP4): 15 sites
    • Testbed setup & incremental evolution
    • Integration with DataGrid
    • Infrastructure support
    • Verification & quality control
    AuTh Thessaloniki, UvA Amsterdam, FZK Karlsruhe, TCD Dublin, UAB Barcelona, LIP Lisbon, CSIC Valencia, CSIC Madrid, USC Santiago, CSIC Santander, DEMO Athens, UCY Nikosia, CYFRONET Cracow, II SAS Bratislava, PSNC Poznan, ICM & IPJ Warsaw
  • Summary
    • Layered structure of all X# applications
    • Reuse of SW from DataGrid and other # projects
    • Globus as the bottom layer of the middleware
    • Heterogeneous computer and storage systems
    • Distributed development and testing of SW
      • 12 partners in applications
      • 14 partners in middleware
      • 15 partners in testbeds
  • 1980s: Internet 1990s: Web 2000s: Grid
    • Where do we need to get to ?
      • Applications to support an “e-society” (“Cyber-Infrastructure”)
      • An international Grid infrastructure which hides the complexities from the users (“Invisible Computing”)
      • A powerful and flexible network infrastructure
    • Where do we need to invest ?
      • Applications targeted at realistic problems in “e-science”
      • Prototypes of Grid infrastructures
      • Maintain and improve the GEANT network
    • Expression of Interest for EU FP6 program:
      • “Enabling Grids and e-Science in Europe (EGEE)”
    Grid-enabled Applications, Prototype Grid Infrastructures, Géant: World-Class Networking
  • EGEE Project Space (diagram): Applications, Middleware & Tools, Underlying Infrastructures; spanning science and industry/business (GRIDLAB, GRIA, EGSO, DATATAG, CROSSGRID, DATAGRID, GRIP, EUROGRID, DAMIEN). Enabling Grids and E-Science in Europe (EGEE).
  • First results of EGEE brainstorming: C. Jones, D. O. Williams, et al.