  1. Development of GRID environment for interactive applications
     Jesús Marco de Lucas ([email_address])
     Instituto de Física de Cantabria (IFCA), Consejo Superior de Investigaciones Científicas (CSIC), Santander, SPAIN
     DATAGRID DISSEMINATION DAY, 14-V-2003, BARCELONA
  2. The EU CrossGrid Project
     • European Project (~5 M€, 3-year project, started March 2002)
       – Proposed to CPA9, 6th IST call, V FP
       – Polish (Cracow & Poznan) / Spanish (CSIC & CESGA) / German (FZK) initiative with the support of CERN (thanks to Fab!)
       – CYFRONET (Cracow) is the coordinator of the project (Michal Turala, project leader)
     • Objectives:
       – Extension of the GRID in Europe, assuring interoperability with DataGrid
       – Interactive applications ("human in the loop"):
         · Environmental fields (meteorology/air pollution, flooding crisis management)
         · High Energy Physics (interactive analysis over distributed datasets)
         · Medicine (vascular surgery preparation)
       – Needs:
         · Develop the corresponding middleware and tools
         · Deploy on a pan-European testbed
     • Partners:
       – Poland (CYFRONET, PSNC, ICM, INP, INS), Spain (CSIC: IFCA, IFIC, RedIRIS, UAB, USC), Germany (FZK, USTUTT, TUM), Slovakia (II SAS), Ireland (TCD), Portugal (LIP), Austria (U. Linz), The Netherlands (UvA), Greece (DEMO, AuTH), Cyprus (UCY)
       – Industry: Datamat (I), Algosystems (Gr)
  3. Surgical Planning
     • Problem: vascular diseases
     • Solution: placement of a bypass by a surgeon
     • Planning for the intervention is based on 3D images obtained from MRI or CT scans.
     • The attainable improvement in blood flow should determine which option is best for a particular patient.
     • A 3D arterial model is built from the images and presented to the surgeon in an immersive, intuitive environment.
     [Images: a CT scanner; stenosis (narrowing of an artery); viewing the arterial structure in an immersive 3D environment]
  4. Surgical Planning
     • Goal:
       – Simulate vascular reconstruction
     • Method:
       – Interactive Virtual Reality environment to
         · View scanned data
         · Define proposed interventions
         · View simulation results
       – Advanced fluid code to simulate flows
     • Need Grid in interactive mode (the surgeon should not wait long…)
       – Access distributed computational resources for flow simulation and visualization, to get a high-performance environment at low cost
         · Distribute simulations for different bypass configurations (see the sketch after this slide)
     [Images: arterial structures from scans with proposed bypasses; simulated flows]
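The per-configuration flow simulations are independent, so they map naturally onto one grid job each. Below is a minimal sketch of that idea, using a local process pool as a stand-in for grid job submission; simulate_flow() and its toy pressure-drop figure of merit are hypothetical illustrations, not the project's fluid solver.

```python
# A minimal sketch of "distribute simulations per bypass configuration",
# using a local process pool as a stand-in for grid job submission.
# simulate_flow() and its figure of merit are hypothetical placeholders.
from concurrent.futures import ProcessPoolExecutor

def simulate_flow(bypass_config):
    """Placeholder for the advanced fluid solver run on one grid node."""
    diameter_mm, angle_deg = bypass_config
    # Toy figure of merit: wider grafts and shallower angles score better.
    pressure_drop = 100.0 / diameter_mm + 0.5 * angle_deg
    return bypass_config, pressure_drop

if __name__ == "__main__":
    # Candidate bypass geometries proposed interactively by the surgeon.
    configs = [(d, a) for d in (4.0, 5.0, 6.0) for a in (30, 45, 60)]
    with ProcessPoolExecutor() as pool:          # one "grid job" per config
        results = list(pool.map(simulate_flow, configs))
    best_config, best_drop = min(results, key=lambda r: r[1])
    print(f"best bypass {best_config}: pressure drop {best_drop:.1f}")
```

Because each configuration is evaluated independently, the waiting time seen by the surgeon shrinks roughly with the number of grid nodes available.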
  5. Flood management
     • Problem: flooding crises in Slovakia
     • Solution: monitoring, forecasting, simulation, real-time actions
     • Precipitation forecasts are based on meteorological simulations of different resolutions, from the meso-scale to the storm-scale.
     • For flash floods, high-resolution (1 km) regional atmospheric models have to be used along with remote-sensing data (satellite, radar).
     • From the quantitative precipitation forecast, hydrological models determine the discharge from the affected area.
     • Hydraulic models then simulate water flow through various river structures to predict the impact of the flood.
     • Crisis management teams should consult various experts before making any decisions. The experts should be able to run simulations with different parameters and analyze the impact ("what-if" analysis).
     [Images: monitoring, forecasting, simulation]
  6. Flood management
     • Goal:
       – Flooding risk prediction
     • Method:
       – Cascade of simulations (see the sketch after this slide):
         · Meteorological
         · Hydrological
         · Hydraulic
       – Virtual Organization
     • Need Grid in interactive mode (simulation results for "what-if" analysis)
       – Seamlessly connect together the experts, data and computing resources needed for quick decisions
       – Highly automated early-warning system, based on hydro-meteorological (snowmelt) rainfall-runoff simulations
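A minimal sketch of the three-stage cascade follows. The three model functions are hypothetical stand-ins with toy formulas; in the real system each stage is a full simulation code run as a grid job, with outputs staged between sites.

```python
# A minimal sketch of the flood-forecasting cascade described above.
# All three model stages are hypothetical toy stand-ins.

def meteorological_model(scenario):
    """Quantitative precipitation forecast (mm over the catchment)."""
    return 40.0 * scenario["rain_factor"]

def hydrological_model(precipitation_mm):
    """Discharge (m^3/s) from the affected area; toy rainfall-runoff law."""
    return 12.0 * precipitation_mm ** 0.8

def hydraulic_model(discharge_m3s):
    """Predicted river level (m) at a critical structure."""
    return 0.015 * discharge_m3s + 1.2

def what_if(scenario):
    # Cascade: meteorological -> hydrological -> hydraulic.
    precip = meteorological_model(scenario)
    discharge = hydrological_model(precip)
    return hydraulic_model(discharge)

# Experts compare scenarios interactively before advising the crisis team.
for factor in (0.5, 1.0, 2.0):
    level = what_if({"rain_factor": factor})
    print(f"rain factor {factor}: predicted level {level:.2f} m")
```

The "what-if" loop at the bottom is the interactive part: each expert scenario re-runs the cascade with different parameters.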
  7. Flood management
     • Web portal for access
       – Job submission
       – Visualization
       – See DEMO outside
  8. HEP interactive analysis
     • The coming Large Hadron Collider (LHC) at CERN will accelerate protons to energies high enough to produce a particle hundreds of times heavier than the proton: the Higgs Boson, the last piece of the Standard Model and key to understanding the origin of mass.
     • Problem: All collisions will be recorded by sophisticated detectors, and the information stored in distributed databases with a volume of millions of gigabytes. But only a few of those complex collisions will produce a Higgs Boson…
     • Solution: On-line filtering techniques + sophisticated mathematical algorithms for physics analysis, like neural networks
     • Physicists across the world are collaborating in this search…
     [Trigger/data-acquisition chain, with rejection factors made explicit below: level 1, special hardware: 40 MHz (40 TB/s) → 75 kHz (75 GB/s); level 2, embedded processors: → 5 kHz (5 GB/s); level 3, PCs: → 100 Hz (100 MB/s), data recording & offline analysis]
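The trigger chain is a cascade of rate reductions; the small calculation below, using only the rates quoted on the slide, makes the per-level and overall rejection factors explicit.

```python
# Rate reductions along the trigger chain, as quoted on the slide.
levels = [("collisions", 40e6), ("level 1", 75e3),
          ("level 2", 5e3), ("level 3", 100.0)]

for (name_in, r_in), (name_out, r_out) in zip(levels, levels[1:]):
    print(f"{name_in} -> {name_out}: rejection factor {r_in / r_out:,.0f}")

overall = levels[0][1] / levels[-1][1]
print(f"overall: {overall:,.0f}x reduction (40 TB/s down to 100 MB/s)")
```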
  9. HEP interactive analysis
     • Goal:
       – Physics analysis on large distributed databases
     • Method:
       – Distributed computing for
         · Access to databases
         · Complex algorithms, like neural networks (see the sketch after this slide)
       – Use a Web Portal as the GUI
     • Need Grid in interactive mode (physicists try different hypotheses)
       – Reduce the waiting time to test a new algorithm or a new hypothesis from hours down to minutes by processing in distributed mode (DEMO TODAY)
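A minimal sketch of the distributed analysis pattern: each site scores its local partition of events with the same trained network and returns only a small summary, so the raw events never move. The toy two-layer network, its random weights and the four event features are illustrative assumptions, not the project's trained model.

```python
# A minimal sketch of distributed neural-network event selection:
# each site scores its local partition with the same network and
# only per-partition summaries travel back. Network and features
# are toy assumptions (random weights stand in for a trained model).
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # pretend these weights
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # came from training

def nn_score(events):
    """Feed-forward pass: probability that each event is signal-like."""
    h = np.tanh(events @ W1 + b1)
    return (1.0 / (1.0 + np.exp(-(h @ W2 + b2)))).ravel()

def analyze_partition(events, cut=0.9):
    """Runs at one site; returns only counts, not the raw events."""
    scores = nn_score(events)
    return (scores > cut).sum(), len(events)

# Stand-in for the distributed database: one array per site.
partitions = [rng.normal(size=(10_000, 4)) for _ in range(3)]
selected, total = map(sum, zip(*(analyze_partition(p) for p in partitions)))
print(f"{selected} candidate events selected out of {total}")
```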
  10. Meteo and Air Pollution
     • Problem: Improve local predictions and refine air-pollution modeling close to a thermal power plant.
     • Solution: data mining on databases of outputs from atmospheric circulation models, to improve downscaling
     [Image: a typical database (ERA-15, ECMWF): daily forecasts on a grid covering the globe, 1979-1993. Atmospheric circulation pattern: v = (T(1000 mb), T(850 mb), …, Z, H, …); the dimension can reach 10^4]
  11. Meteo and Air Pollution
     • Goal:
       – Data mining on the databases and improvement of air-pollution prediction
     • Method:
       – Distributed computing for
         · Data-mining algorithm: SOM (self-organizing map; see the sketch after this slide)
         · Air-pollution model: STEM-II
     • Need Grid in interactive mode (so the power plant reacts on time)
       – Try different air-pollution estimates according to the meteo predictions
     [Image: SOM of atmospheric circulation patterns v = (T(1000 mb), T(850 mb), …, Z, H, …); similar patterns (e.g. 1/1/1979 and 2/1/1979) end up close both on the map and in the circulation-pattern space]
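A minimal self-organizing map sketch for clustering high-dimensional circulation patterns, as used for downscaling. The map size, training schedule and the random stand-in "patterns" are toy assumptions, not the project's settings.

```python
# A minimal self-organizing map (SOM) sketch for clustering circulation
# patterns. Sizes, decay schedule and the random stand-in data are toy
# assumptions, not the project's configuration.
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.normal(size=(500, 20))         # stand-in for ERA-15 vectors
side, dim, steps = 8, patterns.shape[1], 3000
weights = rng.normal(size=(side, side, dim))  # the 8x8 map
grid = np.stack(np.meshgrid(np.arange(side), np.arange(side),
                            indexing="ij"), axis=-1)

for t in range(steps):
    frac = t / steps
    lr = 0.5 * (1 - frac)                     # decaying learning rate
    radius = max(1.0, side / 2 * (1 - frac))  # shrinking neighborhood
    x = patterns[rng.integers(len(patterns))]
    # Best matching unit: the map node closest to the sampled pattern.
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(dists.argmin(), dists.shape)
    # Pull the BMU and its neighbors toward the pattern.
    d2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
    h = np.exp(-d2 / (2 * radius ** 2))[..., None]
    weights += lr * h * (x - weights)

# Similar patterns now land on nearby nodes, as the slide illustrates.
print("trained SOM of shape", weights.shape)
```

The training loop is embarrassingly parallelizable over data samples, which is what makes the algorithm a good fit for distributed grid execution.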
  12. Application development
     • Good interaction with the final user communities (clear use cases)
       – Vascular surgery: Leiden Hospital
       – Flooding crisis management: authorities in Slovakia
       – HEP interactive physics analysis: LHC physicists
       – Meteo and air pollution: power plant managers
     • Middleware and tools (a significant effort):
       – Basic middleware: Globus 2 + DataGrid
       – Distributed computing using MPI: MPICH-G2
         · Support for correct use of MPI: profiling interface (MARMOT)
         · Benchmarking in a grid context and performance prediction
       – Optimization of data access
       – Monitoring:
         · The application itself, the network use, and the hardware
       – Scheduling:
         · Support for priority allocation of the resources needed for MPI
       – Portals and roaming access
         · Web portal + VNC (Migrating Desktop)
     • Testbed:
       – Support development, testing and deployment of applications, tools, and middleware
  13. Architecture
     [Diagram of the CrossGrid architecture; components include: Migrating Desktop, Portal and Roaming Access, User Interaction Services, Grid Visualization Kernel, a (parallel) application running across multiple sites with simulation output, OCM-G application monitoring, tools, Scheduling Agent, Benchmarks, Data Access, Infrastructure Monitoring, DataGrid Job Management, DataGrid Data Management, Globus Toolkit]
  14. The CrossGrid Testbed
     • 16 sites (small & large) in 9 countries, connected through Géant + NRENs
     • + Grid Services: EDG middleware (based on Globus): RB, VO, RC…
     [Map of sites over Géant: UCY Nicosia; DEMO Athens; AuTH Thessaloniki; CYFRONET Cracow; ICM & IPJ Warsaw; PSNC Poznan; CSIC IFIC Valencia; UAB Barcelona; CSIC-UC IFCA Santander; CSIC RedIRIS Madrid; LIP Lisbon; USC Santiago; TCD Dublin; UvA Amsterdam; FZK Karlsruhe; II SAS Bratislava]
  15. Using the Testbed
     • Parallel jobs (HEP prototype using MPICH-G2)
       – Running across sites (a minimal MPI sketch follows below)
     [Diagram: Grid Services (LIP) with JSS and LB, dispatching one MPICH-G2 job over the network to Globus gatekeepers at Site 1 … Site i]
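CrossGrid's parallel HEP prototype uses MPICH-G2 (C MPI). As a compact stand-in, here is a sketch with mpi4py showing the same pattern: ranks, possibly at different sites, each process a slice of the events and combine their results with a collective reduction. The per-rank workload is a hypothetical toy selection.

```python
# A minimal sketch of a cross-site parallel job in the MPICH-G2 style,
# written with mpi4py for brevity (the project itself used C + MPICH-G2).
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Hypothetical per-rank workload: count "selected" events in a local
# slice of the dataset (strided so slices are disjoint across ranks).
local_selected = sum(1 for i in range(rank, 1_000_000, size)
                     if (i * 2654435761) % 97 < 3)

# Collective reduction: partial counts are summed onto rank 0.
total = comm.reduce(local_selected, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{total} events selected across {size} ranks")

# Run with e.g.:  mpirun -np 4 python this_script.py
```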
  16. Testbed Status
     • Live status map: http://mapcenter.lip.pt
  17. User Support
     • Software repository: http://gridportal.fzk.de
       – Customized GNU Savannah (based on SourceForge)
       – Browsable CVS repository
       – Main current usage: ca. 1,000 web hits per day; 7,000 files, 356 MB; 850,000 lines of code, 15,000 lines of documentation + 174 doc/PDF files
  18. Integration work…
  19. IST Demonstration
     • CrossGrid participated in the World Grid demonstration of November 2002, involving European and US sites from CrossGrid, DataGrid, GriPhyN and PPDG.
     • It was the largest grid testbed in the world.
     • Applications from the CERN/LHC experiments CMS and ATLAS
     • CrossGrid participated with 3 sites:
       – LIP, Lisbon
       – FZK, Karlsruhe
       – IFIC, Valencia
  20. Extending the GRID in Europe
     • Close collaboration and complementarity with DataGrid
       – Interactive and parallel applications
       – Extending the GRID into new countries and communities
       – Keeping interoperability, in particular for the testbed
     • Outreach and dissemination (visit our booth outside!!!):
       – High impact at the national research level:
         · See the Poland, Germany, Spain and Greece examples
       – ACROSSGRID conference in Santiago de Compostela, a great success!
       – Dissemination effort towards new communities (e.g. South-East Europe, Latin America)
       – New application areas are starting to show interest
       – Reinforcing the effort via GridStart (concertation meeting on June 18-19)
       – Starting to establish contacts with companies and final users:
         · Companies interested in middleware and tools
         · Institutions and companies interested as final users
     • Involved in proposals for the new 6th FP:
       – HealthGrid
       – FloodGrid
       – RT Grids…
  21. Extending the GRID in Europe
     • … and pushing for a common grid infrastructure for e-Science in Europe: EGEE
     • Keep in contact with us: http://www.eu-crossgrid.org
     • Thanks in advance for your interest!