  • TeraGrid'06 June 2006 Charlie Catlett (cec@uchicago.edu)
  • DEEP: To ensure that we are harnessing the enormous power of the TeraGrid resources as an integrated system, the Advanced Support for TeraGrid Applications (ASTA) program involves our distributed user support team (~20 people across 8 sites, coordinated by the GIG). ASTA places one-quarter to one-half of a support person in a scientific application team for 4 to 12 months, working with that team to exploit TeraGrid capabilities. The program supports roughly a dozen teams at a time, with a goal of 20-25 teams supported per year. WIDE: For over two decades the NSF high-performance computing program, including TeraGrid, has served several thousand users very effectively. However, NSF alone funds tens of thousands of scientists, most of whom have computational requirements that do not frequently require supercomputers. The TeraGrid Science Gateways program is a set of partnerships with discipline-specific teams who are providing computational infrastructure for their science communities. The partnerships involve integrating TeraGrid as a computational and data-management “service provider” embedded in the science-community infrastructure. Over twenty web-portal-style science gateways are part of this program, with more being added continually. In addition, this program includes a peer-grid interoperation effort with Open Science Grid and a desktop application effort that leverages an NIH-funded biomedical project directed by Rick Stevens at the University of Chicago. OPEN: TeraGrid began as an infrastructure involving four partner sites, grew to nine sites, and is currently organized as a set of resource providers and a central “grid infrastructure group” (GIG) providing central services and support as well as management and operations. This structure allows TeraGrid to grow to dozens of resource-provider sites, making it an “open” partnership.
In addition, the TeraGrid architecture is service-oriented, stressing open-source standards as deployed with key software including the Globus Toolkit (GT4), Condor, and other tools. To partner with the broader community of universities and other service and resource providers, TeraGrid is defining a set of “campus partnerships” during 2006. The goal of these partnership programs is to work with campuses (where most TeraGrid users reside) to improve and streamline TeraGrid access from campus systems, and to develop a set of frameworks that can be used to create national-scale cyberinfrastructure for computation and for data federation. These programs will explicitly reach beyond the R1 institutions to include the broader R&E community.
  • All the Cyberinfrastructure functions and resources have to work as a complete system
  • Usage and allocations by scientific discipline. The left-hand side shows usage and allocation by service units (SU, roughly 1 CPU-hour). The right-hand side shows the number of projects (principal investigators) by discipline. Discipline data come from the field selected by the PI upon application for an allocation and do not necessarily correspond to the source of funding (either within NSF or from other agencies) for the PI. The column chart shows growth in requests for Development Allocation (DAC) awards, which allow up to 10,000 SUs for exploratory work on TeraGrid so that a new PI can port and test codes while writing a proposal for a full allocation. DAC awards typically take 2-3 weeks to be granted, while full allocations are reviewed on a quarterly basis.
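The service-unit accounting described above can be made concrete with a small sketch. The 1 SU ≈ 1 CPU-hour convention and the 10,000-SU DAC cap come from the notes; the job sizes below are hypothetical:

```python
# Sketch of TeraGrid service-unit (SU) accounting: 1 SU is roughly 1 CPU-hour.
# The 10_000-SU cap is the DAC limit from the notes; job sizes are hypothetical.

DAC_LIMIT_SU = 10_000  # maximum SUs in a Development Allocation (exploratory work)

def job_cost_su(cpus: int, hours: float) -> float:
    """Approximate SU charge for one job: CPUs multiplied by wall-clock hours."""
    return cpus * hours

# A hypothetical porting/testing campaign: 20 runs on 64 CPUs for 2 hours each.
cost = 20 * job_cost_su(cpus=64, hours=2.0)
print(cost)                  # 2560.0 SUs
print(cost <= DAC_LIMIT_SU)  # True: fits within a DAC award
```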
  • This Google Map mashup (using www.mapbuilder.net) shows TeraGrid principal investigators for roughly 1,000 allocated projects as of June 2006. Different icons represent different numbers of PIs per institution.
  • Two examples of applications harnessing both the high-performance supercomputers and large data-handling capabilities of TeraGrid.
  • The objectives of the TeraGrid Science Gateways initiative include working with many gateway projects and peer-grid projects to define interaction protocols, mechanisms, and policies such that gateways can be developed to operate with TeraGrid as well as with other grid infrastructures and facilities. This collaborative approach involves international workshops and TeraGrid leadership in initiatives such as the “Grid Interoperation Now” (GIN) group, consisting of some 20 grid facilities from around the world pursuing interoperation in several specific areas during 2006. GIN uses the Global Grid Forum as an administrative home for its activities; see forge.ggf.org.
  • Voice-over: we expect to pass a TeraGrid persistent storage allocation policy at this meeting.

Presentation Transcript

  • TeraGrid Overview. Cyberinfrastructure Days, Internet2, 10/9/07. Mark Sheddon, Resource Provider Principal Investigator, San Diego Supercomputer Center, [email_address]
  • TeraGrid
    • Funded by the Office of Cyberinfrastructure (OCI) within the National Science Foundation (NSF)
      • Grid Infrastructure Group (GIG)
        • Coordination of software development and deployment
        • Integration and tracking of general partnership activities
        • Led by University of Chicago/Argonne National Labs (Dane Skow)
      • Nine Computational Resource Providers (RPs)
      • Four Software Integration Partners
  • TeraGrid Facility Partners, August 2007 (map: Resource Providers (RP): SDSC, TACC, UC/ANL, NCSA, ORNL, PU, IU, PSC, NCAR; Software Integration Partners: Caltech, USC/ISI, UNC/RENCI, UW; Grid Infrastructure Group: UChicago)
  • TeraGrid Objectives
    • DEEP Science: Enable Petascale Science
      • Make science more productive through an integrated set of very-high capability computational resources
        • Address key scientific challenges prioritized by users
    • WIDE Impact: Empower Scientific Communities
      • Bring TeraGrid capabilities to the broad science community
        • Partner with science community leaders – e.g. “Science Gateways”
    • Create an OPEN Infrastructure, OPEN Partnership
      • Provide a coordinated, general purpose, reliable set of services and resources
        • Partner with campuses and facilities
  • More Than Just Fast Computers (diagram of the integrated facility):
    • Computing: over 300 Tflops across 9 Resource Providers
    • Human Support: ASTA, central help desk, education and outreach, training
    • Science Gateways; Scientific Instruments (e.g. ORNL SNS control and data generation)
    • Software: applications, packages, visualization servers, display tools (2D and 3D), search
    • Data Storage & Collections: over 100 collections; retrieval, input, schema, metadata, ontologies, archive; security and access
    • Security: authentication (Shibboleth), authorization, policy; accounting; start-up and large allocations
    • Governance: Resource Providers, campus partners
    • Users: over 3,200 users and over 1,000 PIs (researchers, educators, faculty, students)
  • Users Come From Many Scientific Disciplines
  • TeraGrid Projects by Institution (map legend: blue: 10 or more PIs; red: 5-9 PIs; yellow: 2-4 PIs; green: 1 PI). 1,000 projects, 3,200 users. TeraGrid allocations are available to researchers at any US educational institution by peer review. Exploratory allocations can be obtained through a biweekly review process. See www.teragrid.org.
  • How Can You Get Involved?
    • Apply for an Allocation
      • Computing Resources
      • Applications Support
    • Access through a Science Gateway
    • Participate in EOT Activities
      • Institutes, Workshops, Online Tutorials
      • TG Conference
    • Become a Resource Provider
  • Requesting Allocations of Computer Time
    • TeraGrid resources are provided free to academic researchers and educators through a peer-review process
      • Development Allocations Committee (DAC): requests for start-up accounts of up to 30,000 hours are processed in two weeks
      • Medium Resource Allocations Committee (MRAC): requests of up to 500,000 hours are reviewed four times a year
      • Large Resource Allocations Committee (LRAC): requests of over 500,000 hours are reviewed twice a year
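The three review tiers above amount to a simple routing rule by request size. A minimal sketch, assuming a request at exactly a boundary goes to the smaller committee (the hour thresholds and review cadences are from the slide; the function itself is illustrative):

```python
# Route an allocation request (in CPU-hours) to the TeraGrid review committee
# described in the slide above. Thresholds and cadences are from the slide;
# boundary handling (<= goes to the smaller committee) is an assumption.

def review_committee(hours: int) -> tuple[str, str]:
    """Return (committee, review cadence) for a requested number of CPU-hours."""
    if hours <= 30_000:
        return ("DAC", "processed in about two weeks")   # start-up accounts
    elif hours <= 500_000:
        return ("MRAC", "reviewed four times a year")    # medium requests
    else:
        return ("LRAC", "reviewed twice a year")         # large requests

print(review_committee(10_000))   # ('DAC', 'processed in about two weeks')
print(review_committee(200_000))  # ('MRAC', 'reviewed four times a year')
print(review_committee(750_000))  # ('LRAC', 'reviewed twice a year')
```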
  • Helping Applications Take Best Advantage of TG Resources: Advanced Support for TeraGrid Applications (ASTA). Large data; virtualized resources: earthquake simulation, Olsen (SDSU), Okaya (USC), Jordan (USC), Southern California Earthquake Center. Movie: SDSC.
  • TeraGrid Science Gateways Initiative: Community Interface to Grids
    • Common Web Portal or application interfaces (database access, computation, workflow, etc).
    • “Back-end” use of TeraGrid computation, information management, visualization, or other services.
    (Diagram: gateway interfaces to TeraGrid, Grid-X, and Grid-Y)
  • Gateways are Growing in Numbers
    • 10 initial projects as part of TG proposal
    • >20 Gateway projects today
    • No limit on how many gateways can use TG resources
      • Prepare services and documentation so developers can work independently
    • Open Science Grid (OSG)
    • Special PRiority and Urgent Computing Environment (SPRUCE)
    • National Virtual Observatory (NVO)
    • Linked Environments for Atmospheric Discovery (LEAD)
    • Computational Chemistry Grid (GridChem)
    • Computational Science and Engineering Online (CSE-Online)
    • GEON(GEOsciences Network)
    • Network for Earthquake Engineering Simulation (NEES)
    • SCEC Earthworks Project
    • Network for Computational Nanotechnology and nanoHUB
    • GIScience Gateway (GISolve)
    • Biology and Biomedicine Science Gateway
    • Open Life Sciences Gateway
    • The Telescience Project
    • Grid Analysis Environment (GAE)
    • Neutron Science Instrument Gateway
    • TeraGrid Visualization Gateway, ANL
    • BIRN
    • Gridblast Bioinformatics Gateway
    • Earth Systems Grid
    • Astrophysical Data Repository (Cornell)
  • Wide Variety of HPC Training/Education Options
    • Live/Access Grid Sessions include:
      • Introduction to UT Grid Rodeo
      • Using the National Microbial Pathogen Data Resource
      • BlueGene Applications Workshop
      • Introduction to Parallel Computing
      • Summer Institutes
      • Toward Multicore Petascale Applications
      • Introduction to Scientific Visualization
    • On-line Self-Paced Tutorials
      • Over 35 topics and growing
    • Curricular Focused Workshops
      • Introduction to Interdisciplinary Computational Science Education for Educators
      • Computational Biology for Biology Educators
      • Computational Chemistry for Chemistry Educators
      • Computational Physics for Physics Educators
      • Computing in the Humanities, Arts and Social Sciences
      • Parallel and Cluster Computing
  • Workshop Sites 2007 (map legend: TeraGrid RP; Minority Serving Institution; Research 1 Univ.; 2/4-yr. college; workshop; conference tutorial (e.g. SC, AAAS, AAPT); TeraGrid '07 Conf.)
  • Broadening Participation in TeraGrid: science, technology, and outreach sessions; student competitions for high school, undergraduate, and graduate students! www.teragrid.org/events/teragrid07. Keynote: Anita Jones (U. VA); Technology Keynote: Paul Strong (eBay); Science Keynote: Phil Maechling (USC/SCEC). Over 350 attendees.
  • Coming Down the Track TeraGrid Resource Futures
    • TACC
      • Sun (Ranger), 500 TF (peak), Dec 2007
    • LSU
      • IBM Blade, 25 TF (peak), 1Q08
    • U. Tennessee
      • Cray XT, 1 PF (peak), 1Q09
    • NCSA
      • IBM Power, ~1 PF (sustained), 2011
  • ONWARD! www.teragrid.org
  • Helping Applications Take Best Advantage of TG Resources: Advanced Support for TeraGrid Applications (ASTA). Large data; virtualized resources: earthquake simulation, Olsen (SDSU), Okaya (USC), Southern California Earthquake Center. Sources: Tom Jordan (USC). Images: SDSC.