"21st Century e-Knowledge Requires
 a High Performance e-Infrastructure"

                          Keynote Presentation
                 40-year anniversary Celebration of SARA
                         Amsterdam, Netherlands
                            December 9, 2011



                                    Dr. Larry Smarr
Director, California Institute for Telecommunications and Information Technology
                              Harry E. Gruber Professor,
                    Dept. of Computer Science and Engineering
                        Jacobs School of Engineering, UCSD
                                 http://lsmarr.calit2.net
Abstract

Over the next decade, advances in high performance computing will usher in an era of ultra-realistic scientific and engineering simulation, in fields as varied as climate science, ocean observatories, radio astronomy, cosmology, biology, and medicine. Simultaneously, distributed scientific instruments, high-resolution video streaming, and the global computational and storage cloud all generate terabytes to petabytes of data. Over the last decade, the U.S. National Science Foundation funded the OptIPuter project to research how user-controlled 10Gbps dedicated lightpaths (or "lambdas") could provide direct access to global data repositories, scientific instruments, and computational resources from "OptIPortals," PC clusters which provide scalable visualization, computing, and storage in the user's campus laboratory. All of these components can be integrated into the seamless high performance e-infrastructure required to support a next-generation, data-driven e-knowledge society. In the Netherlands, SARA and its partner SURFnet have taken a global leadership role in building out and supporting such a future-oriented e-infrastructure, enabling the powerful computing, data processing, networking, and visualization e-science services necessary for the pursuit of solutions to an increasingly difficult set of scientific and societal challenges.
Leading Edge Applications of Petascale Computers
Today Are Critical for Basic Research and Practical Apps




[Images: Flames, Supernova, Fusion, Parkinson’s]
Supercomputing the Future
      of Cellulosic Ethanol Renewable Fuels
 Atomic-Detail Model of the Lignocellulose of Softwoods.
The model was built by Loukas Petridis of the ORNL CMB
Molecular Dynamics of Cellulose (Blue) and Lignin (Green)




          Computing the Lignin Force Field
   & Combining With the Known Cellulose Force Field
              Enables Full Simulations
             of Lignocellulosic Biomass

          www.scidacreview.org/0905/pdf/biofuel.pdf
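As context for the force-field computation named above, a classical molecular dynamics potential of the general CHARMM/AMBER family is sketched below. This is the standard textbook functional form, not the specific lignin parameterization produced at ORNL:

    V = \sum_{\text{bonds}} k_b (r - r_0)^2
      + \sum_{\text{angles}} k_\theta (\theta - \theta_0)^2
      + \sum_{\text{dihedrals}} k_\phi \left[ 1 + \cos(n\phi - \delta) \right]
      + \sum_{i<j} \left\{ 4\varepsilon_{ij} \left[ \left( \frac{\sigma_{ij}}{r_{ij}} \right)^{12}
        - \left( \frac{\sigma_{ij}}{r_{ij}} \right)^{6} \right]
        + \frac{q_i q_j}{4\pi\varepsilon_0 r_{ij}} \right\}

"Computing the lignin force field" then amounts to fitting the constants (k_b, k_\theta, k_\phi, the partial charges q_i, and the Lennard-Jones \sigma, \varepsilon) for lignin's monomer units so they can be combined with the existing cellulose parameters in a single simulation.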
Supercomputers
are Designing Quieter Wind Turbines




            Simulation of an Infinite-Span "Flatback"
                       Wind Turbine Airfoil
                          Designed by
         the Netherlands Delft University of Technology
                Using NASA's FUN3D CFD Code
                    Modified by Georgia Tech
                 to Include a Hybrid RANS/LES
                        Turbulence model
  Georgia Institute of Technology Professor Marilyn Smith
    www.ncsa.illinois.edu/News/Stories/Windturbines/
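For readers unfamiliar with the hybrid RANS/LES approach named on this slide, the following is general background on the standard formulation, not a description of the FUN3D modifications themselves. RANS decomposes each velocity component into mean and fluctuation, u_i = \bar{u}_i + u_i', and models the resulting Reynolds stress -\rho\,\overline{u_i' u_j'}; LES instead filters the equations and models only the subgrid scales, for example with a Smagorinsky eddy viscosity:

    \nu_t = (C_s \Delta)^2 \, |\bar{S}|, \qquad
    |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}

A hybrid method applies the RANS closure near walls and switches to LES in massively separated regions, e.g. via the detached-eddy length scale \tilde{d} = \min(d_w,\, C_{DES}\Delta), where d_w is the wall distance and \Delta the local grid spacing.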
Increasing the Efficiency of Tractor Trailers
                     Using Supercomputers
Oak Ridge Leadership Computing Facility & the Viz Team
  (Dave Pugmire, Mike Matheson, and Jamison Daniel)




BMI Corporation, an engineering services firm, has teamed up with ORNL, NASA, and several BMI corporate partners with large trucking fleets.
Realistic Southern California
Earthquake Supercomputer Simulations




                 Magnitude 7.7 Earthquake
 http://visservices.sdsc.edu/projects/scec/terashake/2.1/
Tornadogenesis From Severe Thunderstorms
       Simulated by Supercomputer




    Source: Donna Cox, Robert Patterson, Bob Wilhelmson, NCSA
Improving Simulation of the Distribution of Water Vapor
               in the Climate System




        ORNL Simulations by Jim Hack; Visualizations by Jamison Daniel
          http://users.nccs.gov/~d65/CCSM3/TMQ/TMQ_CCSM3.html
21st Century e-Knowledge Cyberinfrastructure:
       Built on a 10Gbps "End-to-End" Lightpath Cloud

[Diagram: an End User OptIPortal connects through a Campus Optical Switch and 10G Lightpaths to HPC, Local or Remote Instruments, HD/4k Live Video, Data Repositories & Clusters, and HD/4k Video Repositories.]
The Global Lambda Integrated Facility:
  Creating a Planetary-Scale High Bandwidth Collaboratory

Research Innovation Labs Linked by 10G Dedicated Lambdas




         www.glif.is/publications/maps/GLIF_5-11_World_2k.jpg
SURFnet – a SuperNetwork Connecting to
  the Global Lambda Integrated Facility




          www.glif.is




               Visualization courtesy of
           Donna Cox, Bob Patterson, NCSA.
The OptIPuter Project: Creating High Resolution Portals
Over Dedicated Optical Channels to Global Science Data
[Image: an OptIPortal running the Scalable Adaptive Graphics Environment (SAGE). Picture source: Mark Ellisman, David Lee, Jason Leigh]

       Calit2 (UCSD, UCI), SDSC, and UIC Leads—Larry Smarr PI
       Univ. Partners: NCSA, USC, SDSU, NW, TA&M, UvA, SARA, KISTI, AIST
       Industry: IBM, Sun, Telcordia, Chiaro, Calient, Glimmerglass, Lucent
The Latest OptIPuter Innovation:
Quickly Deployable Nearly Seamless OptIPortables
 45 minute setup, 15 minute tear-down with two people (possible with one)




[Images: an OptIPortable and its shipping case, from the Calit2 KAUST Lab]
The OctIPortable
Calit2/KAUST at SIGGRAPH 2011




        Photo: Tom DeFanti
3D Stereo Head Tracked OptIPortal:
            NexCAVE




  Array of JVC HDTV 3D LCD Screens
    KAUST NexCAVE = 22.5MPixels
 www.calit2.net/newsroom/article.php?id=1584
       Source: Tom DeFanti, Calit2@UCSD
Green Initiative:
Can Optical Fiber Replace Airline Travel for Continuing Collaborations?
Source: Maxine Brown, OptIPuter Project Manager
EVL’s SAGE OptIPortal VisualCasting
Multi-Site OptIPuter Collaboratory

CENIC CalREN-XD Workshop, Sept. 15, 2008
Total Aggregate VisualCasting Bandwidth for Nov. 18, 2008:
EVL-UI Chicago Sustained 10,000-20,000 Mbps!

SC08 Bandwidth Challenge Entry: Streaming 4k
at Supercomputing 2008, Austin, Texas, November 2008

On site: SARA (Amsterdam), U Michigan, GIST/KISTI (Korea), Osaka Univ. (Japan)
Remote: U of Michigan, UIC/EVL, U of Queensland, Russian Academy of Science, Masaryk Univ. (CZ)

Requires a 10 Gbps Lightpath to Each Site



              Source: Jason Leigh, Luc Renambot, EVL, UI Chicago
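A back-of-envelope calculation (my own arithmetic, not from the slide) shows why each 4k stream needs a dedicated lightpath. The frame size and rate below are illustrative assumptions (UHD 3840x2160, 8-bit RGB, 30 fps):

    # Bandwidth of one uncompressed 4K video stream.
    width, height = 3840, 2160      # assumed UHD frame; DCI 4K would be 4096x2160
    bits_per_pixel = 24             # 8-bit RGB
    fps = 30
    gbps = width * height * bits_per_pixel * fps / 1e9
    print(f"One uncompressed 4K stream: {gbps:.1f} Gbps")  # ~6.0 Gbps

At roughly 6 Gbps per uncompressed stream, a shared campus network cannot reliably carry even one such flow, while a dedicated 10 Gbps lightpath handles it with headroom; several sites streaming simultaneously is what pushes the aggregate into the 10,000-20,000 Mbps range shown.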
High Definition Video Connected OptIPortals:
Virtual Working Spaces for Data Intensive Research (2010)

NASA Supports Two Virtual Institutes

[Image: LifeSize HD videoconferencing over the Calit2@UCSD 10Gbps link to the NASA Ames Lunar Science Institute, Mountain View, CA]

Source: Falko Kuester, Kai Doerr, Calit2; Michael Sims, Larry Edwards, Estelle Dodson, NASA
Genomic Sequencing
 is Driving Big Data




  November 30, 2011
BGI—The Beijing Genome Institute
          is the World’s Largest Genomic Institute




• Main Facilities in Shenzhen and Hong Kong, China
   – Branch Facilities in Copenhagen, Boston, UC Davis
• 137 Illumina HiSeq 2000 Next Generation Sequencing Systems
   – Each Illumina Next Gen Sequencer Generates 25 Gigabases/Day
• Supported by Supercomputing ~160TF, 33TB Memory
   – Large-Scale (12PB) Storage
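The slide's own numbers imply a striking aggregate data rate; a quick sketch of the arithmetic follows (the ~1 byte per called base is my assumption; raw instrument output is far larger):

    # Aggregate throughput of BGI's sequencer fleet, from the slide's figures.
    sequencers = 137
    gbases_per_day = 25                        # per Illumina HiSeq 2000
    total_gbases = sequencers * gbases_per_day
    tb_per_day = total_gbases * 1e9 / 1e12     # assume ~1 byte per called base
    print(f"{total_gbases:,} Gbases/day, roughly {tb_per_day:.1f} TB/day")
    # 3,425 Gbases/day, roughly 3.4 TB/day of base calls

At multiple terabytes of base calls per day, sustained wide-area lightpaths, not shipped disks, become the natural way to move the data to analysis sites.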
Using Advanced Info Tech and Telecommunications
       to Accelerate Response to Wildfires




The Harris Fire, San Diego, early on October 23, 2007




             Photo by Bill Clayton, http://map.sdsu.edu/
NASA’s Aqua Satellite’s MODIS Instrument
        Pinpoints the 14 SoCal Fires
Calit2, SDSU, and NASA Goddard Used NASA Prioritization and OptIPuter Links
to Cut Time to Receive Images from 24 to 3 Hours




                                    October 22, 2007
      Moderate Resolution Imaging Spectroradiometer (MODIS)

                       NASA/MODIS Rapid Response
               www.nasa.gov/vision/earth/lookingatearth/socal_wildfires_oct07.html
High Performance Sensornets
[Map: HPWREN Topology, August 2008. Hans-Werner Braun, HPWREN PI. Backbone/relay nodes connect astronomy, biology, and earth science sites, university sites (UCSD, SDSU), researcher locations, Native American sites, and First Responder sites, over links spanning roughly 50 to 70+ miles. Link classes: 155Mbps FDX at 6 and 11 GHz (FCC licensed); 45Mbps FDX at 6 and 11 GHz (licensed) and 5.8 GHz (unlicensed); 45Mbps-class HDX at 4.9 GHz and 5.8 GHz (unlicensed); ~8Mbps HDX at 2.4/5.8 GHz (unlicensed); ~3Mbps HDX at 2.4 GHz (unlicensed); 115kbps HDX at 900 MHz (unlicensed); 56kbps via the RCS network; dashed links = planned.]
Situational Awareness for Wildfires: Combining HD VTC
   with Satellite Images, HPWREN Cameras & Sensors

Ron Roberts, San Diego County Supervisor




                               Howard Windsor, San Diego CalFIRE Chief




                 Source: Falko Kuester, Calit2@UCSD
The NSF-Funded Ocean Observatories Initiative With a
Cyberinfrastructure for a Complex System of Systems




       Source: Matthew Arrott, Calit2 Program Manager for OOI CI
From Digital Cinema to Scientific Visualization:
       JPL Simulation of Monterey Bay
                                                   4k Resolution




       Source: Donna Cox, Robert Patterson, NCSA
            Funded by NSF LOOKING Grant
OOI CI is Built on NLR/I2 Optical Infrastructure:
Physical Network Implementation

Source: John Orcutt, Matthew Arrott, SIO/Calit2
A Near Future Metagenomics
Fiber Optic Cable Observatory




Source: John Delaney, UWash
NSF Funds a Big Data Supercomputer:
SDSC’s Gordon, Dedicated Dec. 5, 2011
• Data-Intensive Supercomputer Based on
  SSD Flash Memory and Virtual Shared Memory SW
  – Emphasizes MEM and IOPS over FLOPS
  – Supernode has Virtual Shared Memory:
     – 2 TB RAM Aggregate
     – 8 TB SSD Aggregate
     – Total Machine = 32 Supernodes
     – 4 PB Disk Parallel File System >100 GB/s I/O
• System Designed to Accelerate Access
  to Massive Databases being Generated in
  Many Fields of Science, Engineering, Medicine,
  and Social Science

            Source: Mike Norman, Allan Snavely SDSC
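Totaling the per-supernode figures above gives the machine-wide capacities (my arithmetic from the slide's numbers):

    # Machine-wide Gordon capacities implied by 32 supernodes.
    supernodes = 32
    print(f"RAM total: {supernodes * 2} TB")   # 2 TB/supernode -> 64 TB
    print(f"SSD total: {supernodes * 8} TB")   # 8 TB/supernode -> 256 TB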
Rapid Evolution of 10GbE Port Prices
   Makes Campus-Scale 10Gbps CI Affordable
    • Port Pricing is Falling
    • Density is Rising – Dramatically
    • Cost of 10GbE Approaching Cluster HPC Interconnects
2005: Chiaro, $80K/port (60 ports max)
2007: Force 10, $5K/port (40 ports max)
2009: Arista, $500/port (48 ports)
2010: Arista, $400/port (48 ports); ~$1000/port (300+ ports max)




            Source: Philip Papadopoulos, SDSC/Calit2
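The chart's endpoints imply an annualized rate of decline worth stating explicitly (my arithmetic):

    # Implied rate of 10GbE per-port price decline, 2005-2010.
    price_2005, price_2010 = 80_000, 400   # $/port, from the chart above
    factor = price_2005 / price_2010       # 200x overall
    annual = factor ** (1 / 5)             # per-year decline over 5 years
    print(f"{factor:.0f}x over 5 years, about {annual:.1f}x cheaper per year")
    # 200x over 5 years, roughly 2.9x per year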
Arista Enables SDSC’s Massively Parallel
10G Switched Data Analysis Resource

[Diagram: radical change enabled by the Arista 7508 10G switch (384 10G-capable ports). 10Gbps OptIPuter, UCSD RCI, Co-Lo, and CENIC/NLR links connect the Triton, Trestles (100 TF), Dash, and Gordon systems to existing commodity storage (1/3 PB) and 2000 TB of new Oasis storage.]

Oasis Procurement (RFP): > 50 GB/s
• Phase 0: > 8 GB/s sustained today
• Phase I: > 50 GB/s for Lustre (May 2011)
• Phase II: > 100 GB/s (Feb 2012)

                       Source: Philip Papadopoulos, SDSC/Calit2
The Next Step for Data-Intensive Science:
       Pioneering the HPC Cloud


Editor's Notes

  • #6 The team is placing large eddy simulation (LES) methods into classic computational fluid dynamics (CFD) Reynolds-averaged Navier-Stokes (RANS) codes to see if they can more accurately model the physics than what is currently possible with the original turbulence models. The team is modifying two NASA simulation platforms currently used by the wind and rotorcraft industries, OVERFLOW and FUN3D, in order to better account for some of the dominant sources of aeroacoustically generated noise, such as turbulence-induced and blade-vortex-induced noise. "Whereas the RANS models usually can pick up the gross features, if there is a lot of turbulence involved—a lot of flow separation—then they miss a lot of the secondary and tertiary features that can be important for performance, for vibration, and also for noise," Smith explains. The team is also investigating the efficiency and accuracy of combining LES and adaptive mesh refinement (AMR) methods in the turbine wakes. Blades rotating against the stationary tower and nacelle—the housing for the gears and dynamo—generate a very dynamic and turbulent wake dominated by powerful tip vortices and separated flow closer to the center. This dynamic nature often means there are regions of the wake with either too much or too little grid resolution just for an instant, increasing the computation time or degrading the results. Linking LES with adaptive meshing, refining, and coarsening focuses computational power where and when it is needed. This task won't be easy; adaptive meshing presents many complications, such as feature detection and load-balancing. But, once these difficulties are overcome, the outcome will be a better result at a lower price, according to Christopher Stone, a local small-business subcontractor of Computational Science and Engineering (LLC) and member of Smith's Georgia Tech team. The team is running their simulations on NCSA's Mercury cluster, an Intel Itanium 2 cluster with 1,774 nodes and 10TF.