The OptIPuter and Its Applications


Invited Talk
Cyberinfrastructure for Humanities, Arts, and Social Sciences, A Summer Institute, SDSC
La Jolla, CA


  1. The OptIPuter and Its Applications
     Invited Talk, Cyberinfrastructure for Humanities, Arts, and Social Sciences: A Summer Institute
     SDSC, UCSD, July 26, 2006
     Dr. Larry Smarr, Director, California Institute for Telecommunications and Information Technologies
     Harry E. Gruber Professor, Dept. of Computer Science and Engineering, Jacobs School of Engineering, UCSD
  2. From "Supercomputer-Centric" to "Supernetwork-Centric" Cyberinfrastructure
     Optical WAN Research Bandwidth Has Grown Much Faster Than Supercomputer Speed!
     (Chart: Bandwidth of NYSERNet Research Network Backbones, from T1 to 32x10Gb "Lambdas" (Megabit/s to Gigabit/s to Terabit/s), Plotted Against Computing Speed in GFLOPS, from the 1 GFLOP Cray2 to the 60 TFLOP Altix)
     Network Data Source: Timothy Lance, President, NYSERNet
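A quick calculation supports the slide's claim. The T1 rate (1.544 Mb/s) is a standard figure, not stated on the slide; everything else comes from the chart's endpoints.

```python
# Growth factors behind the slide's claim: T1 to 32 x 10 Gb lambdas
# on the network side, versus the 1 GFLOP Cray2 to the 60 TFLOP Altix
# on the computing side. T1 = 1.544 Mb/s is a standard figure assumed
# here, not taken from the slide.
t1_bps = 1.544e6
lambdas_bps = 32 * 10e9

cray2_gflops = 1
altix_gflops = 60_000

bw_factor = lambdas_bps / t1_bps
flops_factor = altix_gflops / cray2_gflops
print(f"bandwidth grew ~{bw_factor:,.0f}x, compute ~{flops_factor:,.0f}x")
```

Bandwidth grew by a factor of roughly 200,000 over the period, versus roughly 60,000 for supercomputer speed, which is the quantitative content of the slide's headline.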
  3. The OptIPuter Project: Creating a "SuperWeb" for Data-Intensive Researchers
     - NSF Large Information Technology Research Proposal
       - Calit2 (UCSD, UCI) and UIC Lead Campuses; Larry Smarr PI
       - Partners: SDSC, USC, SDSU, NCSA, NW, TA&M, UvA, SARA, NASA Goddard, KISTI, AIST, CRC (Canada), CICESE (Mexico)
     - Industrial Partners: IBM, Sun, Telcordia, Chiaro, Calient, Glimmerglass, Lucent
     - $13.5 Million Over Five Years; Now in the Fourth Year
     Application Drivers: NIH Biomedical Informatics, NSF EarthScope and ORION Research Network
  4. What is the OptIPuter?
     - Applications Drivers -> Interactive Analysis of Large Data Sets
     - OptIPuter Nodes -> Scalable PC Clusters with Graphics Cards
     - IP over Lambda Connectivity -> Predictable Backplane
     - Open Source LambdaGrid Middleware -> Network is Reservable
     - Data Retrieval and Mining -> Lambda-Attached Data Servers
     - High-Definition Vis. and Collaboration SW -> High Performance Collaboratory
     See the Nov 2003 Communications of the ACM for Articles on OptIPuter Technologies
  5. Dedicated Optical Channels Make High Performance Cyberinfrastructure Possible
     Parallel Lambdas (WDM Wavelengths) Are Driving Optical Networking the Way Parallel Processors Drove 1990s Computing
     Source: Steve Wallach, Chiaro Networks
  6. National LambdaRail (NLR) and TeraGrid Provide the Cyberinfrastructure Backbone for U.S. Researchers
     - NLR: 4 x 10Gb Lambdas Initially, Capable of 40 x 10Gb Wavelengths at Buildout
     - Links Two Dozen State and Regional Optical Networks
     - DOE, NSF, and NASA Using NLR
     - NSF's TeraGrid Has a 4 x 10Gb Lambda Backbone
     - International Collaborators Connect via UIC/NW-StarLight in Chicago
     (Map Nodes: Seattle, Portland, Boise, San Francisco, Ogden/Salt Lake City, Denver, Los Angeles, San Diego, Phoenix, Albuquerque, Las Cruces/El Paso, San Antonio, Houston, Baton Rouge, Pensacola, Dallas, Tulsa, Kansas City, Chicago, Cleveland, Pittsburgh, Atlanta, Jacksonville, Raleigh, Washington, DC, New York City)
  7. Creating a North American Superhighway for High Performance Collaboration
     Next Step: Adding Mexico to Canada's CANARIE and the U.S. National LambdaRail
  8. OptIPuter Scalable Adaptive Graphics Environment (SAGE) Allows Integration of HD Streams
     OptIPortal: Termination Device for the OptIPuter Global Backplane
  9. OptIPortal: Termination Device for the OptIPuter Global Backplane
     - 20 Dual-CPU Nodes, 20 24" Monitors, ~$50,000
     - 1/4 Teraflop, 5 Terabytes of Storage, 45 Megapixels: a Nice PC!
     - Scalable Adaptive Graphics Environment (SAGE), Jason Leigh, EVL-UIC
     Source: Phil Papadopoulos, SDSC, Calit2
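The ~45-megapixel figure above follows directly from the tile count; a back-of-the-envelope check, assuming each 24" monitor runs at 1920x1200 (WUXGA), which the slide does not state:

```python
# Sanity check of the OptIPortal's aggregate pixel count.
# The per-monitor resolution of 1920x1200 is an assumption; the
# slide only gives the ~45 megapixel total.
monitors = 20
width, height = 1920, 1200

total_pixels = monitors * width * height
print(f"{total_pixels / 1e6:.2f} megapixels")  # ~46.08 MP, matching the ~45 MP quoted
```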
  10. The World's Largest Tiled Display Wall: Calit2@UCI's HIPerWall
      - Apple Tiled Display Wall Driven by 25 Dual-Processor G5s
      - 50 Apple 30" Cinema Displays: 200 Million Pixels of Viewing Real Estate!
      - Falko Kuester and Steve Jenks, PIs; Featured in Apple Computer's "Hot News"
      - Content Sources: Zeiss Scanning Electron Microscope Center of Excellence in Calit2@UCI (Albert Yee, PI), HDTV Digital Cameras, Digital Cinema
  11. 3D Videophones Are Here! The Personal Varrier Autostereo Display
      - Varrier is a Head-Tracked Autostereo Virtual Reality Display
        - 30" LCD Widescreen Display with 2560x1600 Native Resolution
        - A Photographic Film Barrier Screen Affixed to a Glass Panel
          - The Barrier Screen Reduces the Horizontal Resolution to 640 Lines
      - Cameras Track the Face with a Neural Net to Locate the Eyes
      - The Display Eliminates the Need to Wear Special Glasses
      Source: Daniel Sandin, Thomas DeFanti, Jinghua Ge, Javier Girado, Robert Kooima, Tom Peterka (EVL, UIC)
  12. How Do You Get From Your Lab to the National LambdaRail?
      "Research is being stalled by 'information overload,' Mr. Bement said, because data from digital instruments are piling up far faster than researchers can study them. In particular, he said, campus networks need to be improved. High-speed data lines crossing the nation are the equivalent of six-lane superhighways, he said. But networks at colleges and universities are not so capable. 'Those massive conduits are reduced to two-lane roads at most college and university campuses,' he said. Improving cyberinfrastructure, he said, 'will transform the capabilities of campus-based scientists.'"
      -- Arden Bement, Director of the National Science Foundation
  13. To Build a Campus Dark Fiber Network, First Find Out Where All the Campus Conduit Is!
  14. UCSD Campus-Scale Routed OptIPuter with Nodes for Storage, Computation, and Visualization
  15. The New Optical Core of the UCSD Campus-Scale Testbed: Evaluating Packet Routing versus Lambda Switching
      Goals by 2007:
      - >= 50 Endpoints at 10 GigE
      - >= 32 Packet-Switched Ports
      - >= 32 Switched Wavelengths
      - >= 300 Connected Endpoints
      Approximately 0.5 Tbit/s Arrives at the "Optical" Center of Campus
      Switching Will Be a Hybrid Combination of Packet, Lambda, and Circuit; OOO and Packet Switches Already in Place
      Funded by an NSF MRI Grant; Hardware: Lucent, Glimmerglass, Force10
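The ~0.5 Tbit/s figure on this slide is just the endpoint goal multiplied out; a one-line check:

```python
# The slide's aggregate figure: 50 endpoints, each at 10 Gigabit
# Ethernet, arriving at the optical center of campus.
endpoints = 50
gige_rate_gbps = 10

aggregate_gbps = endpoints * gige_rate_gbps
print(aggregate_gbps / 1000, "Tbit/s")  # 0.5 Tbit/s
```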
  16. OptIPuter@UCI is Up and Working
      Created 09-27-2005 by Garrett Hildebrand; Modified 11-03-2005 by Jessica Yu
      (Network diagram: a 10 GE DWDM line runs from the ONS 15540 WDM at the UCI campus MPOE (CPL) to a 10 GE SPDS Catalyst 3750 in CSI. The Engineering Gateway Building has a Catalyst 3750 in the 3rd-floor IDF and a Catalyst 6500 with firewall in the 1st-floor MDF closet, plus Catalyst 6500s on floors 2-4. Two 1 GE waves link the Calit2 Building, Viz Lab, HIPerWall, and UCInet to the ESMF Catalyst 3750 in the NACS machine room (OptIPuter): Wave-1 (UCSD address space, NACS-reserved for testing) and Wave-2 (layer-2 GE, UCSD address space). A 1 GE DWDM line runs via Tustin to the CENIC CalREN POP in Los Angeles and on to the UCSD OptIPuter network. Kim's jitter measurements this week!)
  17. Calit2/SDSC Proposal to Create a UC Cyberinfrastructure of OptIPuter "On-Ramps" to TeraGrid Resources
      OptIPuter + CalREN-XD + TeraGrid = "OptiGrid"
      Creating a Critical Mass of End Users on a Secure LambdaGrid
      Campuses: UC Berkeley, UC Davis, UC Irvine, UC Los Angeles, UC Merced, UC Riverside, UC San Diego, UC San Francisco, UC Santa Barbara, UC Santa Cruz
      Source: Fran Berman, SDSC; Larry Smarr, Calit2
  18. OptIPuter Software Architecture: a Service-Oriented Architecture Integrating Lambdas Into the Grid
      - Distributed Applications / Web Services: Telescience, Vol-a-Tile, SAGE, JuxtaView
      - Visualization and Data Services: LambdaRAM
      - Distributed Virtual Computer (DVC) API and DVC Runtime Library
      - DVC Services: Resource Identify/Acquire, Namespace Management, Security Management, High-Speed Communication, Storage Services
      - DVC Core Services: DVC Job Scheduling, DVC Communication
      - Transport Protocols over Globus XIO: GTP, XCP, UDT, LambdaStream, CEP, RBUDP; GRAM, GSI
      - Storage: PIN/PDC, RobuStore
      - IP Lambdas: Discovery and Control
      Source: Andrew Chien, UCSD
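The stack above includes several UDP-based transports (RBUDP, UDT, LambdaStream) because standard TCP needs a congestion window at least as large as the bandwidth-delay product (BDP) to fill a dedicated lambda, far beyond default OS window sizes. A sketch of the arithmetic, with an illustrative link rate and round-trip time that are assumptions rather than figures from the slide:

```python
# Bandwidth-delay product for a single dedicated lambda.
# Link rate and RTT are illustrative assumptions: one 10 Gb/s
# wavelength and a ~60 ms coast-to-coast round trip.
link_rate_bps = 10e9
rtt_s = 0.060

bdp_bytes = link_rate_bps / 8 * rtt_s
print(f"BDP = {bdp_bytes / 1e6:.0f} MB in flight to keep the pipe full")
```

Tens of megabytes in flight per connection is why the OptIPuter middleware treats the network as a reservable backplane and uses aggressive, lambda-aware transports rather than stock TCP.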
  19. Announced January 17, 2006: $24.5M Over Seven Years, PI Larry Smarr
  20. Marine Genome Sequencing Project: Measuring the Genetic Diversity of Ocean Microbes
      CAMERA's Sorcerer II Data Will Double the Number of Proteins in GenBank!
  21. Announced January 17, 2006
  22. CAMERA's Direct Access Core Architecture Will Create a Next-Generation Metagenomics Server
      Data Sources:
      - Sargasso Sea Data
      - Sorcerer II Expedition (GOS)
      - JGI Community Sequencing Project
      - Moore Marine Microbial Project
      - NASA Goddard Satellite Data
      - Community Microbial Metagenomics Data
      Architecture: a Web Portal Fronts a Flat-File Server Farm, a Database Farm, and a Dedicated Compute Farm (1000 CPUs) on a 10 GigE Fabric. Users Reach It via Traditional Request/Response, Web Services, or Direct-Access Lambda Connections from a Local Cluster; the TeraGrid Cyberinfrastructure Backplane (10,000s of CPUs) Handles Scheduled Activities, e.g. All-by-All Comparison.
      Source: Phil Papadopoulos, SDSC, Calit2
  23. The Future Home of the Moore Foundation-Funded Marine Microbial Ecology Metagenomics Complex
      First Implementation of the CAMERA Complex; Major Buildout of the Calit2 Server Room Underway
      Photo Courtesy Joe Keefe, Calit2
  24. Calit2 and the Venter Institute Will Combine Telepresence with Remote Interactive Analysis
      Live Demonstration of 21st-Century National-Scale Team Science: OptIPuter Visualized Data, HDTV Over Lambda, Venter Institute 25 Miles Away
  25. UIC/UCSD 10GE CAVEWave on the National LambdaRail
      CAVEWave Connects Chicago to Seattle to San Diego, with Washington D.C. Added 4/1/06 and JCVI Added 5/15/06
      Emerging OptIPortal Sites: SunLight, CICESE, UW, JCVI, MIT, SIO, UCSD, SDSU, UIC, EVL, UCI
  26. iGrid 2005: The Global Lambda Integrated Facility
      - September 26-30, 2005, Calit2 @ University of California, San Diego (California Institute for Telecommunications and Information Technology)
      - Maxine Brown, Tom DeFanti, Co-Chairs
      - Borderless Collaboration Between Global University Research Centers at 10Gbps
      - 100Gb of Bandwidth into the Calit2@UCSD Building; More than 150Gb of GLIF Transoceanic Bandwidth!
      - 450 Attendees, 130 Participating Organizations, 20 Countries Driving 49 Demonstrations at 1 or 10 Gbps per Demo
  27. CineGrid™ Leverages OptIPuter Cyberinfrastructure to Enable Global "Extreme Media" Collaboration
      CineGrid™ Experiments Aim to Push the State of the Art in:
      - Streaming and Store-and-Forward File Transfer Using High-Speed, Low-Latency Network Protocols
      - HDTV in Various Formats for Teleconferencing, Telepresence, and Production
      - 2K and 4K Digital Cinema Workflows and Distribution
      - Stereo in High Resolution (2K, 4K) and Virtual Reality in Higher Resolution (24 Megapixel)
      - Distributed Tiled Displays with 20-100 Megapixels
      - Long-Term Digital Archiving
      International Workshops: Tokyo, July 2006; Calit2, December 2006
      Source: Tom DeFanti, Laurin Herr
  28. OptIPuter 4K Telepresence over IP at iGrid 2005 Lays the Technical Basis for Global Digital Cinema
      Partners: Sony, NTT, SGI, Keio University (President Anzai), UCSD (Chancellor Fox), in the New Calit2 Digital Cinema Auditorium
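To see why 4K telepresence needs lambda-class links, consider the uncompressed data rate of a 4K cinema stream. The frame geometry and bit depth below are standard DCI figures assumed for illustration, not values from the slide:

```python
# Rough uncompressed data rate for a 4K digital cinema stream.
# DCI 4K frame, 12-bit RGB (4:4:4), 24 fps -- assumed parameters.
width, height = 4096, 2160
channels, bits = 3, 12
fps = 24

rate_bps = width * height * channels * bits * fps
print(f"~{rate_bps / 1e9:.1f} Gbit/s uncompressed")  # ~7.6 Gbit/s
```

Even with compression, sustained multi-gigabit flows of this kind are comfortably beyond commodity campus networking, which is the case CineGrid makes for dedicated optical channels.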
  29. Calit2 Works with CENIC to Provide the California Optical Core for CineGrid™
      - Extended the SoCal OptIPuter to the USC School of Cinema-Television; CineGrid™ Will Link UCSD/Calit2 and the USC School of Cinema-TV with Keio University's Research Institute for Digital Media and Content
      - Prototype of a CineGrid™ Digital Archive of Films at Calit2 UCSD
      - Partnering with SFSU's Institute for Next Generation Internet
      - Plus 1Gb and 10Gb Connections to: Seattle, Canada, Japan, Asia, Australia, New Zealand; Chicago, Canada, Japan, Europe, Russia, China; Tijuana
      Source: Laurin Herr, Pacific Interface, CineGrid™ Project Leader
  30. Calit2 and the Venter Institute Test CineGrid™ with an HDTV Movie by John Carter
      Live Demonstration of 21st-Century Entertainment Delivery, June 14, 2006: Sony HDTV JH-3 from the JC Venter Institute, Rockville, MD, via StarLight Chicago to the Calit2 Auditorium
  31. iGrid 2005: Kyoto Nijo Castle
      Interactive VR Streamed Live from Tokyo to Calit2 Over a Dedicated GigE and Projected at 4K Resolution
      Source: Toppan Printing
  32. iGrid 2005 Cultural Heritage: China and USA
      Data Acquisition from Laser Scanning Combined with Photogrammetry Enables the Construction of Unique Cultural Heritage Images from China. 3D Designs of the Great Wall Are Combined with 3D Scans of Physical Images and Satellite Imagery Stored on Servers in China and San Diego.
      Partners: International Media Centre (China), San Diego State University (USA), Great Wall Society (China), San Diego Supercomputer Center (USA), Chinese Academy of Sciences (China), GLORIAD (USA), Chinese Institute of Surveying and Mapping (China), Cybermapping Lab at the University of Texas-Dallas (USA), GEON Viz/3D-Scanning Lab at the University of Idaho (USA), Stanford University (USA)
      Source: Maxine Brown, EVL UIC
  33. NSF's Ocean Observatories Initiative (OOI) Envisions Global, Regional, and Coastal Scales
      $300M in the President's Budget for OOI
      LEO15 Inset Courtesy of Rutgers University, Institute of Marine and Coastal Sciences
  34. Coupling Regional and Coastal Ocean Observatories Using OptIPuter and Web/Grid Services
      LOOKING (Laboratory for the Ocean Observatory Knowledge Integration Grid), Funded by NSF ITR; John Delaney, UWash, PI
  35. Using the OptIPuter to Couple Data Assimilation Models to Remote Data Sources, Including Biology
      Regional Ocean Modeling System (ROMS); NASA MODIS Mean Primary Productivity for April 2001 in the California Current System
  36. Interactive Remote Data and Visualization Services
      - Visualization Services: Multiple Scalable Displays, Hardware Pixel Streaming, Distributed Collaboration, Scientific-Info Visualization, AMR Volume Visualization, Glyph and Feature Vis
      - Data Mining Services: Data Mining for Areas of Interest, Analysis and Feature Extraction
      NCSA Altix Data and Vis Server Linking to the OptIPuter; an SDSC/NCSA Data Collaboration under the National Laboratory for Advanced Data Research
  37. The Synergy of Digital Art and Science: Visualization of a JPL Simulation of Monterey Bay at 4K Resolution
      Source: Donna Cox, Robert Patterson, NCSA; Funded by the NSF LOOKING Grant
  38. First Remote Interactive High-Definition Video Exploration of Deep-Sea Vents, Prototyping NEPTUNE
      A Canadian-U.S. Collaboration; Source: John Delaney and Deborah Kelley, UWash
  39. High-Definition Still Frame of Hydrothermal Vent Ecology, 2.3 km Deep
      White Filamentous Bacteria on a 'Pill Bug' Outer Carapace (Scale Bar: 1 cm)
      Source: John Delaney and Research Channel, U Washington