An Instantaneous Introduction to the Alliance Access Grid

  1. An Instantaneous Introduction to the Alliance Access Grid. Michael Grobe, Assistant Director, Academic Computing Services, The University of Kansas. September 2000.
  2. The Access Grid is an Internet-based model for video conferencing developed by the Futures Lab (FL) within the Mathematics and Computer Science (MCS) division of Argonne National Laboratory (ANL). The Access Grid is an extension of the Alliance Computational Grid, a distributed computing environment designed to give any network user convenient access to high-performance computer systems.
  3. As described on the Access Grid web site: "The Access Grid is the ensemble of resources that can be used to support human interaction across the grid. It consists of:
    • multimedia display, presentation and interaction environments,
    • interfaces to grid middleware, and
    • interfaces to visualization environments.
  4. The Access Grid will support large-scale distributed meetings, collaborative work sessions, seminars, lectures, tutorials and training. The Access Grid design point is group-to-group communication (thus differentiating it from desktop-to-desktop tools that are focused on individual communication)." The Access Grid also includes the notion of a "persistent" video conferencing venue: a conferencing site operating continuously and accessible to a wide audience of users on an ad hoc basis.
  5. Basic functionality. An Access Grid "node" will be a small conference room or auditorium, provisioned with the equipment to participate in a multipoint video conference. The basic functionality provided within the node is:
    • audio encoding using one or more microphones,
    • video encoding or "capture" using one or more cameras,
    • audio presentation using one or more speakers,
    • video display via one or more computer monitors and/or video projection techniques, and
    • display of PowerPoint "slides" under the control of a presenter located either on-site or at a remote site.
  6. To achieve this functionality, the Access Grid model relies upon the ability to send and receive Internet multicast traffic to and from all conference nodes. The Access Grid is based on software (vic and rat) developed as part of the Internet multicast backbone, or MBONE, which provided multicast services over the unicast Internet backbone (using "tunnels", or "bridges", between multicast nexus sites).
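The multicast subscription that vic and rat perform when they join a venue can be sketched with Python's socket API. This is an illustrative sketch, not Access Grid code; the group address and port below are placeholders, not real venue assignments:

```python
import socket
import struct

# Placeholder group/port for illustration; real venues advertise their own.
GROUP = "224.2.177.155"
PORT = 55524

def make_membership_request(group: str) -> bytes:
    """Build the IP_ADD_MEMBERSHIP structure: the multicast group address
    followed by the local interface (INADDR_ANY here)."""
    return struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))

def open_multicast_receiver(group: str, port: int) -> socket.socket:
    """Open a UDP socket joined to a multicast group, roughly as media
    tools do internally when subscribing to a venue's streams."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    make_membership_request(group))
    return sock
```

Every node that joins the same group receives every other node's packets, which is what makes multipoint conferencing scale without per-site connections.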
  7. Software components. The Access Grid model revolves around two pieces of software:
    • vic: the video conferencing tool, and
    • rat: the robust audio tool,
  and involves several other applications:
    • Distributed PowerPoint,
    • a MUD,
    • the Multicast Beacon,
    • Virtual Venue, and
    • Virtual Network Computing.
  8. Video Conference tool (vic). Vic was developed by Steve McCanne and Van Jacobson at Lawrence Berkeley National Laboratory. It is intended to link multiple sites with multiple simultaneous video streams over a multicast infrastructure. vic can perform two basic functions:
    • take data from video capture cards in the PC to which cameras (or other video devices) are attached and send it over the network, and
    • receive data from the network and display it on a video monitor or on some other attached video device, such as a video projector.
  9. Note that vic may be run in such a way that it only receives video transmissions or only sends transmissions; it is not required to do both at the same time. For more information about vic see: http://www-mice.cs.ucl.ac.uk/multimedia/software/vic
  10. Robust Audio Tool (rat). rat is a recent version of the Visual Audio Tool, also developed by Steve McCanne and Van Jacobson at Lawrence Berkeley National Laboratory. rat allows multiple users to engage in an audio conference over the Internet in multicast mode. rat can perform two basic functions:
    • take data from the sound card in the PC to which microphones, headphones, or other audio devices are attached and send it over the network, and
    • receive data from the network and send it to speakers, headphones, or another attached sound processing device, such as a tape recorder.
  11. rat displays a list of connected participants and identifies who is speaking and who is listening at any given time. For more information about rat see http://www-mice.cs.ucl.ac.uk/multimedia/software/rat and the Access Grid web site.
  12. The Gentner AP400 echo canceller. Within the Access Grid model, signals from and to attached audio equipment are funneled through an "echo canceller" made by the Gentner Communications Corporation, to eliminate certain kinds of echoes produced during networked conferencing. It is probably fair to say that the Gentner echo canceller is the major component of the audio conferencing system. Networks of Gentners work together to provide useful audio signal exchanges.
  13. The Gentners can use four different connectivity infrastructures:
    • a point-to-point telephone connection,
    • a telephone connection to a telephone bridge,
    • a computer network, or
    • Gentner's own local area network, called G-link.
  When a Gentner uses a computer network to connect to other Gentners, it connects to the computer just as it would to a simple codec (compression/decompression device).
  14. The Distributed PowerPoint software. The Argonne Distributed PowerPoint software allows a single presenter at one node to control PowerPoint applications running on computer systems located at other Access Grid nodes. For example, a conference speaker can run PowerPoint along with the Distributed PowerPoint master software on her laptop computer at the podium of one of the AG sites. When the speaker changes slides, the master will notify the DPPT server, which will notify DPPT clients running on systems at other nodes, which will, in turn, direct their local PowerPoint programs to change slides.
  15. Note that this approach requires that some PowerPoint features be removed or disabled prior to presentation, because Distributed PowerPoint cannot deal with them. (See later discussions of VNC and "scan conversion" for alternatives.) The DPPT clients can operate on PowerPoint slidesets published on a Web server, or on local copies of the slidesets.
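The master-to-server-to-clients relay described above can be modeled in a few lines. This is an illustrative model of the relay pattern only, not the actual DPPT wire protocol; all class and method names are invented for the example:

```python
class SlideClient:
    """Model of a DPPT client: mirrors the presenter's current slide.
    A real client would drive the local PowerPoint copy instead of
    just recording an index."""
    def __init__(self, name: str):
        self.name = name
        self.current_slide = 1

    def on_slide_change(self, slide: int) -> None:
        self.current_slide = slide

class SlideServer:
    """Model of the DPPT server: relays the master's slide changes
    to every registered client node."""
    def __init__(self):
        self.clients = []

    def register(self, client: SlideClient) -> None:
        self.clients.append(client)

    def broadcast(self, slide: int) -> None:
        for client in self.clients:
            client.on_slide_change(slide)

# The master (on the podium laptop) notifies the server on each change:
server = SlideServer()
nodes = [SlideClient(f"node-{i}") for i in range(3)]
for node in nodes:
    server.register(node)
server.broadcast(7)  # speaker advances to slide 7; all nodes follow
```

Because only a slide number travels over the network, the traffic per slide change is tiny; the slideset itself must already be present at each node.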
  16. The MUD software. Operators at each site involved in an Access Grid conference typically keep in touch by using software originally developed for online "role-playing" games, generically called "Multi-User Dungeon" games, or "MUDs". (MUD functionality is similar to that of Internet Relay Chat operating with access control.) Argonne runs a MUD server for use by Access Grid operators, who run MUD clients on their desktop systems. tkMOO-lite is currently the recommended MUD client for this purpose, but others, such as TinyFugue in the Unix environment, can be used as well. tkMOO will run on both Windows and Linux systems, so it may be run on any of the AG component systems described below.
  17. The Multicast Beacon. To help diagnose multicast network problems during conferences, Argonne promotes the use of the NLANR multicast "Beacon" monitoring system, which includes three pieces of software:
    • a Beacon to be run at each AG node,
    • a server to collect transmission statistics from a collection of Beacons, and
    • a Beacon viewer that displays data collected by the server.
  18. The Beacon at each node connects to a multicast group, collects latency, loss, and packet misordering statistics from all other Beacons connected to that multicast group, and sends them to the Beacon server. The Beacon viewer displays these traffic statistics as a matrix showing traffic to and from each Beacon attached to the server. (There is also a web-based Beacon.) At KU the Beacon runs on the AG node's video capture system.
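The viewer's traffic matrix can be sketched as follows. The report format here is hypothetical and chosen for clarity; the real Beacon derives its statistics from the RTP traffic the Beacons exchange:

```python
def loss_matrix(reports):
    """Turn per-receiver loss reports into printable matrix rows.
    reports: {receiver: {sender: loss_percent}}.
    Each row lists one receiver's observed loss from every sender."""
    names = sorted(reports)
    rows = [["recv/send"] + names]
    for receiver in names:
        row = [receiver]
        for sender in names:
            if sender == receiver:
                row.append("-")  # no self-traffic cell
            else:
                row.append(f"{reports[receiver].get(sender, 100.0):.0f}%")
        rows.append(row)
    return rows

# Hypothetical reports from three nodes (site names are examples only):
reports = {
    "KU":   {"ANL": 0.0, "NCSA": 2.0},
    "ANL":  {"KU": 1.0, "NCSA": 0.0},
    "NCSA": {"KU": 5.0, "ANL": 0.0},
}
matrix = loss_matrix(reports)
```

An asymmetric matrix (loss in one direction but not the other) is exactly the kind of multicast routing problem the Beacon is meant to expose before a conference starts.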
  19. The Virtual Venue software. Coordinating multiple group conferences can be complicated. Argonne has developed a collection of web pages and Java applications that can simplify the process. The Virtual Venue is basically a web page that lets users select a "conference" to attend. In this context a "conference" is composed of:
    • a vic multicast address,
    • a rat multicast address, and
    • a MUD identifier.
  20. If your systems are Virtual Venue-enabled, the display system operator can click on a conference room name, and the vic, rat and MUD applications running on the video display, video capture and audio processing systems will all be started with target addresses and settings appropriate to the selected conference room. This coordination is accomplished by running an "event server" and the event controller on the display system, along with "event listeners" on the video capture and audio processing systems.
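A venue bundle and the launch commands the event listeners would issue might be modeled like this. The addresses, venue name, and channel name are placeholders; vic and rat do both accept a group/port argument on the command line:

```python
from dataclasses import dataclass

@dataclass
class VirtualVenue:
    """A venue bundles the addresses each node machine needs to join."""
    name: str
    video_addr: str   # vic multicast group/port
    audio_addr: str   # rat multicast group/port
    mud_channel: str  # MUD room where the site operators meet

    def launch_commands(self) -> dict:
        """Command lines for the media machines; the MUD client is
        pointed at mud_channel separately by the operator."""
        return {
            "video": f"vic {self.video_addr}",
            "audio": f"rat {self.audio_addr}",
        }

# Placeholder venue definition for illustration:
venue = VirtualVenue("Lecture Hall", "224.2.177.155/55524",
                     "224.2.177.155/16964", "lecture-hall")
cmds = venue.launch_commands()
```

Selecting a different venue simply swaps in a different address bundle, which is why one click can retarget all three machines at once.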
  21. Virtual Network Computing (VNC). VNC allows users to share monitor screens over the Internet in a variety of modes. In the Access Grid environment, VNC allows a speaker to share his/her podium laptop with Access Grid display systems, which can then project it at remote nodes. This is useful when a speaker wishes to give real-time demonstrations or present PowerPoint slides that include "fancy" features, such as animations, that cannot be displayed using Distributed PowerPoint. VNC employs a client-server architecture, and there are clients and servers available for Windows 98/NT/2000 and Unix operating systems.
  22. Although not part of the original Access Grid canon, VNC has been employed during several Access Grid conferences and shows promise for future applications. VNC eliminates the coordination effort required to display Distributed PowerPoint slide sets. (No files need to be downloaded ahead of time and no slide synchronization is required.) In general, update times are a function of the number of pixels changed and the number of remote viewers (as well as available bandwidth), so VNC will not be appropriate for all applications. Instructions for setting up a VNC relay are presented in Using Unix-based VNC to relay other VNC traffic.
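A back-of-the-envelope estimate illustrates why viewer count matters for unicast VNC. The function below ignores VNC's encodings and compression, so it is only an upper-bound sketch with assumed parameters (16-bit color, a full-screen update):

```python
def vnc_update_bytes(pixels_changed: int, viewers: int,
                     bytes_per_pixel: int = 2) -> int:
    """Rough unicast cost of one screen update: each changed region is
    re-sent to every viewer, so traffic grows with the audience size."""
    return pixels_changed * bytes_per_pixel * viewers

full_screen = 1024 * 768                               # one full refresh
unicast = vnc_update_bytes(full_screen, viewers=10)    # ten remote nodes
multicast = vnc_update_bytes(full_screen, viewers=1)   # one copy serves all
```

The unicast figure is ten times the multicast one, which is the intuition behind the parenthetical remark above: a multicast-capable VNC would send each update once regardless of how many nodes are watching.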
  23. Basic system configurations. The AG model uses a collection of commodity components to provide various services. To ensure optimal responsiveness, individual functions (video capture, video display, audio capture and presentation) are placed on separate computer systems. There are a variety of hardware and software configurations that can provide the required video conferencing functionality:
  24. This section shows one such configuration:
    • 1 computer system running Linux for audio capture and presentation,
    • 1 computer system running Linux for video capture,
    • 1 computer system running Windows 2000 for video display, through 1 or more video cards controlling one or more video projectors illuminating 1 or more screens,
    • 1 computer system running Windows 98 for controlling the audio echo canceller/mixer, and
    • 1 speaker's podium computer running Windows 2000 or NT to control remote PowerPoint displays and/or give real-time demonstrations using some Windows application.
  25. Audio capture and presentation computer. The audio capture computer:
    • converts analog audio from mixers and mics to digital form for transmission by rat over the multicast network, and
    • converts digital audio to analog audio for distribution to room speakers and/or headsets.
  Software:
    • RedHat Linux version 6.1
    • rat
    • AudioResourceManager from the Virtual Venue suite
  26. Video capture computer. The video capture computer system converts analog video from cameras and/or VCRs, etc. to digital form for transmission by vic over the multicast network. Software:
    • RedHat Linux version 6.1
    • stock kernel with the BTTV drivers
    • vic
    • VideoResourceMonitor from the Virtual Venue suite
  27. Video display computer. Receives video content over the network and displays it on the PC monitor as well as on one or more other monitors and/or video projectors if desirable (using the ability of Windows 2000 to display its console screen across multiple video cards). It also:
    • decodes video streams, and
    • runs collaboration applications such as Distributed PowerPoint and the VNC viewer.
  28. Software:
    • Windows 2000
    • vic
    • PowerPoint
    • Argonne Distributed PowerPoint client
    • EventServerMonitor and DisplayResourceManager from the Virtual Venue suite
    • VNC viewer
  29. Echo canceller control computer. The audio control computer runs Windows 98 and uses custom Gentner control software to control the Gentner mixer/echo canceller. See http://www.gentner.com for more details. Within the KU ACS node, this function is provided by a 133 MHz PC.
  30. Speaker's podium computer. The speaker's podium computer runs:
    • Windows 98/NT/2000,
    • PowerPoint,
    • the Argonne Distributed PowerPoint master software, and
    • the VNC server.
  Configuration suggested by Argonne: some laptop powerful enough to run PowerPoint.
  31. The KU ACS podium laptop is connected to a "scan converter" that can convert the VGA/SVGA signal generated by the laptop to the NTSC video expected by video capture cards. The CORIOscan Select from TVONE lists for around $495 and can be used to produce a reasonably high-resolution image (1280x860).
  32. Alternatives for displaying speaker slidesets. As mentioned earlier, the Access Grid provides several methods for displaying speaker slidesets.
    • Use Distributed PowerPoint. This is the "standard" method and provides high quality representation at every site with very little network traffic. Using DPPT means getting each slide set prior to use, stripping it of special PPT features, and publishing it on a Web server for distribution to each remote site. This approach may not work well if the speaker relies on special features (such as fancy animations) or launches other applications during the talk.
  33. Use a VNC server running on the podium laptop and a VNC relay (as discussed earlier). This approach provides high-quality video including simple animations and all PowerPoint features, but introduces some update delay and generates much more network traffic than the other alternatives. (If a version of VNC were produced that employed multicast for image distribution, network traffic would be significantly reduced.)
  34. Split the podium laptop video output, sending one channel to a local projector for the local audience and one to a scan converter and then to a video capture card for distribution over vic from the video capture machine. This will give excellent update speed both locally and remotely, but relatively poor image quality at remote sites. Text smaller than 20 points is usually not legible, but animations and videos present well (as long as high resolution is not necessary). This approach could be a very effective, general solution if vic could be used with a higher-quality codec than the usual H.261. An MPEG-1 codec is apparently under development and should provide a significant improvement.
  35. Use a commercial streaming video package. For example, during the Kansas issue of Alliance Chautauqua 2000, Cisco IPTV was employed to present full-motion animations at high resolution. However, setting up for IPTV broadcasts is complex and requires access to an IPTV server, so this alternative will not be available to all.
  36. Ancillary servers. You may need to run some of the ancillary servers mentioned earlier on separate computer systems. For example, you may need boxes to run a:
    • Distributed PowerPoint server,
    • VNC relay server,
    • MUD, and/or a
    • Virtual Venue server (should you wish to define your own Virtual Venues).
  37. Operators. You will need from 1 to 4 operators, depending on how you apportion duties, to run an Access Grid node. With one operator per basic function you will need an operator for:
    • video display,
    • video capture,
    • audio control, and
    • network monitoring.
  To some degree there is a trade-off between system costs and operator costs, and the staffing requirements will vary with the complexity of the presentations being offered at a site.
  38. Additional Info
    The Access Grid web site: http://www.fp.mcs.anl.gov/fl/accessgrid/
    For a more detailed version of this talk see: http://www.cc.ukans.edu/~acs/docs/access-grid-node/
    Acknowledgments: Some of the material for this page has been taken from the Argonne Labs web site listed above, or from documents provided via that site.
