The Allosphere

Presentation of the Allosphere project at UCSB. Imagine a 3 story high sphere suspended in a cube where 3D video and audio are used for scientific discovery and exploration.

1. Allosphere@CNSI: Towards a Fully Immersive and Interactive Scientific Experience

2. allosphere @ CNSI (MAT/CNSI), in partnership with the California NanoSystems Institute. What is a Digital Media Center doing in a Nanosystems institute?

3. Description and Goals
   - A team of digital media researchers at UCSB has been fostering a cross-disciplinary field that unites science and engineering through the use of new media.
   - Allosphere = integration + availability to a larger community.

4. Allosphere Steering Committee
   - JoAnn Kuchera-Morin (Media Arts and Technology Initiatives)
   - Xavier Amatriain (Media Arts and Technology Initiatives)
   - Jim Blascovich (Psychology)
   - Forrest Brewer (Electrical and Computer Engineering)
   - Keith Clarke (Geography)
   - Steve Fisher (Life Sciences)
   - B.S. Manjunath (Electrical and Computer Engineering)
   - Marcos Novak (Media Arts and Technology/Arts)
   - Matthew Turk (Media Arts and Technology/Computer Science)
   - T.B.D. (California NanoSystems Institute)

5. Description and Goals
   - The Allosphere:
     - synthesis, manipulation, exploration and analysis of large-scale data sets;
     - an environment that can simulate virtually real sensory perception, providing multi-user immersive interactive interfaces;
     - research into scientific visualization, numerical simulations, data mining, visual/aural abstract data representations, knowledge discovery, systems integration and human perception.

6. The Building
   - The Allosphere and other labs are hosted in UCSB's California NanoSystems Institute (CNSI).

7. The Space
   - The space itself is already a part of the final instrument: a three-story anechoic sphere, ten meters in diameter, containing a built-in spherical screen.

8. The Features
   - Once equipped, the CNSI Allosphere will be one of the largest immersive instruments in the world.
   - Unique features: true 3D spherical projection of visual and aural data, plus sensing and camera tracking for interactivity.

9. Other MAT Labs at CNSI
   - The AlloSphere is situated at one corner of the CNSI building, surrounded by different media labs:
     - Visual Computing
     - Interactive Installation
     - Immersion/Eversion
     - Robotics
     - Plurilabs

10. Research in the Allosphere
11. Inherent Research
   - Inherent research comprises all of the activities that use the instrument as a research framework for immersive, multimodal environments.

12. Inherent Research: Interactivity
   - Sensor and camera tracking systems: research related to computer vision, as well as innovative interfaces and sensor networks that might be used to capture user interaction.

13. Inherent Research: Systems
   - System design and integrated software/hardware research: integration of the different hardware and software components at play.

14. Inherent Research: Visual
   - Immersive visual systems research: re-creation of an immersive visual space in a spherical environment.

15. Inherent Research: Audio
   - Immersive audio systems research: re-creation of a virtual 3D sound environment in which sources can be placed at arbitrary points in space with convincing synthesis, and which makes it possible to simulate the acoustics of real spaces.

16. Functional Research
   - Functional research includes those activities that will use the Allosphere as a tool for scientific exploration.

17. Functional Research: Knowledge
   - Multidimensional knowledge discovery: deals with issues such as high-dimensional feature descriptors, similarity metrics, and indexing; machine learning, image data mining and understanding.

18. Functional Research: Complex Systems
   - Analysis of complex structures and systems: constructing the next generation of engineering paradigms requires a mechanism for rapid simulation, visualization and exploration that supports phenomena at multiple physical and temporal scales.

19. Functional Research: Psychology
   - Human perception, behavior and cognition: a valuable instrument for behavioral scientists interested in the impact of virtual environments, large-scale visualization, or spatial hearing.

20. Functional Research: Cartography
   - Cartographic display and information visualization: offers remote sensing and geographic information science the opportunity to explore the potential of "inside-out" global data displays as tools for collective decision-making.

21. Functional Research: Artistic Visualization
   - Artistic scientific visualization/auralization: artistic principles are driving research into real-time interactivity and human manipulation of complex scientific data structures.

22. The Future of Entertainment
   - Most of the research in the Allosphere (functional and inherent) maps directly onto future forms of entertainment and edutainment.
   - We envision collaboration with the entertainment industry.
23. Prototype Projects in the Allosphere

24. Prototype-driven System
   - A state-of-the-art system: many open research questions still need to be addressed.
   - We want content to drive the system design.
   - For that reason we are prototyping the instrument with different projects/requirements.

25. The Allobrain
   - In collaboration with the UCLA Brain Imaging Institute, Marcos Novak and many MAT/CREATE students (view video).

26. Quantum Spin Precession
   - In collaboration with Prof. David Awschalom and the Spintronics lab: an audiovisual model for coherent electron spin precession in a quantum dot.

27. Multicenter Hydrogen Bond
   - With Anderson Genotti, materials researcher and discoverer of the multicenter hydrogen bond, and Prof. Van de Walle: visualization and multi-modal representation of unique atomic bonds for alternative fuel sources (view video).

28. NanoCAD in the Allosphere
   - In collaboration with BinanGroup's NanoCAD.

29. Alloproteins
   - In collaboration with the Chemistry/CS departments, using Chromium and VMD.

30. An Engineering Challenge
31. Innovation
   - The Allosphere is innovative with respect to existing environments such as the CAVE:
     - A spherical environment with 360 degrees of visual stereophonic information: spherical immersive systems enhance subjective feelings of immersion, naturalness, depth and "reality".
     - It is fully multimedia, combining the latest techniques in both virtual audio and visual data spatialization. Combined audio-visual information can aid understanding, yet most existing immersive environments focus on visual data alone.

32. Innovation
     - A completely interactive and multimodal environment, including camera tracking systems, audio recognition and sensor networks.
     - A pristine scientific instrument: e.g. the containing cube is a fully anechoic chamber, and details such as room modes and screen reflectivity have been studied.
     - Multiuser: its size allows up to 15 people to interact and collaborate on a common research task.

33. An Engineering Challenge

34. An Engineering Challenge: The Visual Subsystem
35. Overview
   - The Allosphere display can only be compared to high-end, state-of-the-art planetariums (the Gates Planetarium at the Denver Museum of Nature & Science, or the Griffith Observatory in LA).
   - Some AlloSphere requirements are considerably more demanding:
     - a variety of graphics types, including smaller-size text;
     - bright backgrounds and accurate color;
     - stereo projection;
     - excellent system flexibility and expandability.

36. Overview: The Visual Subsystem
   - Key design parameters:
     - display quality/performance;
     - mechanical/facilities constraints;
     - overall system architecture, configuration management, automation, calibration;
     - cost.
   - Secondary concerns:
     - aging;
     - maintenance;
     - upgrades;
     - acoustic performance (of the video equipment).

37. Display Brightness
   - What is required?
     - Eyestrain-free operation over a decent range of color values.
     - Brightness levels at or above the photopic threshold, for good contrast and color acuity.
     - High resolution.
     - Stereo/mono operation.
   - Given:
     - screen area: ~320 m²;
     - projector overlap factor: 1.7;
     - screen gain, direction-averaged: 0.12;
     - 14 projectors with a maximum of 3K lumens per projector.

38. Display Brightness
   - Simulation results: ~10 cd/m² of screen luminance per 42,000 lumens of total light input.
   - Recommendations: 0.7-5 cd/m² for multimedia domes; 50 cd/m² for cinema projection (SMPTE).
   - Conclusion: 42K lumens is good enough for most applications.
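The luminance figure above can be sanity-checked with the Lambertian screen model, L = E · g / π, where E is the illuminance delivered to the screen and g the direction-averaged gain. A rough sketch using the slide figures (treating the 1.7 overlap factor as multiplying the delivered flux is an assumption on our part):

```python
import math

# Figures from the slides
total_flux_lm = 42_000    # total projector light input, lumens
screen_area_m2 = 320      # spherical screen area
screen_gain = 0.12        # direction-averaged screen gain
overlap_factor = 1.7      # projector overlap (assumed to multiply delivered flux)

# Illuminance on the screen (lux = lumens per square meter)
illuminance = total_flux_lm * overlap_factor / screen_area_m2

# Lambertian screen: luminance = illuminance * gain / pi  (cd/m^2)
luminance = illuminance * screen_gain / math.pi

print(f"{luminance:.1f} cd/m^2")
```

This lands on the order of the ~10 cd/m² simulation result; the remaining gap is plausibly down to details the simulation captures and this one-liner does not.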
39. Display Brightness
   - Active stereo introduces more than a 50% loss in brightness, but:
     - 5 cd/m² is still at the high end of recommendations for domes;
     - active stereo brings a dramatic gain in subjective quality perception.
   - On the other hand, we cannot project much more than that because of back reflections and cross-reflections.

40. Resolution
   - "Eye-limiting resolution" is not feasible (right now): approx. 150M pixels would be required to achieve 30 lp/deg (1 arc minute) in all directions.
   - 11 lp/deg (3 arc minutes) is the recommended value for domes: 20M pixels, 14 projectors.
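The 150M and 20M pixel counts follow directly from the solid angle of the full sphere (about 41,253 square degrees) and the pixel density each target implies, with two pixels per line pair. A quick check:

```python
import math

# Full sphere in square degrees: 4*pi steradians * (180/pi)^2
sphere_deg2 = 4 * math.pi * (180 / math.pi) ** 2   # ~41,253 deg^2

def pixels_for(lp_per_deg):
    """Total pixels to cover the sphere at a given line-pair density
    (one line pair needs two pixels)."""
    px_per_deg = 2 * lp_per_deg
    return sphere_deg2 * px_per_deg ** 2

eye_limited = pixels_for(30)   # 1 arc-minute pixels -> ~148.5M, i.e. ~150M
dome_spec = pixels_for(11)     # 3 arc-minute recommendation -> ~20M
print(f"{eye_limited/1e6:.0f}M, {dome_spec/1e6:.1f}M")
```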
41. Image Warping and Blending
   - Design requirements relate to all aspects of system design.
     - Projector side:
       - best image quality, usually combined with color correction;
       - limited configuration;
       - lower cost and higher flexibility.
     - Dedicated hardware:
       - lower latencies;
       - DLP projectors are problematic due to the extra frame-buffer latency.
     - Custom software infrastructure?
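Whichever side blending lives on, the usual idea is complementary attenuation ramps across each overlap zone so the summed intensity stays constant. A minimal sketch of one common choice, a raised-cosine ramp (the specific ramp shape is an illustrative assumption, not the Allosphere's actual calibration):

```python
import math

def blend_weight(t):
    """Raised-cosine attenuation for one projector across the overlap
    region; t runs 0 -> 1 from this projector's edge inward."""
    return 0.5 - 0.5 * math.cos(math.pi * t)

# The neighboring projector uses the complementary ramp blend_weight(1 - t),
# so the two contributions always sum to full intensity across the overlap.
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    assert abs(blend_weight(t) + blend_weight(1 - t) - 1.0) < 1e-12
```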
42. An Engineering Challenge: The Audio Subsystem

43. Audio Requirements
   - "Ear-limited" audio rendering:
     - flat frequency response, 20 Hz - 22 kHz;
     - dynamic range: 120 dB;
     - SNR > 90 dB;
     - T60 < 0.75 s;
     - spatial accuracy: 3° on the horizontal axis, 10° in elevation.

44. Spatial Audio
   - Examples of spatial audio: stereo, surround...
   - Geometrical model-based spatialization:
     - a mono source + dynamic positioning;
     - three "standard" techniques: vector-based amplitude panning, Ambisonic spatialization, and wavefield synthesis.
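Of the three techniques, vector-based amplitude panning is the simplest to sketch: the gains for a speaker pair come from expressing the source direction in the basis of the two speaker directions, then normalizing for constant power. A 2D illustration (the speaker angles here are made up for the example):

```python
import math

def vbap_2d(source_deg, spk1_deg, spk2_deg):
    """Pairwise 2D VBAP: solve L g = p, where the columns of L are the
    speaker unit vectors and p is the source direction, then normalize."""
    p = (math.cos(math.radians(source_deg)), math.sin(math.radians(source_deg)))
    c1, s1 = math.cos(math.radians(spk1_deg)), math.sin(math.radians(spk1_deg))
    c2, s2 = math.cos(math.radians(spk2_deg)), math.sin(math.radians(spk2_deg))
    det = c1 * s2 - c2 * s1              # invert the 2x2 speaker matrix by hand
    g1 = (p[0] * s2 - p[1] * c2) / det
    g2 = (p[1] * c1 - p[0] * s1) / det
    norm = math.hypot(g1, g2)            # constant-power normalization
    return g1 / norm, g2 / norm

# A source aimed straight at speaker 1 puts all the gain on speaker 1
print(vbap_2d(30, 30, -30))  # ~(1.0, 0.0)
```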
45. Wavefield Synthesis
   - Huygens' principle of superposition: many closely spaced speakers create a coherent wavefront with an arbitrary source position.
   - 3D WFS has not yet been attempted because of its computational complexity (the 3D Kirchhoff-Helmholtz integral); Ambisonics can be used on the z axis instead.
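In its simplest form, the WFS driving signal for each speaker is the source signal delayed by the propagation time from the virtual source to that speaker and attenuated with distance. A toy sketch of just the delay/gain computation (the geometry and plain 1/r attenuation are illustrative assumptions; a real WFS operator also includes filtering and tapering terms):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def wfs_delays_gains(source_xy, speaker_positions):
    """Per-speaker (delay_s, amplitude) for a virtual point source behind
    the array: delay = distance / c, amplitude ~ 1 / distance."""
    out = []
    for spk in speaker_positions:
        d = math.dist(source_xy, spk)
        out.append((d / SPEED_OF_SOUND, 1.0 / d))
    return out

# Linear array of 8 speakers spaced 0.2 m apart; source 2 m behind center
speakers = [(0.2 * i - 0.7, 0.0) for i in range(8)]
params = wfs_delays_gains((0.0, -2.0), speakers)
# Speakers nearest the source fire first (smallest delay, largest amplitude)
```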
46. Spatial Techniques
   - All of these techniques present pros/cons and interesting research problems.
     - We already have a framework that can effectively combine them.
   - We expect spatial audio to be a huge success in the near future.
   - The number of speakers depends on the specific technique, but a reasonable spatial resolution requires ~500 speakers.
   - The driver technology (electrostatic, ribbon, tweeter array...) is also still under discussion.

47. Input Sensing & Multimodal HCI

48. Interactivity
   - A dynamic, user-driven environment: how do we best give users the ability to interact with data in effective, compelling, and natural ways?
   - Powerful techniques for navigation, selection, manipulation, and signaling.
   - Sense and perceive human movement, gesture, and speech via a network of sensors (cameras, microphones, haptic devices, etc.): multimodal interaction!

49. Computing Infrastructure

50. Integration
   - A typical multi-modal AlloSphere application will integrate services running on multiple hosts on the LAN, implementing a distributed system composed of:
     - input sensing (cameras, sensors, microphones);
     - gesture recognition/control mapping;
     - an interface to a remote (scientific, numerical, simulation, data-mining) application;
     - back-end processing (data/content access);
     - A/V rendering and projection management.

51. Integration
   - We still need software infrastructure to distribute the different graphics pipes from the generation engine to the render farm.
   - We must develop an ad-hoc visual generation engine and its interconnection with data streams.
   - Effort needs to go into building this intermediate integration/coordination layer by combining several specialized packages.
     - A cyberinfrastructure grant has been submitted (Hollerer, Wolski and Shea).
52. Video Generation Subsystem
   - Generating high resolution (1920x1200 @ 120 Hz) in active stereo requires high-end video cards.
   - Sample rendering farm for 14 stereo channels: 7 servers, each with one Quadro FX 5600.
   - Blending and warping are managed mostly on the projector side.
   - Sample video generation unit: a Linux render box with an NVIDIA Quadro FX 5600 (still to appear) feeding two Christie Mirage S2K+ projectors.
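Rendering locally and shipping only scene data over the LAN makes sense once you look at the raw pixel bandwidth: streaming finished frames would be far beyond a commodity network. A back-of-the-envelope check (24 bits per pixel assumed, blanking intervals ignored):

```python
# Per-channel video signal at the stated mode
width, height, refresh_hz, bits_per_pixel = 1920, 1200, 120, 24

per_projector_bps = width * height * refresh_hz * bits_per_pixel
total_bps = 14 * per_projector_bps  # 14 projection channels

print(f"{per_projector_bps/1e9:.1f} Gbps per projector")  # ~6.6 Gbps
print(f"{total_bps/1e9:.0f} Gbps total")                  # ~93 Gbps
```

At ~6.6 Gbps per channel even a single projector saturates the dual-link DVI class of interconnect of the era, which is why each render node drives its projectors directly.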
53. Video Distribution

54. Video Distribution

55. Audio Generation Subsystem
   - Problem: distribute 500+ channels of hi-fi audio to the speakers.
     - Distributed rendering:
       - ~1.3 Gbps (at 24-bit/96 kHz);
       - multichannel audio streaming over a network: Yamaha's mLan, Gibson's Global Information Carrier, Sony's Supermac...;
       - sample-synchronous output: Steve Butner's EtherSync;
       - a network interface box to be custom built.
     - Single render point:
       - develop custom DSP hardware;
       - harder signal distribution.
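The ~1.3 Gbps figure for distributed rendering is consistent with the raw sample data plus some framing overhead; the payload alone works out to:

```python
# 500 channels of 24-bit audio at 96 kHz
channels, sample_rate_hz, bits_per_sample = 500, 96_000, 24

payload_bps = channels * sample_rate_hz * bits_per_sample
print(f"{payload_bps/1e9:.2f} Gbps payload")  # ~1.15 Gbps before packet/framing overhead
```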
56. Audio Generation Subsystem

57. Audio Generation Subsystem
   - Synthesis/processing software: the audio team has extensive experience developing such software, with ready-to-use frameworks such as CLAM (Amatriain; ACM MM Best Open Source Software 2006) and CSL (Pope).

58. Open Research Areas and People
   - Graphics (Hollerer)
   - Audio (Amatriain)
     - Auralization (Roads)
     - 3D Audio (Pope)
   - Systems (Hollerer, Brewer, Butner, Pope, Amatriain)
   - Interactivity (Turk, Kuchera-Morin, Amatriain)
   - Experiential Signal Processing (Gibson)
   - HPC, Optimization (Wolski, Krintz)
   - Content Creation
     - Visual (Legrady)
     - Music (Kuchera-Morin)
     - VW (Novak)

59. Open Research Areas and People
   - Nanoscale systems representation (Oster, Garcia-Cervera)
   - Brain Imaging (Grafton)
   - Molecular Dynamics (Shea)
   - GIS (Clarke et al.)
   - Bio-imaging (Fisher, Manjunath)
   - Perception (Loomis, Beall...)

60. http://www.mat.ucsb.edu/allosphere THANKS!
