Introduction

What is Virtual Reality?

Virtual Reality (VR) is a human-computer interface that simulates a realistic environment and allows users to interact with it. It presents 3-D images on a 2-D computer screen. It can be compared to looking through a glass-bottom boat: VR allows one to put on the scuba equipment and interact with the surroundings without getting wet.

Why is it used?

VR offers invaluable educational advantages. It is ideal for showing clients products that are otherwise too expensive to demonstrate physically. The 3-D sensory impact is unmistakable.

Virtual Visual Environment Display (VIVED)

Developed by the NASA Johnson Space Center (JSC)/LinCom Corporation, the University of Texas Medical Branch at Galveston, and the Galveston Independent School District, VIVED provides a unique educational experience using VR technologies. It integrates prescribed fly-throughs of the human skull and heart with other interactive multi-media (audio, video, etc.) capabilities. The sections below describe the steps required to create a Virtual Reality experience through the human body.
Hardware

A Silicon Graphics Reality Engine computer was used to turn Computerized Axial Tomography (CAT/CT) and Magnetic Resonance Imaging (MRI) slices into 3-D volumetric images and movies of the observer "flying through" the body.

The final 3-D images were viewed on a Macintosh IIcx computer with 16 MB of RAM. A Mac was chosen because it is relatively affordable, it has widespread use in school systems, it is the leading engine of desktop multi-media, and a wide variety of software and hardware is available for the task.

The VR movie can be stored on a hard drive or transferred onto video tape and viewed through red-blue glasses. It can also be viewed using a VR Head-Mounted Display (HMD) or a Binocular Omni Orientational Monitor (BOOM) system. The final images can be stored on CD-ROM or laser disc.

The high resolution of the body images prohibits the observer from "flying" through the human body at will; only the technology to create prescribed "fly-throughs" is currently available. The technology for a fully interactive virtual reality experience exists only for less data-intense applications, i.e. applications requiring less resolution.

Software

File Conversion and Data Preparation

The University of Texas Medical Branch at Galveston provided 1.5 mm thick CAT/CT slices of the human skull and MRI slices of the heart. These slices were used to create the images. The skull is held in place during the CT scan by a foam band, which created extraneous data. Scans of the skull produced a data set of over 120 slices through the skull and 60 slices through the mandible (jaw); an MRI scan of the human heart produced a data set containing 200 slices.

The data files created at the Medical Branch were then transferred to the JSC IGOAL (Integrated Graphics, Operations, and Analysis Laboratory). There the scans were cropped to eliminate as much extraneous data as possible without losing any critical information.
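The cropping step described above can be sketched as follows. This is a minimal pure-Python illustration, not the actual IGOAL tooling; the function name, the tiny sample slice, and the region bounds are all hypothetical.

```python
def crop_slice(slice2d, top, bottom, left, right):
    """Keep only the region of interest of one CT/MRI slice.

    slice2d is a 2-D list of intensity values; top/bottom and
    left/right are the row and column bounds of the region that
    actually contains the anatomy, so scanner-bed and foam-band
    data outside it are discarded.
    """
    return [row[left:right] for row in slice2d[top:bottom]]

# A tiny 4x4 "slice": the outer ring is extraneous data,
# the 2x2 centre is the anatomy we want to keep.
slice2d = [
    [9, 9, 9, 9],
    [9, 5, 6, 9],
    [9, 7, 8, 9],
    [9, 9, 9, 9],
]
cropped = crop_slice(slice2d, 1, 3, 1, 3)
print(cropped)  # [[5, 6], [7, 8]]
```

Cropping before any further processing shrinks every downstream data set, which matters here because the full scans run to hundreds of slices.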
IGOAL developed a tool called "ctimager" which used thresholding to remove unwanted noise and extraneous data from each slice.

Data Filtering and Conversion of Volume Data to Polygonal Data

The volume data was then converted, using a tool developed by IGOAL called "dispfly", into a form that can be displayed directly by the computer. This tool used multiple filtering algorithms to prepare the CT and MRI data for conversion to polygonal form. The anatomical models were generated based on the marching cubes algorithm.
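A thresholding pass in the spirit of "ctimager" might look like this. This is a sketch, not the actual tool; the threshold value and sample data are illustrative only.

```python
def threshold_slice(slice2d, threshold):
    """Zero out voxels below the threshold, treating them as noise.

    Values at or above the threshold (e.g. bone in a CT scan) are
    kept; everything else is set to 0 so the later conversion to
    polygonal form ignores it.
    """
    return [[v if v >= threshold else 0 for v in row] for row in slice2d]

slice2d = [
    [10, 80, 200],
    [40, 220, 90],
]
print(threshold_slice(slice2d, 100))
# [[0, 0, 200], [0, 220, 0]]
```

Because bone is much denser than the surrounding soft tissue and foam band, a single intensity cutoff removes most of the extraneous data in a CT slice.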
The filtering process typically consisted of thresholding the data to eliminate most of the noise. A low-pass filter was used to minimize the high-frequency noise that would otherwise produce an irregular, bumpy surface when input to the algorithm. This process produced a relatively smooth surface that approximated the scanned specimen and reduced the number of noise-generated polygons. A unique filter was created for the heart data which smoothed the data only between scans; no other filtering was needed.

Due to the large number of slices in both the heart and skull data sets, several models were made, each representing a small number of slices. A meshing algorithm, "meshit", was developed to improve display performance. This algorithm converted the raw collection of triangles into efficient strips, with an average of over 100 triangles per triangle strip.

Generating Stereo Images

Stereo sequences were rendered after the models were made. IGOAL developed a tool called OOM (Object Orientation Manipulator) which generated the sequences by rendering each frame to disk. The images used red and blue color separation to represent stereo. Once a sequence was recorded to disk it was converted to Macintosh PICT format and transferred to a Mac. Full-color image sequences were also transferred to the Mac for non-stereo viewing.

Stereo Images and Multi-Media

On the Mac the images were edited to produce a desired effect, such as digitized cadaver overlays or text inserts describing what is being viewed. Using Apple's QuickTime extension, the images were converted into QuickTime movies for animation on the Mac.

Result

The result of all this is a self-contained educational experience giving students a new method of learning as they interact with the subject matter through VR.
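The red-blue color separation described above can be sketched as follows. This is an illustrative anaglyph combiner, not IGOAL's OOM tool; putting the left-eye view in the red channel and the right-eye view in the blue channel is one common convention, assumed here.

```python
def make_anaglyph(left, right):
    """Combine two grayscale images (2-D lists of 0-255 values)
    into one red-blue stereo image of (R, G, B) pixel tuples.

    The left-eye image drives the red channel and the right-eye
    image drives the blue channel, so red-blue glasses deliver
    each rendered view to the matching eye.
    """
    return [
        [(l, 0, r) for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]

left = [[255, 0], [128, 64]]    # frame rendered from the left eye point
right = [[0, 255], [32, 200]]   # same frame from the right eye point
print(make_anaglyph(left, right))
# [[(255, 0, 0), (0, 0, 255)], [(128, 0, 32), (64, 0, 200)]]
```

Rendering each frame twice from slightly offset viewpoints and merging them this way is what makes the prescribed fly-throughs viewable in stereo with inexpensive red-blue glasses.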
A series of "fly-throughs" was created, allowing the observer to experience VR during a tour through the human heart or skull.

Applications and Current Research in VIVED

Current research emphasizes creating a high-resolution VR simulator of the human body for educational purposes. Applications for this technology include any area in which complex 3-D relationships must be understood, for example: anatomy education, education for mechanics of all types, education for chemistry students, pathology studies for surgeons, simulation of plastic and reconstructive surgery, and endoscopic training for surgeons.

Other Applications

The University of North Carolina at Chapel Hill has created "predictive" modeling of radiation treatment using dynamic images created by ultrasound, MRI and x-ray.
The Dartmouth Medical School in Hanover, N.H. has created computational models of the human face and lower extremities to examine the effects and outcomes of surgical procedures.

Greenleaf Medical Systems in Palo Alto, CA has created EVAL and the Glove Talker. EVAL uses sensor-lined data gloves and data suits to obtain range-of-motion and strength measurements of injured and disabled patients. The Glove Talker is a data-glove sign-language device used for rehabilitation which allows someone without voice (such as a stroke or cerebral palsy patient) to make gestures the computer understands. Using an HMD, the patient can relearn how to open a door, walk, point or turn around in space.

Conclusion

CT-scanned medical images of bone (i.e. the skull) can be turned into high-quality VR imaging for prescribed fly-throughs on the Macintosh computer using either an HMD or a BOOM system. A heart VR model generated from MRI data is being developed. Preliminary results have shown that a high-resolution model can be developed using this type of imaging data. To maintain the goal of high-quality VR imaging, some problems caused by the amount of data needed for frame-by-frame sequencing of the prescribed fly-throughs had to be overcome. Alternative hardware and software solutions are being explored to alleviate this problem.

Another problem has been the technology for the HMD display systems. The LCD displays do not have the resolution needed to maintain a high-quality VR experience.
The CRT displays are reaching the resolution needed; however, the cost is prohibitive for multiple education platforms. Surgical simulations may become routine, especially for rehearsing strategies for intricate and rare operations.