Virtual Reality Systems and Applications

Transcript

1.1 DEFINITION: Virtual reality (VR) is a technology which allows a user to interact with a computer-simulated environment, whether that environment is a simulation of the real world or an imaginary one. Most current virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays, but some simulations include additional sensory information, such as sound through speakers or headphones. Some advanced haptic systems now include tactile information, generally known as force feedback, in medical and gaming applications. Users can interact with a virtual environment or a virtual artifact (VA) either through standard input devices such as a keyboard and mouse, or through multimodal devices such as a wired glove, a boom arm, or an omnidirectional treadmill. The simulated environment can be similar to the real world, as in simulations for pilot or combat training, or it can differ significantly from reality, as in VR games. Virtual reality can be divided into: 1. the simulation of a real environment for training and education, and 2. the development of an imagined environment for a game or interactive story.

1.2 TERMS OF VIRTUAL REALITY:
• "Virtual" refers to its computer-generated existence; some prefer the term "cyber" to emphasize this.
• "Reality" is the more controversial term. Realism debates swirl around what levels of realistic detail are needed and affordable. Practitioners can choose types and amounts of reality varying from "objective" to "novel" and from specific to variable, or nonspecific. VR has five main components, which vary according to the instructional context requirements:
• dimensionality,
• motion or animation,
• interaction,
• viewpoint or frame of reference, and
• immersion, or embodiment, through enhanced multisensory experiences.

1.3 WHY IS VR USEFUL? VR technologies address a wide range of interaction and immersion capabilities. Interaction varies the learner's control during the VR experience. Immersion varies across first-, second-, and third-person experiences and across physical, perceptual, and psychological options.
The concept of virtual reality has been around for decades, even though the public only became aware of it in the early 1990s. In the mid-1950s, a cinematographer named Morton Heilig envisioned a theatre experience that would stimulate all of his audience's senses, drawing them into the stories more effectively. In 1960 he built a single-user console called the Sensorama that included a stereoscopic display, fans, odor emitters, stereo speakers and a moving chair. He also invented a head-mounted television display designed to let a user watch television in 3-D. Users were passive audiences for the films, but many of Heilig's concepts would find their way into the VR field.

Philco Corporation engineers developed the first HMD in 1961, called the Headsight. The helmet included a video screen and tracking system, which the engineers linked to a closed-circuit camera system. They intended the HMD for use in dangerous situations: a user could observe a real environment remotely, adjusting the camera angle by turning his head. Bell Laboratories used a similar HMD for helicopter pilots, linking HMDs to infrared cameras attached to the bottoms of helicopters so that pilots had a clear field of view while flying in the dark.

In 1965, a computer scientist named Ivan Sutherland envisioned what he called the "Ultimate Display." Using this display, a person could look into a virtual world that would appear as real as the physical world the user lived in. This vision has guided almost all development within the field of virtual reality. Sutherland's concept included:
• a virtual world that appears real to any observer, seen through an HMD and augmented through three-dimensional sound and tactile stimuli;
• a computer that maintains the world model in real time; and
• the ability for users to manipulate virtual objects in a realistic, intuitive way.
In 1966, Sutherland built an HMD that was tethered to a computer system; the computer provided all the graphics for the display.
Although it is difficult to categorize all VR systems, most configurations fall into three main categories, and each category can be ranked by the sense of immersion, or degree of presence, it provides. Immersion or presence can be regarded as how powerfully the attention of the user is focused on the task at hand. It is generally believed to be the product of several parameters, including level of interactivity, image complexity, stereoscopic view, field of regard and the update rate of the display. For example, providing a stereoscopic rather than monoscopic view of the virtual environment will increase the sense of immersion experienced by the user. It must be stressed that no one parameter is effective in isolation; the level of immersion achieved is due to the complex interaction of the many factors involved.

3.1 Window on World Systems (WoW)
• Some systems use a conventional computer monitor to display the visual world. This is sometimes called Desktop VR or a Window on a World (WoW). The concept traces its lineage back through the entire history of computer graphics. In 1965, Ivan Sutherland laid out a research program for computer graphics in a paper called "The Ultimate Display" that has driven the field for nearly thirty years.
• "One must look at a display screen," he said, "as a window through which one beholds a virtual world. The challenge to computer graphics is to make the picture in the window look real, sound real and the objects act real."
3.2 Telepresence
Telepresence is a variation on visualizing complete computer-generated worlds. This technology links remote sensors in the real world with the senses of a human operator. The remote sensors might be located on a robot, or they might be on the ends of waldo-like tools. Firefighters use remotely operated vehicles to handle some dangerous conditions. Surgeons are using very small instruments on cables to perform surgery without cutting a major hole in their patients; the instruments have a small video camera at the business end. Robots equipped with telepresence systems have already changed the way deep-sea and volcanic exploration is done. NASA plans to use telerobotics for space exploration, and there is currently a joint US/Russian project researching telepresence for space rover exploration.

3.3 Mixed Reality
• Merging the telepresence and virtual reality systems gives Mixed Reality, or Seamless Simulation, systems. Here the computer-generated inputs are merged with telepresence inputs and/or the user's view of the real world.
A surgeon's view during brain surgery can be overlaid with images from earlier CAT scans and real-time ultrasound. A fighter pilot sees computer-generated maps and data displays inside his helmet visor or on cockpit displays.
• The phrase "fish tank virtual reality" was used to describe a Canadian VR system reported in the 1993 InterCHI proceedings. It combines a stereoscopic monitor display using liquid crystal shutter glasses with a mechanical head tracker. The resulting system is superior to simple stereo-WoW systems because of the motion parallax effects introduced by the head tracker.

3.4 Immersive Systems
• The ultimate VR systems completely immerse the user's personal viewpoint inside the virtual world. These "immersive" VR systems are often equipped with a Head Mounted Display (HMD), a helmet or face mask that holds the visual and auditory displays. The helmet may be free-ranging, tethered, or attached to some sort of boom armature.
• A nice variation on the immersive systems uses multiple large projection displays to create a 'Cave' or room in which the viewer(s) stand. An early implementation was called "The Closet Cathedral" for its ability to create the impression of an immense environment within a small physical space.
The Holodeck used in the television series "Star Trek: The Next Generation" is a far-term extrapolation of this technology.

3.5 Non-Immersive Systems
• Non-immersive systems, as the name suggests, are the least immersive implementation of VR techniques. Using a desktop system, the virtual environment is viewed through a portal or window on a standard high-resolution monitor. Interaction with the virtual environment can occur by conventional means such as keyboards, mice and trackballs, or may be enhanced by using 3D interaction devices such as a SpaceBall or DataGlove.
• Non-immersive systems have the advantage that they do not require the highest level of graphics performance or any special hardware, and can be implemented on high-specification PC clones. This means they can be regarded as the lowest-cost VR solution, usable for many applications. The trade-off is that such systems will always be outperformed by more sophisticated implementations, provide almost no sense of immersion, and are limited to a certain extent by current 2D interaction devices.

3.6 Semi-Immersive Systems
Semi-immersive systems are a relatively new implementation of VR technology and borrow considerably from technologies developed in the flight simulation field. A semi-immersive system comprises a relatively high-performance graphics computing system coupled with one of:
• a large-screen monitor,
• a large-screen projector system, or
• multiple television projection systems.
In many ways these projection systems are similar to IMAX theatres. Using a wide field of view, they increase the feeling of immersion or presence experienced by the user.
However, the quality of the projected image is an important consideration. The geometry of the projected image must be calibrated to the shape of the screen to prevent distortions, and the resolution determines the quality of textures and colors, the ability to define shapes, and the ability of the user to read text on screen. The resolutions of projection systems range from 1000 to 3000 lines, but to achieve the highest levels it may be necessary to use multiple projection systems, which are more expensive.

3.7 Fully Immersive Systems
The most direct experience of virtual environments is provided by fully immersive VR systems. These are probably the most widely known VR implementation, where the user either wears an HMD or uses some form of head-coupled display such as a Binocular Omni-Orientation Monitor (BOOM). Fully immersive VR systems tend to be the most demanding in terms of the computing power and level of technology required to achieve a satisfactory level of realism, and development is constantly underway to improve the technologies. Major areas of research and development include field-of-view versus resolution trade-offs, reducing the size and weight of HMDs, and reducing system lag times.
A good comparison between the various VR implementations is shown below. These implementations should not be regarded as having distinct boundaries; for example, it is possible to turn a desktop system into a semi-immersive system by simply adding shutter glasses and the appropriate software, or into a fully immersive system by connecting an HMD.

Qualitative performance of the main VR implementations:

Main Features                        Non-Immersive VR   Semi-Immersive VR   Fully Immersive VR
                                     (Desktop)          (Projection)        (Head-coupled)
Resolution                           High               High                Low - Medium
Scale (perception)                   Low                Medium - High       High
Situational awareness (navigation)   Low                Medium              High
Field of regard                      Low                Medium              High
Lag                                  Low                Low                 Medium - High
Sense of immersion                   None - Low         Medium - High       Medium - High

There are a number of specialized types of hardware devices that have been developed or used for virtual reality applications.
5.1 Head Mounted Display (HMD)
• One hardware device closely associated with VR is the Head Mounted Display (HMD). These use some sort of helmet or goggles to place small video displays in front of each eye, with special optics to focus and stretch the perceived field of view. Most HMDs use two displays and can provide stereoscopic imaging. Others use a single larger display to provide higher resolution, but without stereoscopic vision.
• An HMD uses small monitors placed in front of each eye which can provide stereo, bi-ocular or monocular images. Stereo images are provided in a similar way to shutter glasses, in that a slightly different image is presented to each eye. The major difference is that the two screens are placed very close (50-70 mm) to the eyes, although the image the wearer focuses on will appear much further away because of the HMD's optical system. Bi-ocular images can be provided by displaying identical images on each screen, and monocular images by using only one display screen.

5.2 Binocular Omni-Orientation Monitor
• The human brain perceives depth largely because it has two eyes for visual input. Each eye sees a slightly different angle of the same scene, and these two separate views are combined in the brain to form a single 3D image, with the differences between them used to work out relative distances.
• To replicate this effect in VR, you require a device that can do the same thing: give each eye a separate view. Enter the BOOM, or Binocular Omni-Orientation Monitor.
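As a concrete illustration of how a stereo display gives each eye its own view, the sketch below derives two eye positions from a single tracked head pose by offsetting each eye half the interpupillary distance along the head's local "right" axis. This is a minimal, generic sketch: the structure names and the 65 mm interpupillary distance are illustrative assumptions, not taken from any particular HMD or BOOM SDK.

```cpp
// Illustrative sketch only: derive left/right eye positions for a stereo
// display from a single tracked head pose. Names and the 0.065 m IPD are
// assumptions, not taken from any specific HMD SDK.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

struct HeadPose {
    Vec3  position;   // tracked head position in metres
    float yaw;        // rotation about the vertical axis, in radians
};

// Offset each eye half the interpupillary distance along the head's local
// "right" axis, so each eye sees the scene from a slightly different angle
// (the basis of stereoscopic depth).
void eyePositions(const HeadPose& head, float ipd, Vec3& left, Vec3& right) {
    Vec3 rightAxis { std::cos(head.yaw), 0.0f, -std::sin(head.yaw) };
    float half = ipd * 0.5f;
    left  = { head.position.x - rightAxis.x * half,
              head.position.y,
              head.position.z - rightAxis.z * half };
    right = { head.position.x + rightAxis.x * half,
              head.position.y,
              head.position.z + rightAxis.z * half };
}

int main() {
    HeadPose head { {0.0f, 1.7f, 0.0f}, 0.0f };  // standing user, facing -Z
    Vec3 l, r;
    eyePositions(head, 0.065f, l, r);            // ~65 mm IPD, a typical value
    std::printf("left eye  (%.3f, %.3f, %.3f)\n", l.x, l.y, l.z);
    std::printf("right eye (%.3f, %.3f, %.3f)\n", r.x, r.y, r.z);
    return 0;
}
```

Rendering the scene once from each of these two positions, to its own screen, is what produces the slightly different per-eye images described above.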
The Binocular Omni-Orientation Monitor, or BOOM, is one of the older VR displays and a close relative of the HMD. It consists of a 3-D display device suspended from a counterweighted boom that can swivel freely; sometimes this boom is mounted on a trolley, sometimes affixed to the ceiling.
• BOOMs typically communicate the user's point of view to the computer system via the position and orientation of the viewport. Typical BOOM configurations swivel in six degrees of freedom, moving up and down and swiveling on the boom as well as rotating about an axis point, to closely replicate head movements without being attached to the head.

5.3 Cave Automatic Virtual Environment
• A Cave Automatic Virtual Environment (CAVE) is an immersive virtual reality environment where projectors are directed at three, four, five or six of the walls of a room-sized cube. The name is also a reference to the allegory of the cave in Plato's Republic, in which a philosopher contemplates perception, reality and illusion.
• The CAVE is a 10' x 10' x 9' theatre that sits in a larger room measuring around 35' x 25' x 13'. The walls of the CAVE are made up of rear-projection screens, and the floor is a down-projection screen. High-resolution projectors (the University of Illinois uses an Electrohome Marquee 8000) display images on each of the screens by projecting the images onto mirrors, which reflect them onto the projection screens.

5.4 Data Gloves
One common VR device is the instrumented glove. The use of a glove to manipulate objects in a computer is covered by a basic patent in the USA. Such a glove is outfitted with sensors on the fingers as well as an overall position/orientation tracker; a number of different types of sensors can be used. This device is easily adapted to interface to a personal computer.
It provides limited hand location and finger position data, using strain gauges for finger bends and ultrasonic position sensors.

5.5 Control Devices
One key element for interaction with a virtual world is a means of tracking the position of a real-world object, such as a head or hand. There are numerous methods for position tracking and control. Ideally a technology should provide three measures of position (X, Y, Z) and three measures of orientation (roll, pitch, and yaw). One of the biggest problems for position tracking is latency: the time required to make the measurements and preprocess them before input to the simulation engine.
The simplest control hardware is a conventional mouse, trackball or joystick. While these are two-dimensional devices, creative programming can use them for 6D controls. A number of 3- and 6-dimensional mice, trackballs and joysticks are being introduced to the market at this time. These add extra buttons and wheels that control not just the XY translation of a cursor but also its Z dimension and rotations about all three axes. The Global Devices 6D Controller is one such 6D joystick; it looks like a racquetball mounted on a short stick, and you can pull and twist the ball in addition to the left/right and forward/back movement of a normal joystick. Other 3D and 6D mice, joysticks and force balls are available from Logitech and Mouse Systems Corp., among others.
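To make the six measures and the latency concern concrete, here is a small sketch of the kind of record a tracker might hand to the rest of the system: X, Y, Z position, roll/pitch/yaw orientation, and a capture timestamp from which latency can be estimated. The structure and field names are illustrative assumptions, not the API of any real tracking device.

```cpp
// Illustrative sketch: the six measures a tracker should ideally report
// (X, Y, Z position plus roll, pitch, yaw orientation), with a capture
// timestamp so the application can estimate latency. Names are
// assumptions, not a real device API.
#include <chrono>
#include <cstdio>

struct TrackerSample {
    double x, y, z;              // position in metres
    double roll, pitch, yaw;     // orientation in radians
    std::chrono::steady_clock::time_point captured;
};

// Latency here means the time between capturing the measurement and the
// moment the simulation engine consumes it.
double latencyMs(const TrackerSample& s) {
    using namespace std::chrono;
    return duration<double, std::milli>(steady_clock::now() - s.captured).count();
}

int main() {
    TrackerSample head { 0.0, 1.7, 0.2, 0.0, 0.1, 0.0,
                         std::chrono::steady_clock::now() };
    std::printf("head yaw %.2f rad, measured %.3f ms ago\n",
                head.yaw, latencyMs(head));
    return 0;
}
```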
There are two major categories of VR software: toolkits and authoring systems. Toolkits are programming libraries, generally for C or C++, that provide a set of functions with which a skilled programmer can create VR applications (a sketch of what a toolkit-style program looks like appears at the end of this section). Authoring systems are complete programs with graphical interfaces for creating worlds without resorting to detailed programming. They usually include some sort of scripting language in which to describe complex actions, so they are not really non-programming, just much simpler programming. The programming libraries are generally more flexible and have faster renderers than the authoring systems, but you must be a skilled programmer to use them.

6.1 Multiverse
• Multiverse is a freeware UNIX-based client/server system written by Robert Grant. It is a multi-user, non-immersive, X-Windows-based virtual reality system, primarily focused on entertainment and research. It includes capabilities for setting up multi-person worlds and client/server world simulation over a local or long-haul network. Multiverse source and binaries for several flavors of UNIX are available via anonymous ftp from medg.lcs.mit.edu in the directory pub/multiverse.

6.2 Virtual Reality Studio
• Virtual Reality Studio (or VR Studio, VRS) is a very low-cost VR authoring system that allows the user to define their own virtual worlds. The program is also known as "3D Construction Kit" in Europe. It has a fairly nice graphical interface, includes a simple scripting language, and is available for about $100 from Domark for PC and Amiga systems. Worlds created with the program can be freely distributed with a player program. Quite a number of these worlds are available from BBSes and other sources; CompuServe's Cyber forum has several in its libraries, such as the company-provided demo VRSDMO.ZIP (VRS.TXT gives a solution to the demo game). Version 2 of VR Studio was released in early 1993. It has many new features, including a much-enhanced scripting language and editor, but also an annoying number of bugs, which the developers of VRS (Dimension International) are working hard to correct.
6.3 Sense8 WorldToolKit
• Sense8 has announced a $795 programming library for Windows called WorldToolKit for Windows, to be released late in 1993 as a DLL for Windows systems. It will work directly with standard SVGA displays and show worlds with texture mapping, either within a window or full screen. The programming library will support DDE, so a virtual world can be controlled from a spreadsheet, database or other program.
• The Sense8 WorldToolKit (WTK) is probably the most widely used product of this type. It runs on a wide variety of platforms, from i860-assisted PCs to high-end SGI boxes, and has won several awards for excellence.

6.4 Autodesk Cyberspace Development Kit
The Autodesk Cyberspace Development Kit is another product in this range. It is a C++ library for MS-DOS systems using the MetaWare High C/C++ compiler and the Phar Lap 32-bit DOS extender. It supports VESA displays as well as several rendering accelerator boards (SPEA Fireboard, FVS Sapphire, Division's dView). I used this system for a few months and found that it requires a strong background in C++ and a rendering accelerator card; VESA speeds were about 4 frames per second.
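To illustrate the toolkit/authoring distinction mentioned above, the sketch below shows roughly what a toolkit-style program looks like: the programmer builds the world with library calls and writes the simulate/render loop explicitly, whereas an authoring system would hide all of this behind a graphical interface and scripts. The "vrkit" API here is entirely hypothetical and invented for illustration; it is not the actual WorldToolKit or Cyberspace Development Kit interface.

```cpp
// Hypothetical toolkit-style program. The "vrkit" API below is invented
// for illustration only; it is NOT the real WorldToolKit or Cyberspace
// Development Kit interface.
#include <string>
#include <vector>

namespace vrkit {                          // assumed, illustrative namespace
struct Object { std::string meshFile; float x = 0, y = 0, z = 0; };

class World {
public:
    void add(const Object& o) { objects_.push_back(o); }
    void step(float dt) { (void)dt; /* advance scripts, physics, etc. */ }
    void render() { /* draw all objects for the current viewpoint */ }
    bool running() const { return frame_++ < 1000; }   // stand-in exit condition
private:
    std::vector<Object> objects_;
    mutable long frame_ = 0;
};
} // namespace vrkit

int main() {
    vrkit::World world;
    world.add({"room.obj", 0, 0, 0});      // hypothetical model files
    world.add({"chair.obj", 1, 0, -2});

    // The toolkit user writes the main loop themselves:
    while (world.running()) {
        world.step(1.0f / 30.0f);          // simulate one frame at ~30 Hz
        world.render();
    }
    return 0;
}
```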
The basic parts of a VR system can be broken down into an Input Processor, a Simulation Processor, a Rendering Processor, and a World Database. All of these parts must consider the time required for processing: every delay in response time degrades the feeling of 'presence' and the realism of the simulation.

7.1 Input Processor
The input processes of a VR program control the devices used to input information to the computer. There are a wide variety of possible input devices: keyboard, mouse, trackball, joystick, 3D and 6D position trackers (glove, wand, head tracker, body suit, etc.). A networked VR system would add inputs received over the network, and a voice recognition system is a good augmentation for VR, especially if the user's hands are being used for other tasks. Generally, the input processing of a VR system is kept simple; the object is to get the coordinate data to the rest of the system with minimal lag time. Some position sensor systems add filtering and data-smoothing processing. Some glove systems add gesture recognition: this processing step examines the glove inputs and determines when a specific gesture has been made, and can thus provide a higher level of input to the simulation.
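A small sketch of the kind of input processing just described: raw finger-bend readings from a glove are lightly smoothed to reduce sensor jitter, then checked against a threshold to recognise a simple "fist" gesture that can be passed to the simulation as a higher-level event. The structure, thresholds and smoothing factor are illustrative assumptions, not a real glove driver.

```cpp
// Illustrative input-processing step for a glove device: smooth the raw
// finger-bend readings and recognise a simple "fist" gesture. Thresholds
// and field names are assumptions, not a real glove driver.
#include <array>
#include <cstdio>

struct GloveFrame {
    std::array<float, 5> bend;   // per-finger bend, 0.0 = straight, 1.0 = fully bent
};

// Exponential smoothing: a cheap filter that reduces sensor jitter while
// adding very little lag (alpha close to 1 favours the newest reading).
GloveFrame smooth(const GloveFrame& prev, const GloveFrame& raw, float alpha) {
    GloveFrame out;
    for (std::size_t i = 0; i < out.bend.size(); ++i)
        out.bend[i] = alpha * raw.bend[i] + (1.0f - alpha) * prev.bend[i];
    return out;
}

// A gesture recogniser turns low-level bend values into a higher-level
// event for the simulation (here: all fingers bent past 0.8 = "fist").
bool isFist(const GloveFrame& f) {
    for (float b : f.bend)
        if (b < 0.8f) return false;
    return true;
}

int main() {
    GloveFrame previous {{0.7f, 0.7f, 0.7f, 0.7f, 0.7f}};
    GloveFrame raw      {{0.90f, 0.95f, 0.92f, 0.88f, 0.91f}};

    GloveFrame filtered = smooth(previous, raw, 0.8f);
    std::printf("fist gesture: %s\n", isFist(filtered) ? "yes" : "no");
    return 0;
}
```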
7.2 Simulation Processor
• The core of a VR program is the simulation system. This is the process that knows about the objects and the various inputs. It handles the interactions, the scripted object actions, and simulations of physical laws (real or imaginary), and determines the world status. The simulation is basically a discrete process that is iterated once for each time step or frame. A networked VR application may have multiple simulations running on different machines, each with a different time step; coordinating these can be a complex task.
• It is the simulation engine that takes the user inputs, along with any tasks programmed into the world such as collision detection and scripts, and determines the actions that will take place in the virtual world.

7.3 Rendering Processor
The rendering processes of a VR program are those that create the sensations output to the user. A networked VR program would also output data to other network processes. There are separate rendering processes for the visual, auditory, haptic (touch/force), and other sensory systems. Each renderer takes a description of the world state from the simulation process, or derives it directly from the World Database, for each time step.

7.4 World Database
• The storage of information on objects and the world is a major part of the design of a VR system. The primary things stored in the World Database are the objects that inhabit the world, scripts that describe actions of those objects or of the user, lighting, program controls, and hardware device support.
• There are a number of different ways the world information may be stored: a single file, a collection of files, or a database. The multiple-file method is one of the more common approaches for VR development packages: each object has one or more files (geometry, scripts, etc.) and there is some overall 'world' file that causes the other files to be loaded. Some systems also include a configuration file that defines the hardware interface connections.
• Sometimes the entire database is loaded during program startup; other systems read only the currently needed files. A real database system helps tremendously with the latter approach. An object-oriented database would be a great fit for a VR system, but I am not aware of any projects currently using one. A simplified sketch of how these four parts fit together in a frame loop follows.
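The sketch below ties the four parts together in a single frame-by-frame loop: the input processor produces coordinate data, the simulation iterates one discrete time step over the contents of the world database, and the per-sense renderers then consume the resulting world state. All names are illustrative assumptions rather than the API of any particular VR system.

```cpp
// Illustrative frame loop tying the four parts together: the input
// processor produces coordinate data, the simulation iterates one discrete
// time step, and each renderer consumes the resulting world state, which
// lives in the world database. All names here are assumptions for
// illustration, not a specific VR system's API.
#include <vector>

struct InputState { float headX = 0, headY = 0, headZ = 0; bool grab = false; };

struct WorldDatabase {
    // Objects, scripts, lights, etc. would be loaded here from the
    // world/object files described above.
    std::vector<int> objectIds;
};

InputState readInputs() { return {}; }                       // input processor

void simulate(WorldDatabase& world, const InputState& in, float dt) {
    // Apply user input, scripted actions, collision detection, physics...
    (void)world; (void)in; (void)dt;
}

void renderVisual(const WorldDatabase& world) { (void)world; }  // one renderer
void renderAudio(const WorldDatabase& world)  { (void)world; }  // per sense

int main() {
    WorldDatabase world;           // loaded at startup, or file-by-file on demand
    const float dt = 1.0f / 30.0f;

    for (int frame = 0; frame < 1000; ++frame) {   // one iteration per time step
        InputState in = readInputs();
        simulate(world, in, dt);
        renderVisual(world);
        renderAudio(world);
    }
    return 0;
}
```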
Virtual reality is well known for its use in flight simulators and games; however, these are only two of the many ways virtual reality is being used today. This section summarizes how virtual reality is used in medicine, architecture, weather simulation, and chemistry.

1. MEDICINE:
• Mark Billinghurst, at the HIT Lab in Washington, has developed a prototype surgical assistant for simulation of paranasal surgery. During a simulated operation the system provides vocal and visual feedback to the user, and warns the surgeon when a dangerous action is about to take place. In addition to training, the expert assistant can be used during the actual operation to provide feedback and guidance. This is very useful when the surgeon's awareness of the situation is limited due to complex anatomy.
• Billinghurst and his associates are also working on a toolkit for physicians that will help them create their own expert assistants for other types of surgery.

2. ARCHITECTURE:
• The department of visualization and virtual reality at the IGD in Germany has developed a program that uses radiosity and ray tracing to simulate light. This virtual reality program has applications in architecture and lighting engineering.
• With light simulation, architects can examine how outdoor light will fall inside and outside their building before it is built. If the lighting needs to be redesigned, the architect can redesign the building on the computer and examine the new outdoor light effects.
• In addition to outdoor light, lighting engineers use virtual reality to examine the effects of point lights, spotlights and other indoor light sources. An interior designer could examine how light will affect different room arrangements.

3. WEATHER SIMULATION:
The same group has developed a visualization system for weather forecasting called "TriVis". TriVis accepts data from meteorological services, such as satellite data, statistically corrected forecast data, precipitation data and fronts information. It then analyzes this data and uses fractal functions to create projections of storm systems. Using TriVis to visualize artificial clouds, meteorologists can predict weather with increased accuracy.
The data gathered and analyzed by the TriVis system is also used by television weather reporters to show their audiences storm systems; TriVis has been used in television weather forecasts since 1993.

4. CHEMISTRY:
• RealMol is a program that uses virtual reality to show molecular models in an interactive, immersive environment. The scientist who uses the program wears a cyberglove and a head-mounted display to interact with the molecular system. Using RealMol, scientists can move molecules or protein chains to create new molecules, which is useful in fields such as drug design.
• RealMol displays molecules in three ways: ball-and-stick model, stick model and CPK model. The molecules are animated through a molecular dynamics simulation program.

8.1 ADVANTAGES:
• VR is arguably more personal than electronic mail or instant messaging, or even a letter or a telephone call.
• VR is a great social leveler; it may find common ground across differences in age, culture, and linguistic orientation.
• People will be drawn together by similar interests instead of purely by geographic location.
• Communication will be challenging and rewarding, more effective and productive, and thus more enjoyable.
• There is a tremendous opportunity for every 'connected' person to find his or her field and/or discipline.
• After using a medium that provides total freedom of expression, face-to-face communication may come to feel too confining.

8.2 DISADVANTAGES:
• An inescapable aspect of social life is the formation and maintenance of interpersonal relationships.
• Mediated interaction ought not to be substituted for community.
• VR separates the 'haves' from the 'have-nots', being a technology of Information-Age industrialized nations.
• VR will provide a communication environment in which the dangers of deception and the benefits of creativity are amplified beyond the levels that humans currently experience in their interpersonal interactions.
• It could lead to low self-esteem, feelings of worthlessness and insignificance, and even self-destructive acts.