Mapping - Reality and Virtual Reality (Strictly No AR!!)


A small project done at the Virtual Reality Lab, CPDM, IISc. The presentation describes an approach to mapping reality and virtual reality using electromagnetic position trackers and a 3D head-mounted display. The project is a purely virtual-reality implementation and does not depend on camera-based augmented-reality techniques.

Transcript

  • 1. Mapping Physical and Virtual Worlds for Better Immersion. Nitesh Bhatia | CPDM, IISc. 2 March 2012
  • 2. FUSION: Real-Virtual World. Reality - Virtual Reality: the physical world is what we perceive through our eyes; the virtual world is what we perceive through the HMD.
  • 3. Field of View (FoV) - Eye vs HMD. [Diagram comparing eye and HMD fields of view: the eye spans roughly 120º while the HMD spans about 40º. Reference: Accessibility for the Disabled - A Design Manual for a Barrier Free Environment.]
  • 4. Field of View (FoV) - Eye vs HMD. THX HDTV setup guidelines: 40º FoV. Differentiable areas of the eye's FoV - things that can be done: ✓ identify text written at a far distance, ✓ identify shapes at a medium distance, ✓ identify color at a near distance.
  • 5. Challenges [1]. Due to the limited FoV of the HMD, we expect to face challenges in the following tasks: identification of text and shapes at a near distance; identification of color and contrast at a far distance.
  • 6. Variable FoV of the Eye. [Diagram: the eye's FoV varies between wide and narrow depending on the eye look-at point.]
  • 7. OpenGL Camera. The OpenGL camera requires the FoV and the zNear/zFar clipping distances explicitly, as in the sketch below.
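
The slides do not show the camera code itself, so the following is a minimal sketch of what "FoV and zNear-zFar given explicitly" means in practice: it builds, by hand, the same projection matrix that the GLU call gluPerspective(fovY, aspect, zNear, zFar) produces. The 40º FoV matches the HMD limit discussed later; the aspect ratio and clip distances are illustrative placeholders, not values from the project.

#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// Builds the column-major projection matrix that gluPerspective(fovY, aspect,
// zNear, zFar) would produce, so the role of each camera parameter is explicit.
void perspective(double fovYDeg, double aspect, double zNear, double zFar,
                 double m[16]) {
    double f = 1.0 / std::tan(fovYDeg * kPi / 360.0);   // cot(fovY / 2)
    for (int i = 0; i < 16; ++i) m[i] = 0.0;
    m[0]  = f / aspect;
    m[5]  = f;
    m[10] = (zFar + zNear) / (zNear - zFar);
    m[11] = -1.0;
    m[14] = 2.0 * zFar * zNear / (zNear - zFar);
}

int main() {
    double m[16];
    // 40-deg vertical FoV (the HMD limit discussed on later slides); aspect
    // ratio and clip planes are illustrative placeholders.
    perspective(40.0, 4.0 / 3.0, 0.1, 500.0, m);
    for (int r = 0; r < 4; ++r)
        std::printf("%9.4f %9.4f %9.4f %9.4f\n", m[r], m[4 + r], m[8 + r], m[12 + r]);
    return 0;
}
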
  • 8. Challenges [2]. The human eye's FoV (viewport) varies according to the point where the eye is looking. In graphics, the camera viewport is fixed, since the point where the eye is looking is not known. Challenges are therefore expected in mapping the real world to the virtual world because of this dynamic FoV/viewport.
  • 9. Setup. Design a tabletop environment for simple interaction tasks that map the physical world to the virtual world. A purely geometric approach for colocation of the real and virtual worlds - no Augmented Reality! Using an nVis SX60 HMD, Polhemus trackers, a 100 cm by 80 cm table - and vector algebra!
  • 10. The Problem of Colocation. Physical world (tracker world): gives the position and orientation of the table and the head; each tracker receiver has its own frame of reference. Virtual world (OpenGL world): the graphics scene has its own frame of reference, and the position and orientation of the camera w.r.t. the head must be identified.
  • 11. Table Frame to GL Frame. Coordinates obtained from the table receiver are converted to OpenGL world coordinates via the Tab2GL transform. [Diagram: receiver frame and GL frame, each with x, y, z axes and origin (0,0,0).]
  • 12. Table Frame to GL Frame (Tab2GL). [Diagram: an example point at (74, 94, -20) in the receiver frame.]
  • 13. Table Frame to GL Frame (Tab2GL). [Diagram: the same point expressed as (-94, 20, 74) in the GL frame; a sketch of the mapping follows.]
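
The exact Tab2GL transform is not given on the slides; the sketch below assumes it is the axis remapping implied by the example point on slides 12-13 (receiver (74, 94, -20) maps to GL (-94, 20, 74)), i.e. GLx = -Ry, GLy = -Rz, GLz = Rx. The real project transform may additionally include a calibrated translation.

#include <cstdio>

struct Vec3 { double x, y, z; };

// Tab2GL as an axis remapping consistent with the example point on the slides.
Vec3 tab2gl(const Vec3& r) {
    // GL.x = -receiver.y, GL.y = -receiver.z, GL.z = receiver.x
    return { -r.y, -r.z, r.x };
}

int main() {
    Vec3 p = tab2gl({74.0, 94.0, -20.0});
    std::printf("GL frame: (%.0f, %.0f, %.0f)\n", p.x, p.y, p.z);   // prints (-94, 20, 74)
    return 0;
}
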
  • 14. Head Tracker to Table (Head2Tab). To nullify the effect of the head tracker's mis-orientation, align the head-tracker frame with the table frame. [Diagram: table frame with x, y, z axes and origin; head-tracker frame.]
  • 15. Head Tracker to L/R Camera (Eye). [Diagram: head-tracker frame.]
  • 16. Head Tracker to L/R Camera (Eye). [Diagram: table frame.]
  • 17. Head Tracker to L/R Camera (Eye). [Diagram: left camera frame, with the Head2LCam and Head2RCam transforms; see the composition sketch below.]
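
One way to read the frame chain of slides 11-17 is as a composition of homogeneous transforms: the tracker reports the head pose in the table frame, Head2Tab corrects the tracker's mounting mis-orientation, and Head2LCam/Head2RCam are fixed offsets from the head tracker to each eye's camera. The sketch below composes 4x4 matrices in that order; the function names and the composition order are assumptions based on the slide labels, not code from the project.

#include <array>

using Mat4 = std::array<double, 16>;   // row-major 4x4 homogeneous transform

Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 c{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                c[i * 4 + j] += a[i * 4 + k] * b[k * 4 + j];
    return c;
}

// tab2GL    : table/receiver frame -> OpenGL world frame (slides 11-13)
// headInTab : head-tracker pose in the table frame, after the Head2Tab
//             alignment of slide 14
// head2LCam : fixed offset from the head tracker to the left eye camera
Mat4 leftCameraInGL(const Mat4& tab2GL, const Mat4& headInTab,
                    const Mat4& head2LCam) {
    return mul(mul(tab2GL, headInTab), head2LCam);   // same chain with Head2RCam for the right eye
}
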
  • 18. Head Tracking: Yaw, Pitch, Roll. Based on the position and orientation of the head, the view of the scene can be changed: look around, look closer.
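
The slides do not show how yaw/pitch/roll are turned into a view; the sketch below assumes a Z-Y-X (yaw-pitch-roll) Euler convention, which may differ from the convention the Polhemus tracker actually reports.

#include <array>
#include <cmath>

using Mat3 = std::array<double, 9>;   // row-major 3x3 rotation

// Head orientation from Euler angles (radians), assuming R = Rz(yaw) * Ry(pitch) * Rx(roll).
Mat3 headRotation(double yaw, double pitch, double roll) {
    double cy = std::cos(yaw),   sy = std::sin(yaw);
    double cp = std::cos(pitch), sp = std::sin(pitch);
    double cr = std::cos(roll),  sr = std::sin(roll);
    return {
        cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr,
        sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr,
        -sp,     cp * sr,                cp * cr
    };
}
// The view (camera) rotation is the transpose (inverse) of the head rotation;
// translating by the negated head position then gives "look around / look closer".
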
  • 19. Colocation. [Diagram: left-eye and right-eye views of the real world and the virtual world; the actual 3D view may differ based on head orientation.]
  • 20. Issues. Limited HMD FoV (40º): increasing the FoV beyond 40º skews the scene and impedes proper depth perception, so the present working area is restricted to a 40º FoV for a realistic view and depth perception. Variable viewport: the eye's viewport/FoV changes dynamically with the point where the eye is looking, which cannot be reproduced in the virtual world because we do not know where the real eye is looking in the scene; binocular eye trackers, which are in the process of being acquired, would solve this problem.
  • 21. Issues. Non-smooth EM tracker data: the position/orientation data obtained from the electromagnetic trackers is noisy, and scene rendering depends heavily on data from multiple trackers, so the rendered scene appears jittery. Solution: use Kalman filters to smooth the data, as sketched below.
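
As a sketch of the proposed fix, the following is a minimal scalar (per-axis) Kalman filter with a random-walk state model; the process and measurement noise values q and r are placeholders to be tuned against the actual tracker data, not values from the project.

// Minimal scalar Kalman filter for smoothing one tracker axis (a position
// component or an Euler angle), assuming a random-walk state model.
struct ScalarKalman {
    double x = 0.0;    // filtered estimate
    double p = 1.0;    // estimate variance
    double q = 1e-4;   // process noise: how fast the true value can drift (placeholder)
    double r = 1e-2;   // measurement noise: tracker jitter (placeholder)

    double update(double z) {          // z = raw tracker sample
        p += q;                        // predict: variance grows by the process noise
        double k = p / (p + r);        // Kalman gain
        x += k * (z - x);              // correct the estimate toward the measurement
        p *= (1.0 - k);                // shrink the variance after the correction
        return x;
    }
};
// Usage: one filter per tracked quantity (e.g. head position x/y/z and
// yaw/pitch/roll), fed every frame before the pose is passed to the renderer.
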
  • 22. Thank you.
