COSC 426 lect. 4: AR Interaction
  1. Lecture 4: AR Interaction. Mark Billinghurst (mark.billinghurst@hitlabnz.org), Gun Lee (gun.lee@hitlabnz.org). Aug 2011. COSC 426: Augmented Reality
  2. Building Compelling AR Experiences: experiences, applications, interaction, authoring (tools), tracking and display (components). Sony CSL © 2004
  3. AR Interaction. Designing an AR system = interface design, using different input and output technologies. The objective is a high quality of user experience: ease of use and learning, performance and satisfaction.
  4. Interaction Tasks. 2D (from [Foley]): selection, text entry, quantify, position. 3D (from [Bowman]): navigation (travel/wayfinding), selection, manipulation, system control/data input. AR: 2D + 3D tasks and... more specific tasks? [Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48, 1984. [Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2005.
  5. AR Interfaces as Data Browsers. 2D/3D virtual objects are registered in 3D: "VR in the real world". Interaction: 2D/3D virtual viewpoint control. Applications: visualization, training.
  6. AR Information Browsers. Information is registered to real-world context: handheld AR displays. Interaction: manipulation of a window into information space. Applications: context-aware information displays. Rekimoto, et al. 1997
  7. Architecture
  8. Current AR Information Browsers. Mobile AR: GPS + compass. Many applications: Layar, Wikitude, Acrossair, PressLite, Yelp, AR Car Finder, ...
  9. Junaio. AR browser from Metaio (http://www.junaio.com/). AR browsing: GPS + compass, 2D/3D object placement, photos/live video, community viewing.
  10. Advantages and Disadvantages. Important class of AR interfaces: wearable computers, AR simulation, training. Limited interactivity: modification of virtual content is difficult. Rekimoto, et al. 1997
  11. 3D AR Interfaces. Virtual objects displayed in 3D physical space and manipulated: HMDs and 6DOF head tracking, 6DOF hand trackers for input. Interaction: viewpoint control, traditional 3D user interface interaction (manipulation, selection, etc.). Kiyokawa, et al. 2000
  12. AR 3D Interaction
  13. AR Graffiti. www.nextwall.net
  14. Advantages and Disadvantages. Important class of AR interfaces: entertainment, design, training. Advantages: the user can interact with 3D virtual objects everywhere in space; natural, familiar interaction. Disadvantages: usually no tactile feedback; the user has to use different devices for virtual and physical objects. Oshima, et al. 2000
  15. Augmented Surfaces and Tangible Interfaces. Basic principles: virtual objects are projected on a surface; physical objects are used as controls for virtual objects; support for collaboration.
  16. Augmented Surfaces. Rekimoto, et al. 1998: front projection, marker-based tracking, multiple projection surfaces.
  17. Tangible User Interfaces (Ishii 97). Create digital shadows for physical objects. Foreground: graspable UI. Background: ambient interfaces.
  18. Tangible Interfaces - Ambient. Dangling String (Jeremijenko 1995): ambient ethernet monitor, relies on peripheral cues. Ambient Fixtures (Dahley, Wisneski, Ishii 1998): use natural material qualities for information display.
  19. Tangible Interface: ARgroove. Collaborative instrument exploring physically based interaction: map physical actions to MIDI output - translation, rotation; tilt, shake.
  20. ARgroove in Use
  21. Visual Feedback. Continuous visual feedback is key. A single virtual image provides: rotation, tilt, height.
  22. i/O Brush (Ryokai, Marti, Ishii)
  23. Other Examples. Triangles (Gorbert 1998): triangle-based storytelling. ActiveCube (Kitamura 2000-): cubes with sensors.
  24. Lessons from Tangible Interfaces. Physical objects make us smart (Norman's "Things that Make Us Smart"): they encode affordances and constraints. Objects aid collaboration: they establish shared meaning. Objects increase understanding: they serve as cognitive artifacts.
  25. TUI Limitations. Difficult to change object properties: can't tell the state of digital data. Limited display capabilities: projection screen = 2D, dependent on the physical display surface. Separation between object and display (ARgroove).
  26. Advantages and Disadvantages. Advantages: natural - the user's hands are used for interacting with both virtual and real objects; no need for special-purpose input devices. Disadvantages: spatial gap - interaction is limited to the 2D surface (full 3D interaction and manipulation is difficult); separation between interaction object and display.
  27. Orthogonal Nature of AR Interfaces
  28. Back to the Real World. AR overcomes limitations of TUIs: enhance display possibilities, merge task/display space, provide public and private views. TUI + AR = Tangible AR: apply TUI methods to AR interface design.
  29. Space vs. Time - Multiplexed. Space-multiplexed: many devices, each with one function - quicker to use, more intuitive, but cluttered (e.g. a real toolbox). Time-multiplexed: one device with many functions - space efficient (e.g. the mouse).
  30. Tangible AR: Tiles (Space-Multiplexed). Tile semantics: data tiles, operation tiles. Operations on tiles: proximity, spatial arrangements, space-multiplexed.
  31. Space-multiplexed Interface. Data authoring in Tiles
  32. Proximity-based Interaction
  33. Object-Based Interaction: MagicCup. Intuitive virtual object manipulation on a table-top workspace. Time-multiplexed. Multiple markers - robust tracking. Tangible user interface - intuitive manipulation. Stereo display - good presence.
  34. MagicCup System. Main table, menu table, cup interface
  35. Tangible AR: Time-multiplexed Interaction. Use of natural physical object manipulations to control virtual objects. VOMAR demo. Catalog book: turn over the page. Paddle operation: push, shake, incline, hit, scoop.
  36. VOMAR Interface
  37. Advantages and Disadvantages. Advantages: natural interaction with virtual and physical tools - no need for special-purpose input devices; spatial interaction with virtual objects - 3D manipulation of virtual objects anywhere in physical space. Disadvantages: requires a head-mounted display.
  38. Wrap-up. Browsing interfaces: simple (conceptually!), unobtrusive. 3D AR interfaces: expressive, creative, require attention. Tangible interfaces: embedded into conventional environments. Tangible AR: combines TUI input + AR display.
  39. Designing AR Applications
  40. Interface Design Path. 1/ Prototype demonstration. 2/ Adoption of interaction techniques from other interface metaphors (Augmented Reality). 3/ Development of new interface metaphors appropriate to the medium (Virtual Reality). 4/ Development of formal theoretical models for predicting and modeling user actions (Desktop WIMP).
  41. AR Design Space. Reality - Augmented Reality - Virtual Reality: physical design at the reality end, virtual design at the VR end.
  42. AR is a mixture of physical affordance and virtual affordance. Physical: tangible controllers and objects. Virtual: virtual graphics and audio.
  43. AR Design Principles. Interface components: physical components, display elements (visual/audio), interaction metaphors. Physical elements (input) -> interaction metaphor -> display elements (output).
  44. Tangible AR Metaphor. AR overcomes limitations of TUIs: enhance display possibilities, merge task/display space, provide public and private views. TUI + AR = Tangible AR: apply TUI methods to AR interface design.
  45. Tangible AR Design Principles. Tangible AR interfaces use TUI principles: physical controllers for moving virtual content; support for spatial 3D interaction techniques; support for multi-handed interaction; match object affordances to task requirements; support parallel activity with multiple objects; allow collaboration between multiple users.
  46. Case Study 1: 3D AR Lens. Goal: develop a lens-based AR interface. MagicLenses: developed at Xerox PARC in 1993; view a region of the workspace differently from the rest; overlap MagicLenses to create composite effects.
  47. 3D MagicLenses. MagicLenses extended to 3D (Viega et al. 96): volumetric and flat lenses.
  48. AR Lens Design Principles. Physical components: lens handle - virtual lens attached to a real object. Display elements: lens view - reveals layers in the dataset. Interaction metaphor: physically holding the lens.
  49. 3D AR Lenses: Model Viewer. Displays models made up of multiple parts. Each part can be shown or hidden through the lens. Allows the user to peer inside the model. Maintains focus + context.
  50. AR Lens Demo
  51. AR FlexiLens. Real handles/controllers with a flexible AR lens
  52. Techniques based on AR Lenses. Object selection: select objects by targeting them with the lens. Information filtering: show different representations through the lens; hide certain content to reduce clutter, look inside things.
  53. Case Study 2: LevelHead. Block-based game
  54. Case Study 2: LevelHead. Physical components: real blocks. Display elements: virtual person and rooms. Interaction metaphor: blocks are rooms.
  55. Case Study 3: AR Chemistry (Fjeld 2002). Tangible AR chemistry education
  56. Goal: an AR application to test molecular structure in chemistry. Physical components: real book, rotation cube, scoop, tracking markers. Display elements: AR atoms and molecules. Interaction metaphor: build your own molecule.
  57. AR Chemistry Input Devices
  58. Case Study 4: Transitional Interfaces. Goal: an AR interface supporting transitions from reality to virtual reality. Physical components: real book. Display elements: AR and VR content. Interaction metaphor: book pages hold virtual scenes.
  59. Milgram's Continuum (1994). The Reality-Virtuality continuum: Reality (tangible interfaces) - Augmented Reality (AR) - Augmented Virtuality (AV) - Virtuality (virtual reality), with AR and AV together forming Mixed Reality (MR). Central hypothesis: the next generation of interfaces will support transitions along the Reality-Virtuality continuum.
  60. Transitions. Interfaces of the future will need to support transitions along the RV continuum. Augmented Reality is preferred for co-located collaboration. Immersive Virtual Reality is preferred for experiencing a world immersively (egocentric), sharing views, and remote collaboration.
  61. The MagicBook. Design goals: allow the user to move smoothly between reality and virtual reality; support collaboration.
  62. MagicBook Metaphor
  63. Features. Seamless transition between reality and virtuality: reliance on the real decreases as the virtual increases. Supports egocentric and exocentric views: the user can pick the appropriate view. The computer becomes invisible: consistent interface metaphors, virtual content seems real. Supports collaboration.
  64. Design alternatives for common user tasks in Tangible AR (user task -> interface design options):
      Viewpoint control: camera on HMD; fixed camera (top view, front view, mirroring); handheld camera.
      Selection: statically paired virtual and physical objects; dynamically paired virtual and physical objects (paddle, pointer).
      3D manipulation: direct mapping of the whole 6DOF; filtered/distorted mapping (snapping, non-linear mapping); multiplexed mapping (rotation from one object, position from another); location and pose based (proximity, spatial configuration); gestures with props (tilt, shake).
      Event & command: keyboard & mouse; menu, 2D/3D GUI with tracked objects as pointers; buttons; occlusion-based interaction; custom hardware devices.
  65. Design Tips for Tangible AR. Use metaphors from the real world. Take advantage of parallel interactions. Use timers to prevent accidents. Consider the interaction volume - user, tracking, whitespace. What happens when tracking gets lost or an object is out of view? Problems in visualization: field of view, occlusions.
  66. OSGART: From Registration to Interaction
  67. Keyboard and Mouse Interaction. Traditional input techniques. OSG provides a framework for handling keyboard and mouse input events (osgGA): 1. Subclass osgGA::GUIEventHandler. 2. Handle events: mouse up/down/move/drag/scroll-wheel, key up/down. 3. Add an instance of the new handler to the viewer.
  68. Keyboard and Mouse Interaction. Create your own event handler class:

      class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
      public:
          KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

          virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                              osg::Object* obj, osg::NodeVisitor* nv) {
              switch (ea.getEventType()) {
                  // Possible events we can handle
                  case osgGA::GUIEventAdapter::PUSH: break;
                  case osgGA::GUIEventAdapter::RELEASE: break;
                  case osgGA::GUIEventAdapter::MOVE: break;
                  case osgGA::GUIEventAdapter::DRAG: break;
                  case osgGA::GUIEventAdapter::SCROLL: break;
                  case osgGA::GUIEventAdapter::KEYUP: break;
                  case osgGA::GUIEventAdapter::KEYDOWN: break;
              }
              return false;
          }
      };

      Add it to the viewer to receive events:

      viewer.addEventHandler(new KeyboardMouseEventHandler());
  69. Keyboard Interaction. Handle the W, A, S, D keys to move an object:

      case osgGA::GUIEventAdapter::KEYDOWN: {
          switch (ea.getKey()) {
              case 'w': // Move forward 5mm
                  localTransform->preMult(osg::Matrix::translate(0, -5, 0));
                  return true;
              case 's': // Move back 5mm
                  localTransform->preMult(osg::Matrix::translate(0, 5, 0));
                  return true;
              case 'a': // Rotate 10 degrees left
                  localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
                  return true;
              case 'd': // Rotate 10 degrees right
                  localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
                  return true;
              case ' ': // Reset the transformation
                  localTransform->setMatrix(osg::Matrix::identity());
                  return true;
          }
      } break;

      localTransform = new osg::MatrixTransform();
      localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
      arTransform->addChild(localTransform.get());
  70. Keyboard Interaction Demo
  71. Mouse Interaction. The mouse is a pointing device... Use the mouse to select objects in an AR scene. OSG provides methods for ray-casting and intersection testing: they return an osg::NodePath (the path from the hit node all the way back to the root). The ray is cast from the clicked point on the projection plane (screen) into the scene.
  72. Mouse Interaction. Compute the list of nodes under the clicked position and invoke an action on the nodes that are hit, e.g. select, delete:

      case osgGA::GUIEventAdapter::PUSH:
          osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
          osgUtil::LineSegmentIntersector::Intersections intersections;

          // Clear previous selections
          for (unsigned int i = 0; i < targets.size(); i++) {
              targets[i]->setSelected(false);
          }

          // Find new selection based on click position
          if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
              for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
                   iter != intersections.end(); iter++) {
                  if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                      std::cout << "HIT!" << std::endl;
                      target->setSelected(true);
                      return true;
                  }
              }
          }
          break;
  73. Mouse Interaction Demo
  74. Proximity Techniques. Interaction based on: the distance between a marker and the camera; the distance between multiple markers.
  75. Single Marker Techniques: Proximity. Use the distance from the camera to the marker as an input parameter, e.g. lean in close to examine. Can use the osg::LOD class to show different content at different depth ranges. Image: OpenSG Consortium
  76. Single Marker Techniques: Proximity.

      // Load some models
      osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
      osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
      osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

      // Use a Level-Of-Detail node to show each model at different distance ranges.
      osg::ref_ptr<osg::LOD> lod = new osg::LOD();
      lod->addChild(farNode.get(), 500.0f, 10000.0f);   // Show the "far" node from 50cm to 10m away
      lod->addChild(closerNode.get(), 200.0f, 500.0f);  // Show the "closer" node from 20cm to 50cm away
      lod->addChild(nearNode.get(), 0.0f, 200.0f);      // Show the "near" node from 0cm to 20cm away

      arTransform->addChild(lod.get());

      Define depth ranges for each node; add as many as you want; ranges can overlap.
  77. Single Marker Proximity Demo
  78. Multiple Marker Concepts. Interaction based on the relationship between markers, e.g. when the distance between two markers decreases below a threshold, invoke an action (tangible user interface). Applications: memory card games, file operations.
  79. Multiple Marker Proximity. Scene-graph diagram: the virtual camera parents Transform A and Transform B; each transform holds a switch node (Switch A, Switch B) that chooses between two models (A1/A2 and B1/B2). While the marker distance is above the threshold, each switch shows its first child model (A1, B1).
  80. Multiple Marker Proximity. The same scene graph when the distance is at or below the threshold: each switch flips to its second child model (A2, B2).
  81. Multiple Marker Proximity. Use a node callback to test for proximity and update the relevant nodes:

      virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
          if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
              if (mMarkerA->valid() && mMarkerB->valid()) {
                  osg::Vec3 posA = mMarkerA->getTransform().getTrans();
                  osg::Vec3 posB = mMarkerB->getTransform().getTrans();
                  osg::Vec3 offset = posA - posB;
                  float distance = offset.length();

                  if (distance <= mThreshold) {
                      if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                      if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
                  } else {
                      if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                      if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
                  }
              }
          }
          traverse(node, nv);
      }
  82. Multiple Marker Proximity
  83. Paddle Interaction. Use one marker as a tool for selecting and manipulating objects (tangible user interface). Another marker provides a frame of reference. A grid of markers can alleviate problems with occlusion. MagicCup (Kato et al.), VOMAR (Kato et al.)
  84. Paddle Interaction. It is often useful to adopt a local coordinate system: it allows the camera to move without disrupting Tlocal, places the paddle in the same coordinate system as the content on the grid, and simplifies interaction. osgART computes Tlocal using the osgART::LocalTransformationCallback.
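      The callback itself ships with osgART; only the underlying transform arithmetic is sketched here. A minimal sketch, assuming marker objects whose getTransform() returns the marker-to-camera matrix as in the proximity callback above (the helper and variable names are hypothetical):

      #include <osg/Matrix>

      // Express the paddle marker in the base (grid) marker's frame.
      // In OSG's row-vector convention, T_paddle->base = T_paddle->camera * (T_base->camera)^-1.
      osg::Matrix computePaddleToBase(const osg::Matrix& paddleToCamera, const osg::Matrix& baseToCamera)
      {
          return paddleToCamera * osg::Matrix::inverse(baseToCamera);
      }

      // Usage, with marker objects as in the earlier node callback:
      //   osg::Matrix tLocal = computePaddleToBase(paddleMarker->getTransform(), baseMarker->getTransform());
      //   localPaddleTransform->setMatrix(tLocal);  // paddle content attached under the base marker's transform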
  85. Tilt and Shake Interaction. Detect types of paddle movement: tilt - gradual change in orientation; shake - short, sudden changes in translation.
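      The slides do not include detection code for these gestures; a rough sketch of one way to classify the two motions from successive local paddle transforms (the function name, thresholds, and units are illustrative assumptions, not osgART API):

      #include <osg/Matrix>
      #include <osg/Quat>
      #include <osg/Vec3d>
      #include <osg/Math>
      #include <cmath>

      enum PaddleMotion { PADDLE_NONE, PADDLE_TILT, PADDLE_SHAKE };

      // Compare this frame's paddle transform with the previous frame's (dt = frame time in seconds).
      PaddleMotion classifyPaddleMotion(const osg::Matrix& prev, const osg::Matrix& curr, double dt)
      {
          // Translation speed between frames (mm/s, since marker transforms are in millimetres)
          osg::Vec3d dPos = curr.getTrans() - prev.getTrans();
          double speed = dPos.length() / dt;

          // Angular speed between frames (rad/s), from the relative rotation
          double angle; osg::Vec3d axis;
          (curr.getRotate() * prev.getRotate().inverse()).getRotate(angle, axis);
          if (angle > osg::PI) angle = 2.0 * osg::PI - angle;   // handle quaternion double cover
          double angularSpeed = std::fabs(angle) / dt;

          if (speed > 400.0) return PADDLE_SHAKE;                       // short, sudden translation
          if (angularSpeed > 0.3 && speed < 100.0) return PADDLE_TILT;  // gradual orientation change
          return PADDLE_NONE;
      }

      A real implementation would also look for direction reversals over a short time window before reporting a shake, so that simply carrying the paddle across the workspace is not misread as shaking.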
  86. More Information. Mark Billinghurst: mark.billinghurst@hitlabnz.org. Gun Lee: gun.lee@hitlabnz.org. Websites: www.hitlabnz.org
  87. Building Tangible AR Interfaces with ARToolKit
  88. Required Code. Calculating camera position: range to marker. Loading multiple patterns/models. Interaction between objects: proximity, relative position/orientation. Occlusion: stencil buffering, multi-marker tracking.
  89. Tangible AR Coordinate Frames
  90. Local vs. Global Interactions. Local: actions determined from a single camera-to-marker transform - shaking, appearance, relative position, range. Global: actions determined from two relationships - marker-to-camera and world-to-camera coordinates; the marker transform is determined in world coordinates - object tilt, absolute position, absolute rotation, hitting.
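      For the global case, the single marker's camera-space transform has to be re-expressed in the world frame defined by the multi-marker base. A minimal sketch, assuming the ARToolKit matrix utilities arUtilMatInv and arUtilMatMul from <AR/ar.h> (the wrapper function itself is hypothetical):

      #include <AR/ar.h>

      /* marker_trans: marker -> camera (from arGetTransMat)
         base_trans:   world  -> camera (from arMultiGetTransMat)
         world_trans:  result, the marker's pose in world (base) coordinates */
      static void getMarkerInWorld(double marker_trans[3][4],
                                   double base_trans[3][4],
                                   double world_trans[3][4])
      {
          double cam_to_world[3][4];
          arUtilMatInv(base_trans, cam_to_world);                 /* invert world -> camera    */
          arUtilMatMul(cam_to_world, marker_trans, world_trans);  /* world <- camera <- marker */
      }

      /* world_trans[0..2][3] then gives the absolute position over the base, and the
         rotation part gives absolute tilt/rotation for global interactions. */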
  91. Range-based Interaction. Sample file: RangeTest.c

      /* get the camera transformation */
      arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

      /* find the range */
      Xpos = marker_trans[0][3];
      Ypos = marker_trans[1][3];
      Zpos = marker_trans[2][3];
      range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
  92. Loading Multiple Patterns. Sample file: LoadMulti.c. Uses object.c to load. Object structure:

      typedef struct {
          char   name[256];
          int    id;
          int    visible;
          double marker_coord[4][2];
          double trans[3][4];
          double marker_width;
          double marker_center[2];
      } ObjectData_T;
  93. Finding Multiple Transforms. Create the object list:

      ObjectData_T *object;

      Read in objects - in init():

      read_ObjData( char *name, int *objectnum );

      Find transforms - in mainLoop():

      for( i = 0; i < objectnum; i++ ) {
          ...check patterns
          ...find transforms for each marker
      }
  94. Drawing Multiple Objects. Send the object list to the draw function:

      draw( object, objectnum );

      Draw each object individually:

      for( i = 0; i < objectnum; i++ ) {
          if( object[i].visible == 0 ) continue;
          argConvGlpara(object[i].trans, gl_para);
          draw_object( object[i].id, gl_para );
      }
  95. Proximity Based Interaction. Sample file - CollideTest.c. Detect the distance between markers:

      checkCollisions(object[0], object[1], DIST)

      If the distance < collide distance, then change the model / perform the interaction.
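      CollideTest.c itself is not reproduced in the slides; a minimal sketch of what such a checkCollisions() helper could look like, using the ObjectData_T structure from slide 92 (the threshold logic is an assumption):

      #include <math.h>

      /* Returns 1 if both markers are visible and their camera-space positions are
         closer together than collideDist (millimetres), 0 otherwise. */
      static int checkCollisions(ObjectData_T *obj1, ObjectData_T *obj2, double collideDist)
      {
          double dx, dy, dz, dist;

          if (!obj1->visible || !obj2->visible) return 0;   /* both markers must be tracked */

          dx = obj1->trans[0][3] - obj2->trans[0][3];
          dy = obj1->trans[1][3] - obj2->trans[1][3];
          dz = obj1->trans[2][3] - obj2->trans[2][3];
          dist = sqrt(dx*dx + dy*dy + dz*dz);

          return (dist < collideDist) ? 1 : 0;
      }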
  96. Multi-marker Tracking. Sample file - multiTest.c. Multiple markers establish a single coordinate frame: reading in a configuration file; tracking from sets of markers; careful camera calibration.
  97. MultiMarker Configuration File. Sample file - Data/multi/marker.dat. Contains a list of all the patterns and their exact positions:

      #the number of patterns to be recognized
      6

      #marker 1
      Data/multi/patt.a                   (pattern file)
      40.0                                (pattern width)
      0.0 0.0                             (coordinate origin)
      1.0000 0.0000 0.0000 -100.0000      (pattern transform
      0.0000 1.0000 0.0000   50.0000       relative to the
      0.0000 0.0000 1.0000    0.0000       global origin)
      ...
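      The configuration file is read once at start-up; a minimal sketch, assuming the standard arMultiReadConfigFile() function from <AR/arMulti.h>:

      #include <stdio.h>
      #include <stdlib.h>
      #include <AR/arMulti.h>

      ARMultiMarkerInfoT *config;

      static void loadMultiMarkerConfig(void)
      {
          if ((config = arMultiReadConfigFile("Data/multi/marker.dat")) == NULL) {
              printf("config data load error !!\n");
              exit(0);
          }
          /* 'config' is later passed to arMultiGetTransMat() in mainLoop(); see the next slide. */
      }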
  98. Camera Transform Calculation. Include <AR/arMulti.h>. Link to libARMulti.lib. In mainLoop(), detect markers as usual:

      arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num)

      Use the MultiMarker function:

      if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
          argSwapBuffers();
          return;
      }
  99. Paddle-based Interaction. Tracking a single marker relative to the multi-marker set - the paddle contains a single marker.
  100. Paddle Interaction Code. Sample file - PaddleDemo.c. Get the paddle marker location + draw the paddle before drawing the background model:

      paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

      /* draw the paddle */
      if( paddleInfo->active ){
          draw_paddle( paddleInfo );
      }

      draw_paddle uses a stencil buffer to increase realism.
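      The draw_paddle source is not shown in the slides; an illustrative sketch of the stencil-buffer idea it relies on: first mask out the paddle's footprint, then draw virtual content only outside that mask, so the real paddle (and the hand holding it) stays visible in the video. The two drawing-pass function names are hypothetical:

      #include <GL/gl.h>

      static void draw_paddle_with_stencil(void)
      {
          glClearStencil(0);
          glClear(GL_STENCIL_BUFFER_BIT);

          /* Pass 1: write the paddle's footprint into the stencil buffer only (no colour output). */
          glEnable(GL_STENCIL_TEST);
          glStencilFunc(GL_ALWAYS, 1, 1);
          glStencilOp(GL_KEEP, GL_REPLACE, GL_REPLACE);
          glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
          draw_paddle_shape();                    /* hypothetical: fills the paddle outline */
          glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

          /* Pass 2: draw virtual content everywhere except the stencilled paddle region. */
          glStencilFunc(GL_NOTEQUAL, 1, 1);
          glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
          draw_virtual_scene();                   /* hypothetical: the AR content on the grid */
          glDisable(GL_STENCIL_TEST);
      }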
  101. Paddle Interaction Code II. Sample file - paddleDrawDemo.c. Finds the paddle position relative to the global coordinate frame:

      setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4])

      Sample file - paddleTouch.c. Finds the paddle position:

      findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);

      Checks for collisions:

      checkCollision(&curPaddlePos, myTarget[i].pos, 20.0)
  102. General Tangible AR Library. command_sub.c, command_sub.h. Contains functions for recognizing a range of different paddle motions:

      int check_shake( );
      int check_punch( );
      int check_incline( );
      int check_pickup( );
      int check_push( );

      E.g. to check the angle between the paddle and the base:

      check_incline(paddle->trans, base->trans, &ang)
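      A brief usage sketch in the spirit of the paddle demos above, using only the check_incline() call whose arguments appear on this slide; it assumes check_incline() returns non-zero when the paddle is tilted past the library's threshold and writes the measured angle into ang (the surrounding names are otherwise assumptions):

      #include <stdio.h>
      #include "command_sub.h"

      /* Called once per frame, after the paddle and base transforms have been updated. */
      static void handlePaddleGestures(void)
      {
          double ang;

          /* Angle between the paddle and the base (multi-marker) plane. */
          if (check_incline(paddle->trans, base->trans, &ang)) {
              printf("paddle inclined, angle = %f\n", ang);
              /* e.g. tip the paddle to pour a virtual object onto the scene */
          }
      }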