
COSC 426 Lecture 6b: AR Interaction

COSC 426 Lecture 6b on AR Interaction.


  1. COSC 426: Augmented Reality
     Lecture 6: AR Interaction
     Mark Billinghurst (mark.billinghurst@hitlabnz.org), August 14th, 2012
  2. Building Compelling AR Experiences
     • A layered stack: Tracking and Display components at the base, Authoring tools above them, Interaction at the application level, and complete experiences at the top. (Sony CSL © 2004)
  3. AR Interaction
     • Designing an AR system = interface design
       - Using different input and output technologies
     • Objective is a high-quality user experience
       - Ease of use and learning
       - Performance and satisfaction
  4. User Interface and Tool
     • Diagram: Human ↔ User Interface/Tool ↔ Machine/Object
     • Tools act as the human-machine interface. (© Andreas Dünser)
  5. User Interface: Characteristics
     • Input: mono- or multimodal
     • Output: mono- or multisensorial (e.g. sensation of movement)
     • Technique/Metaphor/Paradigm, e.g. "push" to accelerate, "turn" to rotate (© Andreas Dünser)
  6. Human Computer Interface
     • Human ↔ User Interface ↔ Computer System
     • Human-computer interface = hardware + software
     • The computer is everywhere now: electronic devices, home automation, transport vehicles, etc. (© Andreas Dünser)
  7. More Terminology
     • Interaction device = input/output of the user interface
     • Interaction style = category of similar interaction techniques
     • Interaction paradigm
     • Modality (human sense)
     • Usability
  8. Back to AR
     • You can see spatially registered AR... how can you interact with it?
  9. Interaction Tasks
     • 2D (from [Foley]): selection, text entry, quantify, position
     • 3D (from [Bowman]): navigation (travel/wayfinding), selection, manipulation, system control/data input
     • AR: 2D + 3D tasks and... more specific tasks?
     [Foley] J. D. Foley, V. Wallace & P. Chan, "The Human Factors of Computer Graphics Interaction Techniques", IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
     [Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev, "3D User Interfaces: Theory and Practice", Addison-Wesley, 2005.
  10. AR Interfaces as Data Browsers
     • 2D/3D virtual objects are registered in 3D ("VR in the real world")
     • Interaction: 2D/3D virtual viewpoint control
     • Applications: visualization, training
  11. AR Information Browsers
     • Information is registered to a real-world context (handheld AR displays)
     • Interaction: manipulation of a window into the information space
     • Applications: context-aware information displays (Rekimoto et al., 1997)
  12. Architecture
  13. Current AR Information Browsers
     • Mobile AR: GPS + compass
     • Many applications: Layar, Wikitude, Acrossair, PressLite, Yelp, AR Car Finder, ...
  14. Junaio
     • AR browser from Metaio (http://www.junaio.com/)
     • AR browsing: GPS + compass, 2D/3D object placement, photos/live video, community viewing
  15. Web Interface
  16. Adding Models in the Web Interface
  17. Advantages and Disadvantages
     • Important class of AR interfaces: wearable computers; AR simulation, training
     • Limited interactivity: modification of virtual content is difficult (Rekimoto et al., 1997)
  18. 3D AR Interfaces
     • Virtual objects displayed in 3D physical space and manipulated
       - HMDs and 6 DOF head tracking
       - 6 DOF hand trackers for input
     • Interaction: viewpoint control; traditional 3D user interface interaction (manipulation, selection, etc.) (Kiyokawa et al., 2000)
  19. AR 3D Interaction
  20. AR Graffiti (www.nextwall.net)
  21. Advantages and Disadvantages
     • Important class of AR interfaces: entertainment, design, training
     • Advantages:
       - User can interact with 3D virtual objects anywhere in space
       - Natural, familiar interaction
     • Disadvantages:
       - Usually no tactile feedback
       - User has to use different devices for virtual and physical objects (Oshima et al., 2000)
  22. Augmented Surfaces and Tangible Interfaces
     • Basic principles:
       - Virtual objects are projected onto a surface
       - Physical objects are used as controls for virtual objects
       - Support for collaboration
  23. Augmented Surfaces
     • Rekimoto et al., 1998: front projection, marker-based tracking, multiple projection surfaces
  24. Tangible User Interfaces (Ishii, 1997)
     • Create digital shadows for physical objects
     • Foreground: graspable UI
     • Background: ambient interfaces
  25. Tangible Interfaces: Ambient
     • Dangling String (Jeremijenko, 1995): ambient Ethernet monitor; relies on peripheral cues
     • Ambient fixtures (Dahley, Wisneski, Ishii, 1998): use natural material qualities for information display
  26. Tangible Interface: ARgroove
     • Collaborative instrument
     • Exploring physically based interaction: map physical actions to MIDI output
       - Translation, rotation
       - Tilt, shake
  27. ARgroove in Use
  28. Visual Feedback
     • Continuous visual feedback is key
     • A single virtual image provides: rotation, tilt, height
  29. I/O Brush (Ryokai, Marti, Ishii)
  30. Other Examples
     • Triangles (Gorbet, 1998): triangle-based storytelling
     • ActiveCube (Kitamura, 2000-): cubes with sensors
  31. Lessons from Tangible Interfaces
     • Physical objects make us smart (Norman's "Things That Make Us Smart"): they encode affordances and constraints
     • Objects aid collaboration: establish shared meaning
     • Objects increase understanding: serve as cognitive artifacts
  32. TUI Limitations
     • Difficult to change object properties: can't tell the state of digital data
     • Limited display capabilities: projection screen = 2D; dependent on the physical display surface
     • Separation between object and display (e.g. ARgroove)
  33. Advantages and Disadvantages
     • Advantages:
       - Natural: the user's hands are used for interacting with both virtual and real objects
       - No need for special-purpose input devices
     • Disadvantages:
       - Interaction is limited to the 2D surface; full 3D interaction and manipulation is difficult
  34. Orthogonal Nature of AR Interfaces
  35. Back to the Real World
     • AR overcomes limitations of TUIs:
       - Enhances display possibilities
       - Merges task and display space
       - Provides public and private views
     • TUI + AR = Tangible AR: apply TUI methods to AR interface design
  36. Space-Multiplexed vs. Time-Multiplexed
     • Space-multiplexed: many devices, each with one function
       - Quicker to use, more intuitive, but more clutter
       - e.g. a real toolbox
     • Time-multiplexed: one device with many functions
       - Space efficient
       - e.g. the mouse
  37. Tangible AR: Tiles (Space-Multiplexed)
     • Tiles semantics: data tiles, operation tiles
     • Operations on tiles: proximity, spatial arrangements, space-multiplexed
  38. Space-Multiplexed Interface: Data Authoring in Tiles
  39. Proximity-based Interaction
  40. Object-Based Interaction: MagicCup
     • Intuitive virtual object manipulation on a table-top workspace
     • Time-multiplexed
     • Multiple markers: robust tracking
     • Tangible user interface: intuitive manipulation
     • Stereo display: good presence
  41. Our System
     • Main table, menu table, cup interface
  42. Tangible AR: Time-Multiplexed Interaction
     • Use natural physical object manipulations to control virtual objects
     • VOMAR demo:
       - Catalog book: turn over a page
       - Paddle operations: push, shake, incline, hit, scoop
  43. VOMAR Interface
  44. Advantages and Disadvantages
     • Advantages:
       - Natural interaction with virtual and physical tools; no need for special-purpose input devices
       - Spatial interaction with virtual objects: 3D manipulation of virtual objects anywhere in physical space
     • Disadvantages:
       - Requires a head-mounted display
  45. Wrap-up
     • Browsing interfaces: simple (conceptually!), unobtrusive
     • 3D AR interfaces: expressive, creative, require attention
     • Tangible interfaces: embedded into conventional environments
     • Tangible AR: combines TUI input + AR display
  46. AR User Interface: Categorization
     • Traditional desktop: keyboard, mouse, joystick (with or without a 2D/3D GUI)
     • Specialized/VR device: 3D VR devices, specially designed devices
  47. AR User Interface: Categorization
     • Tangible interface: using physical objects
     • Hand/touch interface: using pose and gestures of the hand and fingers
     • Body interface: using movement of the body
  48. AR User Interface: Categorization
     • Speech interface: voice, speech control
     • Multimodal interface: gesture + speech
     • Haptic interface: haptic feedback
     • Eye tracking, physiological, brain-computer interfaces...
  49. OSGART: From Registration to Interaction
  50. Keyboard and Mouse Interaction
     • Traditional input techniques
     • OSG provides a framework for handling keyboard and mouse input events (osgGA):
       1. Subclass osgGA::GUIEventHandler
       2. Handle events: mouse up/down/move/drag/scroll-wheel, key up/down
       3. Add an instance of the new handler to the viewer
  51. Keyboard and Mouse Interaction
     • Create your own event handler class:

       class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
       public:
           KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

           virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                               osg::Object* obj, osg::NodeVisitor* nv) {
               switch (ea.getEventType()) {
                   // Possible events we can handle
                   case osgGA::GUIEventAdapter::PUSH:    break;
                   case osgGA::GUIEventAdapter::RELEASE: break;
                   case osgGA::GUIEventAdapter::MOVE:    break;
                   case osgGA::GUIEventAdapter::DRAG:    break;
                   case osgGA::GUIEventAdapter::SCROLL:  break;
                   case osgGA::GUIEventAdapter::KEYUP:   break;
                   case osgGA::GUIEventAdapter::KEYDOWN: break;
               }
               return false;
           }
       };

     • Add it to the viewer to receive events:

       viewer.addEventHandler(new KeyboardMouseEventHandler());
  52. Keyboard Interaction
     • Handle the W, A, S, D keys to move an object:

       case osgGA::GUIEventAdapter::KEYDOWN: {
           switch (ea.getKey()) {
               case 'w': // Move forward 5mm
                   localTransform->preMult(osg::Matrix::translate(0, -5, 0));
                   return true;
               case 's': // Move back 5mm
                   localTransform->preMult(osg::Matrix::translate(0, 5, 0));
                   return true;
               case 'a': // Rotate 10 degrees left
                   localTransform->preMult(osg::Matrix::rotate(
                       osg::DegreesToRadians(10.0f), osg::Z_AXIS));
                   return true;
               case 'd': // Rotate 10 degrees right
                   localTransform->preMult(osg::Matrix::rotate(
                       osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
                   return true;
               case ' ': // Space: reset the transformation
                   localTransform->setMatrix(osg::Matrix::identity());
                   return true;
           }
       } break;

     • The transform node being manipulated is set up as:

       localTransform = new osg::MatrixTransform();
       localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
       arTransform->addChild(localTransform.get());
  53. Keyboard Interaction Demo
  54. Mouse Interaction
     • The mouse is a pointing device...
     • Use the mouse to select objects in an AR scene
     • OSG provides methods for ray-casting and intersection testing
       - Returns an osg::NodePath (the path from the hit node all the way back to the root)
     • A ray is cast from the clicked position on the projection plane (screen) into the scene
  55. Mouse Interaction
     • Compute the list of nodes under the clicked position
     • Invoke an action on nodes that are hit, e.g. select, delete:

       case osgGA::GUIEventAdapter::PUSH: {
           osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
           osgUtil::LineSegmentIntersector::Intersections intersections;

           // Clear previous selections
           for (unsigned int i = 0; i < targets.size(); i++) {
               targets[i]->setSelected(false);
           }

           // Find new selection based on click position
           if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
               for (osgUtil::LineSegmentIntersector::Intersections::iterator iter =
                        intersections.begin(); iter != intersections.end(); iter++) {
                   if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                       std::cout << "HIT!" << std::endl;
                       target->setSelected(true);
                       return true;
                   }
               }
           }
       } break;
  56. Mouse Interaction Demo
  57. Proximity Techniques
     • Interaction based on:
       - the distance between a marker and the camera (see the sketch below)
       - the distance between multiple markers
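As an illustration of the first case, here is a minimal sketch (not from the original slides) of reading the camera-to-marker distance out of a marker transform. It assumes the transform maps marker coordinates into camera coordinates, so its translation component is the marker position in camera space:

       #include <osg/Matrix>
       #include <osg/Vec3>

       // Hypothetical helper: distance from the camera to a tracked marker.
       // Assumes markerToCamera maps marker coordinates into camera
       // coordinates, so its translation is the marker position in camera
       // space (in millimetres for ARToolKit-style tracking).
       float distanceToMarker(const osg::Matrix& markerToCamera)
       {
           osg::Vec3 position = markerToCamera.getTrans();
           return position.length();
       }

A value like this could drive the osg::LOD ranges shown on the next slides, or any other depth-dependent behaviour.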
  58. Single Marker Techniques: Proximity
     • Use the distance from the camera to the marker as an input parameter
       - e.g. lean in close to examine an object
     • Can use the osg::LOD class to show different content at different depth ranges (image: OpenSG Consortium)
  59. Single Marker Techniques: Proximity

       // Load some models
       osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
       osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
       osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

       // Use a Level-Of-Detail node to show each model at different distance ranges.
       osg::ref_ptr<osg::LOD> lod = new osg::LOD();
       lod->addChild(farNode.get(), 500.0f, 10000.0f); // Show the "far" node from 50cm to 10m away
       lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
       lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away

       arTransform->addChild(lod.get());

     • Define depth ranges for each node
     • Add as many as you want
     • Ranges can overlap
  60. Single Marker Proximity Demo
  61. Multiple Marker Concepts
     • Interaction based on the relationship between markers
       - e.g. when the distance between two markers drops below a threshold, invoke an action
       - Tangible user interface
     • Applications: memory card games, file operations
  62. Multiple Marker Proximity
     • Scene graph when distance > threshold: the virtual camera feeds Transform A and Transform B; Switch A (children Model A1, Model A2) and Switch B (children Model B1, Model B2) each show their first model.
  63. Multiple Marker Proximity
     • Scene graph when distance <= threshold: the same structure, but Switch A and Switch B each switch to their second model.
  64. Multiple Marker Proximity
     • Use a node callback to test for proximity and update the relevant nodes:

       virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
           if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
               if (mMarkerA->valid() && mMarkerB->valid()) {
                   osg::Vec3 posA = mMarkerA->getTransform().getTrans();
                   osg::Vec3 posB = mMarkerB->getTransform().getTrans();
                   osg::Vec3 offset = posA - posB;
                   float distance = offset.length();

                   if (distance <= mThreshold) {
                       if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                       if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
                   } else {
                       if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                       if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
                   }
               }
           }
           traverse(node, nv);
       }
  65. Multiple Marker Proximity
  66. Paddle Interaction
     • Use one marker as a tool for selecting and manipulating objects (tangible user interface)
     • Another marker provides a frame of reference
       - A grid of markers can alleviate problems with occlusion
     • Examples: MagicCup (Kato et al.), VOMAR (Kato et al.)
  67. Paddle Interaction
     • Often useful to adopt a local coordinate system (see the sketch below)
       - Allows the camera to move without disrupting Tlocal
       - Places the paddle in the same coordinate system as the content on the grid
       - Simplifies interaction
     • osgART computes Tlocal using the osgART::LocalTransformationCallback
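A minimal sketch of the idea behind Tlocal (the actual implementation inside osgART::LocalTransformationCallback may differ; the function name and arguments here are illustrative). Given the tracked base-to-camera and paddle-to-camera transforms, the camera pose can be cancelled out:

       #include <osg/Matrix>

       // Hypothetical sketch: express the paddle pose in the base grid's
       // frame. Both arguments map local coordinates into camera coordinates
       // (OSG's row-vector convention: v_camera = v_local * M).
       osg::Matrix computeLocalTransform(const osg::Matrix& paddleToCamera,
                                         const osg::Matrix& baseToCamera)
       {
           // Go from paddle coordinates to camera coordinates, then back
           // into base coordinates; the camera pose cancels, so Tlocal is
           // stable even while the camera moves.
           return paddleToCamera * osg::Matrix::inverse(baseToCamera);
       }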
  68. Tilt and Shake Interaction
     • Detect types of paddle movement (a detection sketch follows below):
       - Tilt: gradual change in orientation
       - Shake: short, sudden changes in translation
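A minimal sketch of such a classifier, assuming the paddle transform is sampled once per frame; the class name and thresholds are illustrative, not from the lecture:

       #include <osg/Matrix>
       #include <osg/Quat>
       #include <osg/Vec3>

       // Hypothetical paddle motion classifier. Call update() once per frame
       // with the current paddle transform and the frame time in seconds.
       class PaddleMotionDetector {
       public:
           const char* update(const osg::Matrix& paddleTransform, double dt)
           {
               osg::Quat rotation = paddleTransform.getRotate();
               osg::Vec3 position = paddleTransform.getTrans();

               // Angular change since the last frame (radians).
               double angle; osg::Vec3d axis;
               (rotation * mLastRotation.inverse()).getRotate(angle, axis);

               // Translational speed since the last frame (mm per second).
               double speed = (position - mLastPosition).length() / dt;

               mLastRotation = rotation;
               mLastPosition = position;

               if (speed > 500.0)    return "shake"; // sudden change in translation
               if (angle / dt > 0.5) return "tilt";  // gradual change in orientation
               return "none";
           }

       private:
           osg::Quat mLastRotation;
           osg::Vec3 mLastPosition;
       };

A more robust shake detector would look for repeated direction reversals over a short time window rather than a single fast frame.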
  69. Building Tangible AR Interfaces with ARToolKit
  70. Required Code
     • Calculating camera position: range to marker
     • Loading multiple patterns/models
     • Interaction between objects: proximity, relative position/orientation
     • Occlusion: stencil buffering, multi-marker tracking
  71. Tangible AR Coordinate Frames
  72. Local vs. Global Interactions
     • Local: actions determined from a single camera-to-marker transform
       - shaking, appearance, relative position, range
     • Global: actions determined from two relationships
       - marker-to-camera and world-to-camera coordinates
       - the marker transform is determined in world coordinates: object tilt, absolute position, absolute rotation, hitting
  73. Range-based Interaction
     • Sample file: RangeTest.c

       /* get the camera transformation */
       arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

       /* find the range */
       Xpos = marker_trans[0][3];
       Ypos = marker_trans[1][3];
       Zpos = marker_trans[2][3];
       range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
  74. Loading Multiple Patterns
     • Sample file: LoadMulti.c (uses object.c to load)
     • Object structure:

       typedef struct {
           char   name[256];
           int    id;
           int    visible;
           double marker_coord[4][2];
           double trans[3][4];
           double marker_width;
           double marker_center[2];
       } ObjectData_T;
  75. Finding Multiple Transforms
     • Create the object list:
       ObjectData_T *object;
     • Read in objects, in init():
       read_ObjData( char *name, int *objectnum );
     • Find the transforms, in mainLoop():
       for( i = 0; i < objectnum; i++ ) {
           ...check patterns
           ...find transforms for each marker
       }
  76. Drawing Multiple Objects
     • Send the object list to the draw function:
       draw( object, objectnum );
     • Draw each object individually:
       for( i = 0; i < objectnum; i++ ) {
           if( object[i].visible == 0 ) continue;
           argConvGlpara(object[i].trans, gl_para);
           draw_object( object[i].id, gl_para );
       }
  77. Proximity-Based Interaction
     • Sample file: CollideTest.c
     • Detect the distance between markers (a sketch of such a check follows below):
       checkCollisions(object[0], object[1], DIST);
     • If the distance is less than the collide distance, change the model / perform the interaction
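A minimal sketch of what such a check might look like (the actual CollideTest.c implementation may differ); it compares the distance between the translation columns of the two marker transforms against the threshold:

       #include <math.h>

       /* Hypothetical sketch of checkCollisions(): returns 1 when the two
          tracked objects are closer together than collide_dist (in mm).
          ObjectData_T is the structure from slide 74; trans[i][3] holds the
          marker position. */
       int checkCollisions( ObjectData_T object0, ObjectData_T object1,
                            double collide_dist )
       {
           double dx = object0.trans[0][3] - object1.trans[0][3];
           double dy = object0.trans[1][3] - object1.trans[1][3];
           double dz = object0.trans[2][3] - object1.trans[2][3];
           return sqrt(dx*dx + dy*dy + dz*dz) < collide_dist;
       }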
  78. Multi-marker Tracking
     • Sample file: multiTest.c
     • Multiple markers establish a single coordinate frame:
       - Reading in a configuration file
       - Tracking from sets of markers
       - Careful camera calibration
  79. MultiMarker Configuration File
     • Sample file: Data/multi/marker.dat
     • Contains a list of all the patterns and their exact positions:

       # the number of patterns to be recognized
       6

       # marker 1
       Data/multi/patt.a                  <- pattern file
       40.0                               <- pattern width
       0.0 0.0                            <- coordinate origin
       1.0000 0.0000 0.0000 -100.0000     <- pattern transform,
       0.0000 1.0000 0.0000   50.0000        relative to the
       0.0000 0.0000 1.0000    0.0000        global origin
       ...
  80. Camera Transform Calculation
     • Include <AR/arMulti.h>
     • Link to libARMulti.lib
     • In mainLoop():
       - Detect markers as usual:
         arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);
       - Use the multi-marker function:
         if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
             argSwapBuffers();
             return;
         }
  81. Paddle-based Interaction
     • Tracking a single marker relative to a multi-marker set: the paddle contains a single marker
  82. Paddle Interaction Code
     • Sample file: PaddleDemo.c
     • Get the paddle marker location and draw the paddle before drawing the background model:

       paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

       /* draw the paddle */
       if( paddleInfo->active ) {
           draw_paddle( paddleInfo );
       }

     • draw_paddle uses a stencil buffer to increase realism
  83. Paddle Interaction Code II
     • Sample file: paddleDrawDemo.c
       - Finds the paddle position relative to the global coordinate frame:
         setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4]);
     • Sample file: paddleTouch.c
       - Finds the paddle position:
         findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);
       - Checks for collisions:
         checkCollision(&curPaddlePos, myTarget[i].pos, 20.0);
  84. General Tangible AR Library
     • command_sub.c, command_sub.h
     • Contains functions for recognizing a range of different paddle motions:
       int check_shake( );
       int check_punch( );
       int check_incline( );
       int check_pickup( );
       int check_push( );
     • E.g., to check the angle between the paddle and the base (a usage sketch follows below):
       check_incline(paddle->trans, base->trans, &ang);
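A hedged usage sketch, assuming check_incline() returns nonzero when the paddle is inclined relative to the base and writes the angle, in radians, into ang; the response function is hypothetical:

       double ang;

       /* If the paddle is tilted more than ~45 degrees relative to the base,
          trigger an application response (dropAttachedObject is a
          hypothetical application callback). */
       if( check_incline(paddle->trans, base->trans, &ang) ) {
           if( ang > 0.785 ) {
               dropAttachedObject();
           }
       }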
