426 lecture6b: AR Interaction

COSC 426 Lecture 6b on AR Interaction.
Presentation Transcript

    • COSC 426: Augmented Reality. Lecture 6: AR Interaction. Mark Billinghurst (mark.billinghurst@hitlabnz.org), August 14th 2012
    • Building Compelling AR Experiences: a layered stack, from bottom to top: components (Tracking, Display), tools (Authoring), applications (Interaction), experiences. Sony CSL © 2004
    • AR Interaction
      - Designing an AR system = interface design
        - Using different input and output technologies
      - Objective is a high-quality user experience
        - Ease of use and learning
        - Performance and satisfaction
    • User Interface and Tool
      - Human ↔ User Interface/Tool ↔ Machine/Object
      - The user interface (or tool) is the human-machine interface: it mediates between the human and the machine or object
      © Andreas Dünser
    • User Interface: Characteristics
      - Input: mono- or multimodal
      - Output: mono- or multisensorial (e.g. sensation of movement)
      - Technique/Metaphor/Paradigm (e.g. input metaphor: "push" to accelerate, "turn" to rotate)
      © Andreas Dünser
    • Human Computer Interface
      - Human ↔ User Interface ↔ Computer System
      - Human Computer Interface = Hardware + Software
      - Computers are everywhere now: HCI spans electronic devices, home automation, transport vehicles, etc.
      © Andreas Dünser
    • More terminology
      - Interaction Device = input/output of the user interface
      - Interaction Style = category of similar interaction techniques
      - Interaction Paradigm
      - Modality (human sense)
      - Usability
    • Back to AR: you can see spatially registered AR... how can you interact with it?
    • Interaction Tasks
      - 2D (from [Foley]): Selection, Text Entry, Quantify, Position
      - 3D (from [Bowman]): Navigation (Travel/Wayfinding), Selection, Manipulation, System Control/Data Input
      - AR: 2D + 3D tasks and... more specific tasks?
      [Foley] Foley, J. D., V. Wallace & P. Chan. "The Human Factors of Computer Graphics Interaction Techniques." IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
      [Bowman] D. Bowman, E. Kruijff, J. LaViola, I. Poupyrev. "3D User Interfaces: Theory and Practice." Addison-Wesley, 2005.
    • AR Interfaces as Data Browsers
      - 2D/3D virtual objects are registered in 3D ("VR in the real world")
      - Interaction: 2D/3D virtual viewpoint control
      - Applications: visualization, training
    • AR Information Browsers
      - Information is registered to real-world context (hand-held AR displays)
      - Interaction: manipulation of a window into the information space
      - Applications: context-aware information displays
      Rekimoto, et al. 1997
    • Architecture
    • Current AR Information Browsers
      - Mobile AR: GPS + compass
      - Many applications: Layar, Wikitude, Acrossair, PressLite, Yelp, AR Car Finder, ...
    • Junaio
      - AR browser from Metaio: http://www.junaio.com/
      - AR browsing: GPS + compass; 2D/3D object placement; photos/live video; community viewing
    • Web Interface
    • Adding Models in Web Interface
    • Advantages and Disadvantages
      - Important class of AR interfaces: wearable computers; AR simulation, training
      - Limited interactivity: modification of virtual content is difficult
      Rekimoto, et al. 1997
    • 3D AR Interfaces
      - Virtual objects displayed in 3D physical space and manipulated: HMDs and 6DOF head tracking; 6DOF hand trackers for input
      - Interaction: viewpoint control; traditional 3D user interface interaction (manipulation, selection, etc.)
      Kiyokawa, et al. 2000
    • AR 3D Interaction
    • AR Graffiti (www.nextwall.net)
    • Advantages and Disadvantages
      - Important class of AR interfaces: entertainment, design, training
      - Advantages: users can interact with 3D virtual objects anywhere in space; natural, familiar interaction
      - Disadvantages: usually no tactile feedback; users have to use different devices for virtual and physical objects
      Oshima, et al. 2000
    • Augmented Surfaces and Tangible Interfaces
      - Basic principles: virtual objects are projected on a surface; physical objects are used as controls for virtual objects; support for collaboration
    • Augmented Surfaces
      - Rekimoto, et al. 1998
      - Front projection; marker-based tracking; multiple projection surfaces
    • Tangible User Interfaces (Ishii 97)
      - Create digital shadows for physical objects
      - Foreground: graspable UI
      - Background: ambient interfaces
    • Tangible Interfaces - Ambient
      - Dangling String (Jeremijenko 1995): ambient ethernet monitor; relies on peripheral cues
      - Ambient Fixtures (Dahley, Wisneski, Ishii 1998): use natural material qualities for information display
    • Tangible Interface: ARgroove
      - Collaborative instrument
      - Exploring physically based interaction: map physical actions (translation, rotation; tilt, shake) to MIDI output
    • ARgroove in Use
    • Visual Feedback
      - Continuous visual feedback is key
      - A single virtual image provides: rotation, tilt, height
    • I/O Brush (Ryokai, Marti, Ishii)
    • Other Examples
      - Triangles (Gorbet 1998): triangle-based storytelling
      - ActiveCube (Kitamura 2000-): cubes with sensors
    • Lessons from Tangible Interfaces
      - Physical objects make us smart (Norman's "Things That Make Us Smart"): they encode affordances and constraints
      - Objects aid collaboration: establish shared meaning
      - Objects increase understanding: serve as cognitive artifacts
    • TUI Limitations
      - Difficult to change object properties: can't tell the state of the digital data
      - Limited display capabilities: projection screen = 2D; dependent on a physical display surface
      - Separation between object and display (e.g. ARgroove)
    • Advantages and Disadvantages
      - Advantages: natural, users' hands interact with both virtual and real objects; no need for special-purpose input devices
      - Disadvantages: interaction is limited to the 2D surface; full 3D interaction and manipulation is difficult
    • Orthogonal Nature of AR Interfaces
    • Back to the Real World
      - AR overcomes limitations of TUIs: enhanced display possibilities; merged task/display space; public and private views
      - TUI + AR = Tangible AR: apply TUI methods to AR interface design
    • Space-multiplexed vs. time-multiplexed interfaces
      - Space-multiplexed: many devices, each with one function (quicker to use and more intuitive, but adds clutter; e.g. a real toolbox)
      - Time-multiplexed: one device with many functions (space-efficient; e.g. the mouse)
    • Tangible AR: Tiles (Space-Multiplexed)
      - Tile semantics: data tiles; operation tiles
      - Operations on tiles: proximity; spatial arrangements; space-multiplexed
    • Space-multiplexed Interface: data authoring in Tiles
    • Proximity-based Interaction
    • Object-Based Interaction: MagicCup
      - Intuitive virtual object manipulation on a table-top workspace
      - Time-multiplexed
      - Multiple markers: robust tracking
      - Tangible user interface: intuitive manipulation
      - Stereo display: good presence
    • Our system: main table, menu table, cup interface
    • Tangible AR: Time-multiplexed Interaction
      - Use natural physical object manipulations to control virtual objects
      - VOMAR demo: catalog book (turn over the page); paddle operations (push, shake, incline, hit, scoop)
    • VOMAR Interface
    • Advantages and Disadvantages
      - Advantages: natural interaction with virtual and physical tools (no need for special-purpose input devices); spatial interaction with virtual objects (3D manipulation of virtual objects anywhere in physical space)
      - Disadvantages: requires a head-mounted display
    • Wrap-up
      - Browsing interfaces: simple (conceptually!), unobtrusive
      - 3D AR interfaces: expressive, creative, require attention
      - Tangible interfaces: embedded into conventional environments
      - Tangible AR: combines TUI input + AR display
    • AR User Interface: Categorization
      - Traditional desktop: keyboard, mouse, joystick (with or without a 2D/3D GUI)
      - Specialized/VR device: 3D VR devices, specially designed devices
    • AR User Interface: Categorization
      - Tangible interface: using physical objects
      - Hand/touch interface: using pose and gesture of hand and fingers
      - Body interface: using movement of the body
    • AR User Interface: Categorization
      - Speech interface: voice, speech control
      - Multimodal interface: gesture + speech
      - Haptic interface: haptic feedback
      - Eye tracking, physiological, brain-computer interfaces...
    • OSGART: From Registration to Interaction
    • Keyboard and Mouse Interaction
      - Traditional input techniques
      - OSG provides a framework for handling keyboard and mouse input events (osgGA):
        1. Subclass osgGA::GUIEventHandler
        2. Handle events: mouse up / down / move / drag / scroll-wheel; key up / down
        3. Add an instance of the new handler to the viewer
    • Keyboard and Mouse Interaction
      - Create your own event handler class:

        class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
        public:
            KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

            virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                                osg::Object* obj, osg::NodeVisitor* nv) {
                switch (ea.getEventType()) {
                    // Possible events we can handle
                    case osgGA::GUIEventAdapter::PUSH: break;
                    case osgGA::GUIEventAdapter::RELEASE: break;
                    case osgGA::GUIEventAdapter::MOVE: break;
                    case osgGA::GUIEventAdapter::DRAG: break;
                    case osgGA::GUIEventAdapter::SCROLL: break;
                    case osgGA::GUIEventAdapter::KEYUP: break;
                    case osgGA::GUIEventAdapter::KEYDOWN: break;
                }
                return false;
            }
        };

      - Add it to the viewer to receive events:

        viewer.addEventHandler(new KeyboardMouseEventHandler());
    • Keyboard Interaction
      - Handle W,A,S,D keys to move an object:

        case osgGA::GUIEventAdapter::KEYDOWN: {
            switch (ea.getKey()) {
                case 'w': // Move forward 5mm
                    localTransform->preMult(osg::Matrix::translate(0, -5, 0));
                    return true;
                case 's': // Move back 5mm
                    localTransform->preMult(osg::Matrix::translate(0, 5, 0));
                    return true;
                case 'a': // Rotate 10 degrees left
                    localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
                    return true;
                case 'd': // Rotate 10 degrees right
                    localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
                    return true;
                case ' ': // Reset the transformation
                    localTransform->setMatrix(osg::Matrix::identity());
                    return true;
            }
        } break;

      - Scene setup:

        localTransform = new osg::MatrixTransform();
        localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
        arTransform->addChild(localTransform.get());
    • Keyboard Interaction Demo
    • Mouse Interaction
      - The mouse is a pointing device: use it to select objects in an AR scene
      - OSG provides methods for ray-casting and intersection testing
      - These return an osg::NodePath (the path from the hit node all the way back to the root)
      (Figure: a ray cast from the projection plane (screen) into the scene)
    • Mouse Interaction
      - Compute the list of nodes under the clicked position
      - Invoke an action on nodes that are hit, e.g. select, delete:

        case osgGA::GUIEventAdapter::PUSH: {
            osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
            osgUtil::LineSegmentIntersector::Intersections intersections;

            // Clear previous selections
            for (unsigned int i = 0; i < targets.size(); i++) {
                targets[i]->setSelected(false);
            }

            // Find new selection based on click position
            if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
                for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
                     iter != intersections.end(); iter++) {
                    if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                        std::cout << "HIT!" << std::endl;
                        target->setSelected(true);
                        return true;
                    }
                }
            }
        } break;
    • Mouse Interaction Demo
    • Proximity Techniques
      - Interaction based on: the distance between a marker and the camera; the distance between multiple markers
      - A sketch of the first case follows below
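      A minimal sketch of the camera-to-marker range, assuming a marker object whose getTransform() returns its camera-space pose (as mMarkerA->getTransform() does in the multi-marker callback later in this deck):

        // The camera sits at the origin of camera space, so the marker's
        // translation component is also the camera-to-marker offset (in mm).
        osg::Vec3 markerPos = mMarker->getTransform().getTrans();
        float range = markerPos.length();
        if (range < 300.0f) {
            // e.g. the user has leaned in within 30cm: reveal detail content
        }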
    • Single Marker Techniques: Proximity
      - Use the distance from camera to marker as an input parameter (e.g. lean in close to examine)
      - Can use the osg::LOD class to show different content at different depth ranges
      Image: OpenSG Consortium
    • Single Marker Techniques: Proximity

        // Load some models
        osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
        osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
        osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

        // Use a Level-Of-Detail node to show each model at a different distance range.
        osg::ref_ptr<osg::LOD> lod = new osg::LOD();
        lod->addChild(farNode.get(), 500.0f, 10000.0f);  // Show the "far" node from 50cm to 10m away
        lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
        lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away

        arTransform->addChild(lod.get());

      - Define depth ranges for each node; add as many as you want; ranges can overlap
    • Single Marker Proximity Demo
    • Multiple Marker Concepts
      - Interaction based on the relationship between markers (e.g. when the distance between two markers drops below a threshold, invoke an action)
      - Tangible user interface
      - Applications: memory card games, file operations
    • Multiple Marker Proximity (scene graph, distance > threshold)
      Virtual Camera → Transform A → Switch A → Models A1, A2
      Virtual Camera → Transform B → Switch B → Models B1, B2
      Each switch shows its first model (A1, B1)
    • Multiple Marker Proximity (scene graph, distance <= threshold)
      Same structure; each switch shows its second model (A2, B2)
    • Multiple Marker Proximity  Use a node callback to test for proximity and update the relevant nodesvirtual void operator()(osg::Node* node, osg::NodeVisitor* nv) { if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) { if (mMarkerA->valid() && mMarkerB->valid()) { osg::Vec3 posA = mMarkerA->getTransform().getTrans(); osg::Vec3 posB = mMarkerB->getTransform().getTrans(); osg::Vec3 offset = posA - posB; float distance = offset.length(); if (distance <= mThreshold) { if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1); if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1); } else { if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0); if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0); } } } traverse(node,nv);}
    • Multiple Marker Proximity
    • Paddle Interaction
      - Use one marker as a tool for selecting and manipulating objects (tangible user interface)
      - Another marker provides a frame of reference
      - A grid of markers can alleviate problems with occlusion
      MagicCup (Kato et al), VOMAR (Kato et al)
    • Paddle Interaction
      - Often useful to adopt a local coordinate system:
        - Allows the camera to move without disrupting Tlocal
        - Places the paddle in the same coordinate system as the content on the grid
        - Simplifies interaction
      - osgART computes Tlocal using the osgART::LocalTransformationCallback (see the sketch below)
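      For intuition, Tlocal can also be composed by hand from the two camera-space marker poses. A sketch under OSG's row-vector convention (v' = v * M); paddleMarker and baseMarker are assumed to expose getTransform() like the markers in the proximity callback above:

        // "paddle -> camera" composed with "camera -> base" gives "paddle -> base":
        osg::Matrix Tpaddle = paddleMarker->getTransform(); // paddle -> camera
        osg::Matrix Tbase   = baseMarker->getTransform();   // base (grid) -> camera
        osg::Matrix Tlocal  = Tpaddle * osg::Matrix::inverse(Tbase);

        // Tlocal is unaffected by camera motion (while both markers track),
        // so paddle gestures can be measured directly in grid coordinates:
        osg::Vec3 paddlePosOnGrid = Tlocal.getTrans();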
    • Tilt and Shake Interaction
      - Detect types of paddle movement:
        - Tilt: gradual change in orientation
        - Shake: short, sudden changes in translation
      - A sketch of one possible shake detector follows below
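      The slides don't show detector code; a minimal sketch of a shake test, assuming the paddle position is sampled once per frame in grid coordinates (the class name, window size, and thresholds are all made up for illustration):

        #include <osg/Vec3>
        #include <cstddef>
        #include <deque>

        // Hypothetical shake detector: flags short, sudden changes in translation
        // by counting direction reversals of the frame-to-frame motion.
        class ShakeDetector {
        public:
            // Feed one paddle position (grid coordinates, mm) per frame.
            bool update(const osg::Vec3& pos) {
                mHistory.push_back(pos);
                if (mHistory.size() > kWindow) mHistory.pop_front();
                if (mHistory.size() < kWindow) return false;

                int reversals = 0;
                for (std::size_t i = 2; i < mHistory.size(); ++i) {
                    osg::Vec3 v1 = mHistory[i-1] - mHistory[i-2];
                    osg::Vec3 v2 = mHistory[i]   - mHistory[i-1];
                    if (v1.length() > kMinStep && v2.length() > kMinStep &&
                        v1 * v2 < 0.0f)   // osg::Vec3 operator* is the dot product
                        ++reversals;
                }
                if (reversals >= 3) { mHistory.clear(); return true; }
                return false;
            }
        private:
            static const std::size_t kWindow = 15;   // ~0.5s at 30fps
            static constexpr float kMinStep = 5.0f;  // ignore jitter below 5mm/frame
            std::deque<osg::Vec3> mHistory;
        };

      A tilt detector would instead compare orientations over a longer window, firing only on gradual, sustained change.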
    • Building Tangible AR Interfaces with ARToolKit
    • Required Code
      - Calculating camera position: range to marker
      - Loading multiple patterns/models
      - Interaction between objects: proximity; relative position/orientation
      - Occlusion: stencil buffering; multi-marker tracking
    • Tangible AR Coordinate Frames
    • Local vs. Global Interactions
      - Local: actions determined from a single camera-to-marker transform (shaking, appearance, relative position, range)
      - Global: actions determined from two relationships (marker-to-camera and world-to-camera coordinates); the marker transform is determined in world coordinates (object tilt, absolute position, absolute rotation, hitting); a sketch of the global composition follows below
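      As a concrete illustration of the global case, the marker-in-world transform can be composed from the two camera relationships using ARToolKit's matrix utilities. A sketch (assumption: base_trans is the world-to-camera transform from arMultiGetTransMat, paddle_trans the paddle-to-camera transform from arGetTransMat, both double[3][4]):

        /* T(world <- paddle) = inverse(T(camera <- world)) * T(camera <- paddle) */
        double base_inv[3][4];
        double paddle_world[3][4];

        arUtilMatInv(base_trans, base_inv);                 /* camera -> world */
        arUtilMatMul(base_inv, paddle_trans, paddle_world); /* paddle -> world */

        /* Absolute position of the paddle on the grid: */
        double x = paddle_world[0][3];
        double y = paddle_world[1][3];
        double z = paddle_world[2][3];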
    • Range-based Interaction
      - Sample file: RangeTest.c

        /* get the camera transformation */
        arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

        /* find the range */
        Xpos = marker_trans[0][3];
        Ypos = marker_trans[1][3];
        Zpos = marker_trans[2][3];
        range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
    • Loading Multiple Patterns
      - Sample file: LoadMulti.c (uses object.c to load)
      - Object structure:

        typedef struct {
            char   name[256];
            int    id;
            int    visible;
            double marker_coord[4][2];
            double trans[3][4];
            double marker_width;
            double marker_center[2];
        } ObjectData_T;
    • Finding Multiple Transforms
      - Create the object list:

        ObjectData_T *object;

      - Read in objects, in init():

        read_ObjData( char *name, int *objectnum );

      - Find transforms, in mainLoop():

        for( i = 0; i < objectnum; i++ ) {
            /* check patterns */
            /* find transforms for each marker */
        }
    • Drawing Multiple Objects
      - Send the object list to the draw function:

        draw( object, objectnum );

      - Draw each object individually:

        for( i = 0; i < objectnum; i++ ) {
            if( object[i].visible == 0 ) continue;
            argConvGlpara(object[i].trans, gl_para);
            draw_object( object[i].id, gl_para);
        }
    • Proximity Based Interaction
      - Sample file: CollideTest.c
      - Detect the distance between markers:

        checkCollisions(object[0], object[1], DIST);

      - If the distance is below the collide distance, change the model / perform the interaction (see the sketch below)
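      The slide doesn't reproduce the body of checkCollisions; one plausible version, following the ObjectData_T structure above (this body is a guess for illustration, not ARToolKit's actual source):

        #include <math.h>

        /* Assumption: obj.trans[3][4] is the marker-to-camera transform with the
           position in the last column, as used in RangeTest.c above. */
        static int checkCollisions(ObjectData_T obj1, ObjectData_T obj2, double collideDist)
        {
            double dx, dy, dz, dist;

            /* Both markers must be visible for the distance to mean anything. */
            if (!obj1.visible || !obj2.visible) return 0;

            dx = obj1.trans[0][3] - obj2.trans[0][3];
            dy = obj1.trans[1][3] - obj2.trans[1][3];
            dz = obj1.trans[2][3] - obj2.trans[2][3];
            dist = sqrt(dx*dx + dy*dy + dz*dz);

            return dist < collideDist;   /* 1 = proximity event */
        }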
    • Multi-marker Tracking
      - Sample file: multiTest.c
      - Multiple markers establish a single coordinate frame: reading in a configuration file; tracking from sets of markers; careful camera calibration
    • MultiMarker Configuration File
      - Sample file: Data/multi/marker.dat
      - Contains a list of all the patterns and their exact positions:

        #the number of patterns to be recognized
        6

        #marker 1
        Data/multi/patt.a
        40.0
        0.0 0.0
        1.0000 0.0000 0.0000 -100.0000
        0.0000 1.0000 0.0000   50.0000
        0.0000 0.0000 1.0000    0.0000
        …

      - Per marker: pattern file; pattern width; coordinate origin; 3x4 pattern transform relative to the global origin
    • Camera Transform Calculation
      - Include <AR/arMulti.h>
      - Link to libARMulti.lib
      - In mainLoop():
        - Detect markers as usual:

          arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);

        - Use the MultiMarker function:

          if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
              argSwapBuffers();
              return;
          }
    • Paddle-based Interaction: tracking a single marker relative to the multi-marker set (the paddle contains a single marker)
    • Paddle Interaction Code
      - Sample file: PaddleDemo.c
      - Get the paddle marker location and draw the paddle before drawing the background model:

        paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

        /* draw the paddle */
        if( paddleInfo->active ){
            draw_paddle( paddleInfo );
        }

      - draw_paddle uses a stencil buffer to increase realism
    • Paddle Interaction Code II
      - Sample file: paddleDrawDemo.c
      - Finds the paddle position relative to the global coordinate frame:

        setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4]);

      - Sample file: paddleTouch.c
      - Finds the paddle position:

        findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);

      - Checks for collisions:

        checkCollision(&curPaddlePos, myTarget[i].pos, 20.0);
    • General Tangible AR Library
      - command_sub.c, command_sub.h
      - Contains functions for recognizing a range of different paddle motions:

        int check_shake( );
        int check_punch( );
        int check_incline( );
        int check_pickup( );
        int check_push( );

      - E.g. to check the angle between the paddle and the base:

        check_incline(paddle->trans, base->trans, &ang);
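      The bodies of these functions aren't shown; a minimal sketch of what an incline check might do, assuming both trans arguments are 3x4 marker-to-camera transforms whose first three columns hold the rotation (the name and threshold are illustrative, not the library's source):

        #include <math.h>

        /* Angle between the paddle's and base's local z-axes: column 2 of the
           rotation part of each trans[3][4] is that marker's rotated z-axis. */
        static int check_incline_sketch(double paddle_trans[3][4],
                                        double base_trans[3][4], double *ang)
        {
            double dot = 0.0;
            int i;

            for (i = 0; i < 3; i++)
                dot += paddle_trans[i][2] * base_trans[i][2];

            /* Rotation columns are unit length, so acos(dot) is the angle
               between the two normals; clamp against rounding error. */
            if (dot >  1.0) dot =  1.0;
            if (dot < -1.0) dot = -1.0;
            *ang = acos(dot) * 180.0 / M_PI;   /* degrees (M_PI from math.h) */

            return *ang > 20.0;   /* e.g. report "inclined" beyond 20 degrees */
        }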