COSC 426 Lecture 7: Designing AR Interfaces
Lecture 7 in the 2012 COSC 426 Augmented Reality course.
Transcript

  • 1. COSC 426: Augmented Reality. Lecture 7: Designing AR Interfaces. Mark Billinghurst (mark.billinghurst@hitlabnz.org), Sept 5th 2012.
  • 2. AR Interfaces
    - Browsing interfaces: simple (conceptually!), unobtrusive
    - 3D AR interfaces: expressive, creative, require attention
    - Tangible interfaces: embedded into conventional environments
    - Tangible AR: combines TUI input + AR display
  • 3. AR Interfaces as Data Browsers
    - 2D/3D virtual objects are registered in 3D ("VR in the real world")
    - Interaction: 2D/3D virtual viewpoint control
    - Applications: visualization, training
  • 4. 3D AR Interfaces
    - Virtual objects displayed in 3D physical space and manipulated: HMDs and 6DOF head tracking; 6DOF hand trackers for input
    - Interaction: viewpoint control; traditional 3D user interface interaction (manipulation, selection, etc.)
    - (Image: Kiyokawa et al. 2000)
  • 5. Augmented Surfaces and Tangible Interfaces
    - Basic principles: virtual objects are projected on a surface; physical objects are used as controls for virtual objects; support for collaboration
  • 6. Tangible Interfaces - Ambient
    - Dangling String (Jeremijenko 1995): ambient ethernet monitor; relies on peripheral cues
    - Ambient Fixtures (Dahley, Wisneski, Ishii 1998): use natural material qualities for information display
  • 7. Back to the Real World
    - AR overcomes limitations of TUIs: enhance display possibilities; merge task/display space; provide public and private views
    - TUI + AR = Tangible AR: apply TUI methods to AR interface design
  • 8.
    - Space-multiplexed: many devices, each with one function. Quicker to use, more intuitive, but adds clutter (e.g. a real toolbox)
    - Time-multiplexed: one device with many functions. Space efficient (e.g. the mouse)
  • 9. Tangible AR: Tiles (Space-Multiplexed)
    - Tiles semantics: data tiles; operation tiles
    - Operations on tiles: proximity; spatial arrangements; space-multiplexed
  • 10. Tangible AR: Time-Multiplexed Interaction
    - Use natural physical object manipulations to control virtual objects
    - VOMAR demo: catalog book (turn over the page); paddle operations (push, shake, incline, hit, scoop)
  • 11. Building Compelling AR Experiences
    - [Diagram: a layered stack with experiences on top of applications, interaction tools, authoring components, and tracking/display. Image: Sony CSL © 2004]
  • 12. Interface Design Path
    - 1/ Prototype demonstration
    - 2/ Adoption of interaction techniques from other interface metaphors (Augmented Reality is here)
    - 3/ Development of new interface metaphors appropriate to the medium (Virtual Reality)
    - 4/ Development of formal theoretical models for predicting and modeling user actions (desktop WIMP)
  • 13. Interface metaphors
    - Designed to be similar to a physical entity but also have their own properties (e.g. desktop metaphor, search engine)
    - Exploit the user's familiar knowledge, helping them to understand "the unfamiliar"
    - Conjure up the essence of the unfamiliar activity, enabling users to leverage this to understand more aspects of the unfamiliar functionality
    - People find it easier to learn and talk about what they are doing at the computer interface in terms familiar to them
  • 14. Example: The spreadsheet
    - Analogous to the ledger sheet
    - Interactive and computational
    - Easy to understand
    - Greatly extended what accountants and others could do
    - (www.bricklin.com/history/refcards.htm)
  • 15. Why was it so good?
    - It was simple, clear, and obvious to users how to use the application and what it could do
    - "it is just a tool to allow others to work out their ideas and reduce the tedium of repeating the same calculations."
    - Capitalized on users' familiarity with ledger sheets
    - Got the computer to perform a range of different calculations in response to user input
  • 16. Another classic
    - The 8010 Star office system targeted workers not interested in computing per se
    - Several person-years were spent at the beginning working out the conceptual model
    - Simplified the electronic world, making it seem more familiar, less alien, and easier to learn (Johnson et al. 1989)
  • 17. The Star interface
  • 18. Benefits of interface metaphors
    - Make learning new systems easier
    - Help users understand the underlying conceptual model
    - Can be innovative, making the realm of computers and their applications more accessible to a greater diversity of users
  • 19. Problems with interface metaphors (Nielsen, 1990)
    - Break conventional and cultural rules (e.g. recycle bin placed on the desktop)
    - Can constrain designers in the way they conceptualize a problem
    - Can conflict with design principles
    - Force users to understand the system only in terms of the metaphor
    - Designers can inadvertently use bad existing designs and transfer the bad parts over
    - Limit designers' imagination with new conceptual models
  • 20. Microsoft Bob
  • 21.   PSDoom – killing processes
  • 22. AR Design Principles
    - Interface components: physical components; display elements (visual/audio); interaction metaphors
    - [Diagram: physical elements (input) and display elements (output) connected by the interaction metaphor]
  • 23. Back to the Real World
    - AR overcomes limitations of TUIs: enhance display possibilities; merge task/display space; provide public and private views
    - TUI + AR = Tangible AR: apply TUI methods to AR interface design
  • 24. AR Design Space
    - [Diagram: a continuum from Reality (physical design) through Augmented Reality to Virtual Reality (virtual design)]
  • 25. Tangible AR Design Principles
    - Tangible AR interfaces use TUI principles:
      - Physical controllers for moving virtual content
      - Support for spatial 3D interaction techniques
      - Time- and space-multiplexed interaction
      - Support for multi-handed interaction
      - Match object affordances to task requirements
      - Support parallel activity with multiple objects
      - Allow collaboration between multiple users
  • 26.
    - Space-multiplexed: many devices, each with one function. Quicker to use, more intuitive, but adds clutter (e.g. the Tiles interface, a toolbox)
    - Time-multiplexed: one device with many functions. Space efficient (e.g. the VOMAR interface, the mouse)
  • 27. Design of Objects
    - Objects: purposely built (affordances); "found" (repurposed); existing (already in use in the marketplace)
    - Make affordances obvious (Norman): make object affordances visible; give feedback; provide constraints; use natural mapping; use a good cognitive model
  • 28. Object Design
  • 29. Affordances: to give a clue
    - An attribute of an object that allows people to know how to use it (e.g. a mouse button invites pushing, a door handle affords pulling)
    - Norman (1988) used the term to discuss the design of everyday objects
    - Since much popularised in interaction design to discuss how to design interface objects (e.g. scrollbars afford moving up and down, icons afford clicking on)
  • 30. "...the term affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. [...] Affordances provide strong clues to the operations of things. Plates are for pushing. Knobs are for turning. Slots are for inserting things into. Balls are for throwing or bouncing. When affordances are taken advantage of, the user knows what to do just by looking: no picture, label, or instruction needed." (Norman, The Psychology of Everyday Things 1988, p.9)
  • 31. Physical Affordances
    - How do the following physical objects afford? Are they obvious?
  • 32. 'Affordance' and Interface Design?
    - Interfaces are virtual and do not have affordances like physical objects
    - Norman argues it does not make sense to talk about interfaces in terms of 'real' affordances
    - Instead, interfaces are better conceptualized as 'perceived' affordances: learned conventions of arbitrary mappings between action and effect at the interface; some mappings are better than others
  • 33. Virtual Affordances
    - How do the following screen objects afford? What if you were a novice user? Would you know what to do with them?
  • 34. AR is a mixture of physical and virtual affordances
    - Physical: tangible controllers and objects
    - Virtual: virtual graphics and audio
  • 35. Case Study 1: 3D AR Lens
    - Goal: develop a lens-based AR interface
    - MagicLenses: developed at Xerox PARC in 1993; view a region of the workspace differently from the rest; overlap MagicLenses to create composite effects
  • 36. 3D MagicLenses
    - MagicLenses extended to 3D (Viega et al. 96): volumetric and flat lenses
  • 37. AR Lens Design Principles
    - Physical components: lens handle (virtual lens attached to a real object)
    - Display elements: lens view (reveals layers in the dataset)
    - Interaction metaphor: physically holding the lens
  • 38. 3D AR Lenses: Model Viewer
    - Displays models made up of multiple parts
    - Each part can be shown or hidden through the lens
    - Allows the user to peer inside the model
    - Maintains focus + context
  • 39. AR Lens Demo
  • 40. AR Lens Implementation
    - The stencil buffer masks the lens region, so the scene outside the lens and the layer inside the lens are rendered separately (a virtual magnifying glass)
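    The slide's figure is not reproduced in the transcript, so here is a minimal OpenGL sketch of the stencil-masking idea it describes. The helpers drawLensShape(), drawOutsideScene() and drawInsideScene() are hypothetical application-supplied functions; this illustrates the technique, not the lecture's actual implementation.

      /* Pass 1: write 1s into the stencil buffer where the lens is,
         leaving the color buffer untouched. */
      glClear(GL_STENCIL_BUFFER_BIT);
      glEnable(GL_STENCIL_TEST);
      glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
      glStencilFunc(GL_ALWAYS, 1, 1);
      glStencilOp(GL_REPLACE, GL_REPLACE, GL_REPLACE);
      drawLensShape();            /* hypothetical: rasterizes the lens region */
      glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

      /* Pass 2: draw the normal scene only outside the lens (stencil == 0). */
      glStencilFunc(GL_EQUAL, 0, 1);
      glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
      drawOutsideScene();         /* hypothetical */

      /* Pass 3: draw the revealed layer only inside the lens (stencil == 1). */
      glStencilFunc(GL_EQUAL, 1, 1);
      drawInsideScene();          /* hypothetical */

      glDisable(GL_STENCIL_TEST);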
  • 41. AR FlexiLens
    - Real handles/controllers with a flexible AR lens
  • 42. Techniques based on AR Lenses
    - Object selection: select objects by targeting them with the lens
    - Information filtering: show different representations through the lens; hide certain content to reduce clutter, look inside things
    - Move between AR and VR: transition along the reality-virtuality continuum; change our viewpoint to suit our needs
  • 43. Case Study 2: LevelHead
    - A block-based game
  • 44. Case Study 2: LevelHead
    - Physical components: real blocks
    - Display elements: virtual person and rooms
    - Interaction metaphor: blocks are rooms
  • 45. Case Study 3: AR Chemistry (Fjeld 2002)
    - Tangible AR chemistry education
  • 46. Goal: an AR application to test molecular structure in chemistry
    - Physical components: real book, rotation cube, scoop, tracking markers
    - Display elements: AR atoms and molecules
    - Interaction metaphor: build your own molecule
  • 47. AR Chemistry Input Devices
  • 48. Case Study 4: Transitional Interfaces
    - Goal: an AR interface supporting transitions from reality to virtual reality
    - Physical components: real book
    - Display elements: AR and VR content
    - Interaction metaphor: book pages hold virtual scenes
  • 49. Milgram's Continuum (1994)
    - [Diagram: Reality (tangible interfaces) → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Reality, with Mixed Reality (MR) spanning the middle]
    - Central hypothesis: the next generation of interfaces will support transitions along the reality-virtuality continuum
  • 50. Transitions
    - Interfaces of the future will need to support transitions along the RV continuum
    - Augmented Reality is preferred for: co-located collaboration
    - Immersive Virtual Reality is preferred for: experiencing a world immersively (egocentric); sharing views; remote collaboration
  • 51. The MagicBook
    - Design goals: allow the user to move smoothly between reality and virtual reality; support collaboration
  • 52. MagicBook Metaphor
  • 53. Features
    - Seamless transition between reality and virtuality: reliance on the real decreases as the virtual increases
    - Supports egocentric and exocentric views: the user can pick the appropriate view
    - The computer becomes invisible: consistent interface metaphors; virtual content seems real
    - Supports collaboration
  • 54. Collaboration
    - Collaboration on multiple levels: physical object; AR object; immersive virtual space
    - Egocentric + exocentric collaboration: multiple multi-scale users
    - Independent views: privacy, role division, scalability
  • 55. Technology
    - Reality: no technology
    - Augmented Reality: camera (tracking); switch (fly in)
    - Virtual Reality: compass (tracking); press pad (move); switch (fly out)
  • 56. Scientific Visualization
  • 57. Education
  • 58. Summary
    - When designing AR interfaces, think of: physical components (physical affordances); virtual components (virtual affordances); interface metaphors
  • 59. OSGART: From Registration to Interaction
  • 60. Keyboard and Mouse Interaction
    - Traditional input techniques
    - OSG provides a framework for handling keyboard and mouse input events (osgGA):
      1. Subclass osgGA::GUIEventHandler
      2. Handle events: mouse up / down / move / drag / scroll-wheel; key up / down
      3. Add an instance of the new handler to the viewer
  • 61. Keyboard and Mouse Interaction
    - Create your own event handler class:

      class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
      public:
          KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

          virtual bool handle(const osgGA::GUIEventAdapter& ea,
                              osgGA::GUIActionAdapter& aa,
                              osg::Object* obj, osg::NodeVisitor* nv) {
              switch (ea.getEventType()) {
                  // Possible events we can handle
                  case osgGA::GUIEventAdapter::PUSH: break;
                  case osgGA::GUIEventAdapter::RELEASE: break;
                  case osgGA::GUIEventAdapter::MOVE: break;
                  case osgGA::GUIEventAdapter::DRAG: break;
                  case osgGA::GUIEventAdapter::SCROLL: break;
                  case osgGA::GUIEventAdapter::KEYUP: break;
                  case osgGA::GUIEventAdapter::KEYDOWN: break;
              }
              return false;
          }
      };

    - Add it to the viewer to receive events:

      viewer.addEventHandler(new KeyboardMouseEventHandler());
  • 62. Keyboard Interaction
    - Handle the W, A, S, D keys to move an object (the stripped key literals are reconstructed as character constants; the reset key reads as a space):

      case osgGA::GUIEventAdapter::KEYDOWN: {
          switch (ea.getKey()) {
              case 'w': // Move forward 5mm
                  localTransform->preMult(osg::Matrix::translate(0, -5, 0));
                  return true;
              case 's': // Move back 5mm
                  localTransform->preMult(osg::Matrix::translate(0, 5, 0));
                  return true;
              case 'a': // Rotate 10 degrees left
                  localTransform->preMult(osg::Matrix::rotate(
                      osg::DegreesToRadians(10.0f), osg::Z_AXIS));
                  return true;
              case 'd': // Rotate 10 degrees right
                  localTransform->preMult(osg::Matrix::rotate(
                      osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
                  return true;
              case ' ': // Reset the transformation
                  localTransform->setMatrix(osg::Matrix::identity());
                  return true;
          }
      } break;

    - localTransform is the transform node the model hangs off:

      localTransform = new osg::MatrixTransform();
      localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
      arTransform->addChild(localTransform.get());
  • 63. Keyboard Interaction Demo
  • 64. Mouse Interaction
    - The mouse is a pointing device: use it to select objects in an AR scene
    - OSG provides methods for ray-casting and intersection testing; they return an osg::NodePath (the path from the hit node all the way back to the root)
    - [Diagram: a ray cast from the projection plane (screen) into the scene]
  • 65. Mouse Interaction
    - Compute the list of nodes under the clicked position and invoke an action on nodes that are hit, e.g. select, delete:

      case osgGA::GUIEventAdapter::PUSH: {
          osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
          osgUtil::LineSegmentIntersector::Intersections intersections;

          // Clear previous selections
          for (unsigned int i = 0; i < targets.size(); i++) {
              targets[i]->setSelected(false);
          }

          // Find new selection based on click position
          if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
              for (osgUtil::LineSegmentIntersector::Intersections::iterator iter =
                       intersections.begin(); iter != intersections.end(); iter++) {
                  if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                      std::cout << "HIT!" << std::endl;
                      target->setSelected(true);
                      return true;
                  }
              }
          }
      } break;
  • 66. Mouse Interaction Demo
  • 67. Proximity Techniques
    - Interaction based on: the distance between a marker and the camera; the distance between multiple markers
  • 68. Single Marker Techniques: Proximity
    - Use the distance from the camera to the marker as an input parameter (e.g. lean in close to examine)
    - Use the osg::LOD class to show different content at different depth ranges (Image: OpenSG Consortium)
  • 69. Single Marker Techniques: Proximity

      // Load some models
      osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
      osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
      osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

      // Use a Level-Of-Detail node to show each model at a different distance range.
      osg::ref_ptr<osg::LOD> lod = new osg::LOD();
      lod->addChild(farNode.get(), 500.0f, 10000.0f); // Show the "far" node from 50cm to 10m away
      lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
      lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away

      arTransform->addChild(lod.get());

    - Define depth ranges for each node; add as many as you want; ranges can overlap
  • 70. Single Marker Proximity Demo
  • 71. Multiple Marker Concepts
    - Interaction based on the relationship between markers, e.g. when the distance between two markers decreases below a threshold, invoke an action (tangible user interface)
    - Applications: memory card games; file operations
  • 72. Multiple Marker Proximity
    - [Scene graph diagram: the virtual camera sees Transform A and Transform B; while distance > threshold, Switch A shows Model A1 and Switch B shows Model B1]
  • 73. Multiple Marker Proximity
    - [Scene graph diagram: when distance <= threshold, Switch A shows Model A2 and Switch B shows Model B2]
  • 74. Multiple Marker Proximity
    - Use a node callback to test for proximity and update the relevant nodes:

      virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
          if (mMarkerA != NULL && mMarkerB != NULL &&
              mSwitchA != NULL && mSwitchB != NULL) {
              if (mMarkerA->valid() && mMarkerB->valid()) {
                  osg::Vec3 posA = mMarkerA->getTransform().getTrans();
                  osg::Vec3 posB = mMarkerB->getTransform().getTrans();
                  osg::Vec3 offset = posA - posB;
                  float distance = offset.length();

                  if (distance <= mThreshold) {
                      if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                      if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
                  } else {
                      if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                      if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
                  }
              }
          }
          traverse(node, nv);
      }
  • 75. Multiple Marker Proximity
  • 76. Paddle Interaction
    - Use one marker as a tool for selecting and manipulating objects (tangible user interface)
    - Another marker provides a frame of reference: a grid of markers can alleviate problems with occlusion
    - (Images: MagicCup (Kato et al.), VOMAR (Kato et al.))
  • 77. Paddle Interaction
    - It is often useful to adopt a local coordinate system: it allows the camera to move without disrupting Tlocal; it places the paddle in the same coordinate system as the content on the grid; it simplifies interaction
    - osgART computes Tlocal using the osgART::LocalTransformationCallback (see the sketch below)
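    The callback's internals are not shown in the lecture; purely as an illustration of the math involved, the paddle pose can be re-expressed in the grid's frame by composing the two camera-relative marker transforms. The variable names here are assumptions.

      // Illustrative only: the equivalent of what a local-transform callback
      // must compute. baseToCamera and paddleToCamera are the camera-relative
      // poses of the grid and paddle markers (osg::Matrix, row-vector convention).
      osg::Matrix cameraToBase = osg::Matrix::inverse(baseToCamera);

      // Tlocal maps paddle coordinates into the grid's coordinate frame,
      // so it stays stable even while the camera moves.
      osg::Matrix Tlocal = paddleToCamera * cameraToBase;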
  • 78. Tilt and Shake Interaction
    - Detect types of paddle movement: tilt (gradual change in orientation); shake (short, sudden changes in translation). A rough detector sketch follows.
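    The slide names the two movement types without showing a detector; here is a minimal sketch of how they could be separated from successive marker samples. The function, thresholds, and units are all assumptions, not lecture code.

      #include <osg/Math>
      #include <osg/Matrix>
      #include <osg/Quat>
      #include <osg/Vec3d>

      // Illustrative classifier: compares consecutive paddle poses.
      // Returns 1 for shake, 2 for tilt, 0 otherwise. Thresholds are guesses.
      int classifyPaddleMotion(const osg::Matrix& current, const osg::Matrix& previous,
                               float dt /* seconds between samples */)
      {
          // Shake: short, sudden change in translation (high velocity).
          float speed = (current.getTrans() - previous.getTrans()).length() / dt;
          if (speed > 500.0f)                      // e.g. > 500 mm/s
              return 1;

          // Tilt: change in orientation; a real detector would accumulate
          // this over a window to capture a *gradual* change.
          double angle; osg::Vec3d axis;
          (current.getRotate() * previous.getRotate().inverse()).getRotate(angle, axis);
          if (angle > osg::DegreesToRadians(5.0))
              return 2;

          return 0;
      }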
  • 79. Building Tangible AR Interfaces with ARToolKit
  • 80. Required Code
    - Calculating camera position: range to marker
    - Loading multiple patterns/models
    - Interaction between objects: proximity; relative position/orientation
    - Occlusion: stencil buffering; multi-marker tracking
  • 81. Tangible AR Coordinate Frames
  • 82. Local vs. Global Interactions
    - Local: actions determined from a single camera-to-marker transform (shaking, appearance, relative position, range)
    - Global: actions determined from two relationships (marker-to-camera and world-to-camera coordinates); the marker transform is determined in world coordinates (object tilt, absolute position, absolute rotation, hitting); see the sketch below
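    As an illustration of the global case (with assumed variable names), ARToolKit's matrix utilities can re-express a marker's camera-relative transform in the base (world) frame by inverting the world-to-camera transform:

      /* Illustrative only: marker pose in world coordinates.
         marker_trans: marker pose in camera coords (from arGetTransMat).
         base_trans:   world (multi-marker base) pose in camera coords. */
      double camera_to_world[3][4];
      double marker_in_world[3][4];

      arUtilMatInv(base_trans, camera_to_world);                    /* invert world->camera */
      arUtilMatMul(camera_to_world, marker_trans, marker_in_world); /* world->marker */

      /* Absolute position of the marker in the world frame: */
      double x = marker_in_world[0][3];
      double y = marker_in_world[1][3];
      double z = marker_in_world[2][3];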
  • 83. Range-based Interaction
    - Sample file: RangeTest.c

      /* get the camera transformation */
      arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

      /* find the range */
      Xpos = marker_trans[0][3];
      Ypos = marker_trans[1][3];
      Zpos = marker_trans[2][3];
      range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
  • 84. Loading Multiple Patterns
    - Sample file: LoadMulti.c (uses object.c to load)
    - Object structure:

      typedef struct {
          char   name[256];
          int    id;
          int    visible;
          double marker_coord[4][2];
          double trans[3][4];
          double marker_width;
          double marker_center[2];
      } ObjectData_T;
  • 85. Finding Multiple Transforms
    - Create the object list:

      ObjectData_T *object;

    - Read in the objects, in init():

      read_ObjData( char *name, int *objectnum );

    - Find the transforms, in mainLoop():

      for( i = 0; i < objectnum; i++ ) {
          /* ..check patterns */
          /* ..find transforms for each marker */
      }
  • 86. Drawing Multiple Objects
    - Send the object list to the draw function:

      draw( object, objectnum );

    - Draw each object individually:

      for( i = 0; i < objectnum; i++ ) {
          if( object[i].visible == 0 ) continue;
          argConvGlpara(object[i].trans, gl_para);
          draw_object( object[i].id, gl_para );
      }
  • 87. Proximity Based Interaction
    - Sample file: CollideTest.c
    - Detect the distance between markers:

      checkCollisions(object[0], object[1], DIST);

    - If the distance is less than the collide distance, change the model / perform the interaction (a sketch of such a check follows)
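    checkCollisions itself lives in the sample file; here is a plausible implementation using the ObjectData_T structure from slide 84, where the translation column of trans holds the marker position in camera coordinates. This is a guess at the sample's logic, not a copy of it.

      #include <math.h>

      /* Returns 1 when both markers are visible and closer than collide_dist (mm). */
      int checkCollisions(ObjectData_T object0, ObjectData_T object1, double collide_dist)
      {
          double dx, dy, dz;

          if (object0.visible == 0 || object1.visible == 0) return 0;

          dx = object0.trans[0][3] - object1.trans[0][3];
          dy = object0.trans[1][3] - object1.trans[1][3];
          dz = object0.trans[2][3] - object1.trans[2][3];

          return sqrt(dx*dx + dy*dy + dz*dz) < collide_dist;
      }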
  • 88. Multi-marker Tracking
    - Sample file: multiTest.c
    - Multiple markers establish a single coordinate frame: reading in a configuration file; tracking from sets of markers; careful camera calibration
  • 89. MultiMarker Configuration File
    - Sample file: Data/multi/marker.dat
    - Contains a list of all the patterns and their exact positions:

      # the number of patterns to be recognized
      6

      # marker 1
      Data/multi/patt.a                  <- pattern file
      40.0                               <- pattern width
      0.0 0.0                            <- coordinate origin
      1.0000 0.0000 0.0000 -100.0000     <- pattern transform,
      0.0000 1.0000 0.0000   50.0000        relative to the
      0.0000 0.0000 1.0000    0.0000        global origin
      …
  • 90. Camera Transform Calculation
    - Include <AR/arMulti.h> and link to libARMulti.lib
    - In mainLoop(), detect markers as usual:

      arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num);

    - Then use the multi-marker function:

      if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
          argSwapBuffers();
          return;
      }
  • 91. Paddle-based Interaction
    - Tracking a single marker relative to the multi-marker set; the paddle contains a single marker
  • 92. Paddle Interaction Code
    - Sample file: PaddleDemo.c
    - Get the paddle marker location and draw the paddle before drawing the background model:

      paddleGetTrans(paddleInfo, marker_info, marker_flag,
                     marker_num, &cparam);

      /* draw the paddle */
      if( paddleInfo->active ) {
          draw_paddle( paddleInfo );
      }

    - draw_paddle uses a stencil buffer to increase realism
  • 93. Paddle Interaction Code II
    - Sample file: paddleDrawDemo.c. Finds the paddle position relative to the global coordinate frame:

      setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4]);

    - Sample file: paddleTouch.c. Finds the paddle position:

      findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);

    - Checks for collisions:

      checkCollision(&curPaddlePos, myTarget[i].pos, 20.0);
  • 94. General Tangible AR Library
    - command_sub.c, command_sub.h
    - Contain functions for recognizing a range of different paddle motions:

      int check_shake( );
      int check_punch( );
      int check_incline( );
      int check_pickup( );
      int check_push( );

    - E.g., to check the angle between the paddle and the base:

      check_incline(paddle->trans, base->trans, &ang)
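    A sketch of how such a check might be used each frame: the call pattern for check_incline comes from the slide, while the return convention, threshold, and response are assumptions.

      double ang;

      /* If the paddle is inclined relative to the base, react once the
         angle passes a threshold (threshold and units are guesses). */
      if (check_incline(paddle->trans, base->trans, &ang)) {
          if (ang > 30.0) {
              /* e.g. tip the virtual object off the paddle */
          }
      }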
