COSC 426: Augmented Reality
           Mark Billinghurst
     mark.billinghurst@hitlabnz.org

           August 14th 2012

       Lecture 6: AR Interaction
Building Compelling AR Experiences

           experiences
           applications: Interaction
           tools: Authoring
           components: Tracking, Display

                                        Sony CSL © 2004
AR Interaction
  Designing an AR system = interface design
     Using different input and output technologies
  The objective is a high-quality user experience
     Ease of use and learning
     Performance and satisfaction
User Interface and Tool
   Human  User Interface/Tool  Machine/Object
   Human Machine Interface




Tools




                                             User
                                             Interface


                          © Andreas Dünser
User Interface: Characteristics
  Input: mono or multimodal
  Output: mono or multisensorial (e.g. sensation of movement)
  Technique/Metaphor/Paradigm

  Metaphor:
  “Push” to accelerate
  “Turn” to rotate

                           © Andreas Dünser
Human Computer Interface
  Human  User Interface Computer System
  Human Computer Interface=
       Hardware +| Software
  Computer is everywhere now HCI electronic
   devices, Home Automation, Transport vehicles, etc 




                        © Andreas Dünser
More terminology
  Interaction Device = input/output device of the user
   interface
  Interaction Style = category of similar
   interaction techniques
  Interaction Paradigm
  Modality (human sense)
  Usability
Back to AR
  You can see spatially registered AR...
       but how can you interact with it?
Interaction Tasks
  2D (from [Foley]):
        Selection, Text Entry, Quantify, Position
  3D (from [Bowman]):
        Navigation (Travel/Wayfinding)
        Selection
        Manipulation
        System Control/Data Input
  AR: 2D + 3D tasks... and more AR-specific tasks?

[Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction Techniques. IEEE
Computer Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] Bowman, D., E. Kruijff, J. LaViola, I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2005.
AR Interfaces as Data Browsers
  2D/3D virtual objects are
   registered in 3D
    “VR in Real World”
  Interaction
    2D/3D virtual viewpoint
     control
  Applications
    Visualization, training
AR Information Browsers
  Information is registered to
   real-world context
     Hand held AR displays
  Interaction
     Manipulation of a window
      into information space
  Applications
     Context-aware information displays


                                           Rekimoto, et al. 1997
Architecture
Current AR Information Browsers
  Mobile AR
     GPS + compass
  Many Applications
       Layar
       Wikitude
       Acrossair
       PressLite
       Yelp
       AR Car Finder
       …
Junaio
  AR Browser from Metaio
    http://www.junaio.com/
  AR browsing
    GPS + compass
    2D/3D object placement
    Photos/live video
    Community viewing
Web Interface
Adding Models in Web Interface
Advantages and Disadvantages

  Important class of AR interfaces
     Wearable computers
     AR simulation, training
  Limited interactivity
     Modification of virtual
      content is difficult
                                  Rekimoto, et al. 1997
3D AR Interfaces
  Virtual objects displayed in 3D
   physical space and manipulated
     HMDs and 6DOF head-tracking
     6DOF hand trackers for input
  Interaction
     Viewpoint control
     Traditional 3D user interface           Kiyokawa, et al. 2000
      interaction: manipulation, selection,
      etc.
AR 3D Interaction
AR Graffiti




www.nextwall.net
Advantages and Disadvantages
  Important class of AR interfaces
     Entertainment, design, training
  Advantages
     User can interact with 3D virtual
      object everywhere in space
     Natural, familiar interaction
  Disadvantages
     Usually no tactile feedback
     User has to use different devices for
      virtual and physical objects
                                              Oshima, et al. 2000
Augmented Surfaces and
Tangible Interfaces
  Basic principles
     Virtual objects are
      projected on a surface
     Physical objects are used
      as controls for virtual
      objects
     Support for collaboration
Augmented Surfaces
  Rekimoto, et al. 1998
    Front projection
    Marker-based tracking
    Multiple projection surfaces
Tangible User Interfaces (Ishii 97)
  Create digital shadows
   for physical objects
  Foreground
    graspable UI
  Background
    ambient interfaces
Tangible Interfaces - Ambient
  Dangling String
     Jeremijenko 1995
     Ambient ethernet monitor
     Relies on peripheral cues
  Ambient Fixtures
     Dahley, Wisneski, Ishii 1998
     Use natural material qualities
       for information display
Tangible Interface: ARgroove
  Collaborative Instrument
  Exploring Physically Based Interaction
      Map physical actions to MIDI output
       -  Translation, rotation
       -  Tilt, shake
ARgroove in Use
Visual Feedback
  Continuous Visual Feedback is Key
  Single Virtual Image Provides:
    Rotation
    Tilt
    Height
i/O Brush (Ryokai, Marti, Ishii)
Other Examples
  Triangles (Gorbert 1998)
    Triangular based story telling
  ActiveCube (Kitamura 2000-)
    Cubes with sensors
Lessons from Tangible Interfaces
  Physical objects make us smart
    Norman’s “Things that Make Us Smart”
    encode affordances, constraints
  Objects aid collaboration
    establish shared meaning
  Objects increase understanding
    serve as cognitive artifacts
TUI Limitations

  Difficult to change object properties
     can’t tell state of digital data
  Limited display capabilities
     projection screen = 2D
     dependent on physical display surface
  Separation between object and display
     ARgroove
Advantages and Disadvantages
  Advantages
    Natural: the user's hands are used for interacting
     with both virtual and real objects.
      -  No need for special purpose input devices

  Disadvantages
    Interaction is limited to the 2D surface
      -  Full 3D interaction and manipulation is difficult
Orthogonal Nature of AR Interfaces
Back to the Real World

  AR overcomes limitation of TUIs
    enhance display possibilities
    merge task/display space
    provide public and private views


  TUI + AR = Tangible AR
    Apply TUI methods to AR interface design
  Space-multiplexed
      Many devices, each with one function
        -  Quicker to use, more intuitive, but more clutter
        -  e.g. a real toolbox

  Time-multiplexed
      One device with many functions
        -  Space efficient
        -  e.g. the mouse
Tangible AR: Tiles (Space Multiplexed)
  Tiles semantics
     data tiles
     operation tiles
  Operation on tiles
     proximity
     spatial arrangements
     space-multiplexed
Space-multiplexed Interface




   Data authoring in Tiles
Proximity-based Interaction
Object Based Interaction: MagicCup
  Intuitive Virtual Object Manipulation
   on a Table-Top Workspace
    Time multiplexed
    Multiple Markers
     -  Robust Tracking
    Tangible User Interface
     -  Intuitive Manipulation
    Stereo Display
     -  Good Presence
Our system




  Main table, Menu table, Cup interface
Tangible AR: Time-multiplexed Interaction
  Use of natural physical object manipulations to
   control virtual objects
  VOMAR Demo
     Catalog book:
      -  Turn over the page
     Paddle operation:
      -  Push, shake, incline, hit, scoop
VOMAR Interface
Advantages and Disadvantages
  Advantages
     Natural interaction with virtual and physical tools
      -  No need for special purpose input devices
     Spatial interaction with virtual objects
      -  3D manipulation with virtual objects anywhere in physical
         space

  Disadvantages
     Requires Head Mounted Display
Wrap-up
  Browsing Interfaces
    simple (conceptually!), unobtrusive
  3D AR Interfaces
    expressive, creative, require attention
  Tangible Interfaces
    Embedded into conventional environments
  Tangible AR
    Combines TUI input + AR display
AR User Interface: Categorization
  Traditional Desktop: keyboard, mouse,
   joystick (with or without 2D/3D GUI)
  Specialized/VR Device: 3D VR devices,
   specially design device
AR User Interface: Categorization
  Tangible Interface: using physical objects
  Hand/Touch Interface: using pose and gesture of hand,
   fingers
  Body Interface: using movement of body
AR User Interface: Categorization
  Speech Interface: voice, speech control
  Multimodal Interface: gesture + speech
  Haptic Interface: haptic feedback
  Eye Tracking, Physiological, Brain Computer
   Interface...
OSGART:
From Registration to Interaction
Keyboard and Mouse Interaction
    Traditional input techniques
    OSG provides a framework for handling keyboard
     and mouse input events (osgGA)
      1.  Subclass osgGA::GUIEventHandler
      2.  Handle events:
         •    Mouse up / down / move / drag / scroll-wheel
         •    Key up / down
      3.  Add instance of new handler to the viewer
Keyboard and Mouse Interaction
       Create your own event handler class
class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {

public:
   KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

     virtual bool handle(const osgGA::GUIEventAdapter& ea,osgGA::GUIActionAdapter& aa,
        osg::Object* obj, osg::NodeVisitor* nv) {

         switch (ea.getEventType()) {
            // Possible events we can handle
            case osgGA::GUIEventAdapter::PUSH: break;
            case osgGA::GUIEventAdapter::RELEASE: break;
            case osgGA::GUIEventAdapter::MOVE: break;
            case osgGA::GUIEventAdapter::DRAG: break;
            case osgGA::GUIEventAdapter::SCROLL: break;
            case osgGA::GUIEventAdapter::KEYUP: break;
            case osgGA::GUIEventAdapter::KEYDOWN: break;
         }

         return false;
     }
};


       Add it to the viewer to receive events
viewer.addEventHandler(new KeyboardMouseEventHandler());
Keyboard Interaction
    Handle W,A,S,D keys to move an object
case osgGA::GUIEventAdapter::KEYDOWN: {

   switch (ea.getKey()) {
      case 'w': // Move forward 5mm
         localTransform->preMult(osg::Matrix::translate(0, -5, 0));
         return true;
      case 's': // Move back 5mm
         localTransform->preMult(osg::Matrix::translate(0, 5, 0));
         return true;
      case 'a': // Rotate 10 degrees left
         localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
         return true;
      case 'd': // Rotate 10 degrees right
         localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
         return true;
      case ' ': // Reset the transformation
         localTransform->setMatrix(osg::Matrix::identity());
         return true;
   }

break;

  The local transform node itself is created when loading the model:


localTransform = new osg::MatrixTransform();
localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
arTransform->addChild(localTransform.get());
Keyboard Interaction Demo
Mouse Interaction
  The mouse is a pointing device...
  Use the mouse to select objects in an AR scene
  OSG provides methods for ray-casting and
   intersection testing
     These return an osg::NodePath (the path from the hit
      node all the way back to the root)

  (diagram: a ray cast through the projection plane (screen) into the scene)
Mouse Interaction
  Compute the list of nodes under the clicked position
  Invoke an action on nodes that are hit, e.g. select, delete
case osgGA::GUIEventAdapter::PUSH:

   osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
   osgUtil::LineSegmentIntersector::Intersections intersections;

   // Clear previous selections
   for (unsigned int i = 0; i < targets.size(); i++) {
      targets[i]->setSelected(false);
   }

   // Find new selection based on click position
   if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
      for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
         iter != intersections.end(); iter++) {

            if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
               std::cout << "HIT!" << std::endl;
               target->setSelected(true);
               return true;
            }
       }
   }

   break;
Mouse Interaction Demo
Proximity Techniques
  Interaction based on
    the distance between a marker and the camera
    the distance between multiple markers
Single Marker Techniques: Proximity
  Use distance from camera to marker as
   input parameter
     e.g. Lean in close to examine
  Can use the osg::LOD class to show
   different content at different depth ranges
                                           Image: OpenSG Consortium
Single Marker Techniques: Proximity
// Load some models
osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

// Use a Level-Of-Detail node to show each model at different distance ranges.
osg::ref_ptr<osg::LOD> lod = new osg::LOD();
lod->addChild(farNode.get(), 500.0f, 10000.0f);      // Show the "far" node from 50cm to 10m away
lod->addChild(closerNode.get(), 200.0f, 500.0f);     // Show the "closer" node from 20cm to 50cm away
lod->addChild(nearNode.get(), 0.0f, 200.0f);         // Show the "near" node from 0cm to 20cm away

arTransform->addChild(lod.get());




  Define depth ranges for each node
  Add as many as you want
  Ranges can overlap
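
The range-selection logic itself can be sketched without OSG. A minimal stand-alone version, assuming the same millimetre ranges as above (`LODEntry` and `selectModel` are hypothetical names for illustration, not part of the osg::LOD API):

```cpp
#include <string>

// Hypothetical stand-in for osg::LOD: each entry pairs a model name
// with the [min, max) distance range (in millimetres) it is visible in.
struct LODEntry { const char* name; float minRange; float maxRange; };

// Return the model whose range contains the given camera-to-marker
// distance, or "none" if no range matches.
std::string selectModel(float distance) {
    static const LODEntry entries[] = {
        { "far",    500.0f, 10000.0f },  // 50cm to 10m
        { "closer", 200.0f,   500.0f },  // 20cm to 50cm
        { "near",     0.0f,   200.0f },  // 0cm to 20cm
    };
    for (const LODEntry& e : entries)
        if (distance >= e.minRange && distance < e.maxRange)
            return e.name;
    return "none";
}
```

With overlapping ranges, osg::LOD can show several children at once; this sketch simply returns the first match.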
Single Marker Proximity Demo
Multiple Marker Concepts
  Interaction based on the relationship between
   markers
    e.g. When the distance between two markers
     decreases below threshold invoke an action
    Tangible User Interface
  Applications:
    Memory card games
    File operations
Multiple Marker Proximity

   Scene graph (Distance > Threshold):

   Virtual Camera
     +- Transform A -> Switch A -> Model A1 / Model A2
     +- Transform B -> Switch B -> Model B1 / Model B2

   While the markers are far apart, each switch shows its first model (A1, B1)
Multiple Marker Proximity

   Scene graph (Distance <= Threshold):

   Virtual Camera
     +- Transform A -> Switch A -> Model A1 / Model A2
     +- Transform B -> Switch B -> Model B1 / Model B2

   Once the markers come within the threshold, each switch shows its second model (A2, B2)
Multiple Marker Proximity
  Use a node callback to test for proximity and update the relevant nodes

virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {

    if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
       if (mMarkerA->valid() && mMarkerB->valid()) {

            osg::Vec3 posA = mMarkerA->getTransform().getTrans();
            osg::Vec3 posB = mMarkerB->getTransform().getTrans();
            osg::Vec3 offset = posA - posB;
            float distance = offset.length();

            if (distance <= mThreshold) {
               if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
               if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
            } else {
               if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
               if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
            }

        }

    }

    traverse(node,nv);

}
Multiple Marker Proximity
Paddle Interaction
  Use one marker as a tool for selecting and
   manipulating objects (tangible user interface)
  Another marker provides a frame of reference
      A grid of markers can alleviate problems with occlusion




 MagicCup (Kato et al)    VOMAR (Kato et al)
Paddle Interaction
  Often useful to adopt a local coordinate system

     Allows the camera to move without disrupting Tlocal
     Places the paddle in the same coordinate system as the
      content on the grid
        Simplifies interaction

  osgART computes Tlocal using the osgART::LocalTransformationCallback
Tilt and Shake Interaction

  Detect types of paddle movement:
    Tilt
      -  gradual change in orientation
    Shake
      -  short, sudden changes in translation
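
A crude version of such movement detection can be sketched as follows (`detectShake` is a hypothetical helper, not the library's check_shake(); a real detector would also filter tracking jitter):

```cpp
#include <cstddef>
#include <vector>

// A shake is flagged when the recent horizontal position deltas change
// direction at least `minReversals` times and each move exceeds `minStep`
// (short, sudden changes in translation); small deltas are ignored.
bool detectShake(const std::vector<float>& xPositions,
                 float minStep = 5.0f, int minReversals = 2) {
    int reversals = 0;
    float prevDelta = 0.0f;
    for (std::size_t i = 1; i < xPositions.size(); ++i) {
        float delta = xPositions[i] - xPositions[i - 1];
        if (delta > -minStep && delta < minStep) continue;  // too small
        if (prevDelta != 0.0f && ((delta > 0) != (prevDelta > 0)))
            ++reversals;            // direction flipped: one reversal
        prevDelta = delta;
    }
    return reversals >= minReversals;
}
```

A tilt detector would look at gradual changes in the orientation part of the paddle transform instead.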
Building Tangible AR Interfaces
        with ARToolKit
Required Code
  Calculating Camera Position
     Range to marker
  Loading Multiple Patterns/Models
  Interaction between objects
     Proximity
     Relative position/orientation
  Occlusion
     Stencil buffering
     Multi-marker tracking
Tangible AR Coordinate Frames
Local vs. Global Interactions
  Local
     Actions determined from single camera to marker
      transform
      -  shaking, appearance, relative position, range
  Global
     Actions determined from two relationships
      -  marker to camera, world to camera coords.
      -  Marker transform determined in world coordinates
           •  object tilt, absolute position, absolute rotation, hitting
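
The global case can be sketched as a transform composition: T_world_marker = inverse(T_cam_world) * T_cam_marker, using the ARToolKit-style 3x4 [R | t] layout (`markerInWorld` is a hypothetical helper for illustration, not an ARToolKit function):

```cpp
// Compose the marker pose in world (base) coordinates from the two
// camera-relative transforms, as in the "global" case above.
void markerInWorld(const double camWorld[3][4], const double camMarker[3][4],
                   double out[3][4]) {
    // Invert the rigid transform camWorld: R' = R^T, t' = -R^T t.
    double inv[3][4];
    for (int i = 0; i < 3; ++i) {
        for (int j = 0; j < 3; ++j) inv[i][j] = camWorld[j][i];
        inv[i][3] = 0.0;
        for (int j = 0; j < 3; ++j) inv[i][3] -= inv[i][j] * camWorld[j][3];
    }
    // Multiply inv * camMarker to get the marker-in-world transform.
    for (int i = 0; i < 3; ++i) {
        for (int j = 0; j < 4; ++j) {
            out[i][j] = (j == 3) ? inv[i][3] : 0.0;
            for (int k = 0; k < 3; ++k) out[i][j] += inv[i][k] * camMarker[k][j];
        }
    }
}
```

With the marker pose expressed in world coordinates, absolute position, tilt, and rotation become simple comparisons against the world frame.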
Range-based Interaction
  Sample File: RangeTest.c

/* get the camera transformation */
arGetTransMat(&marker_info[k], marker_center,
  marker_width, marker_trans);

/* find the range */
Xpos = marker_trans[0][3];
Ypos = marker_trans[1][3];
Zpos = marker_trans[2][3];
range = sqrt(Xpos*Xpos+Ypos*Ypos+Zpos*Zpos);
Loading Multiple Patterns
  Sample File: LoadMulti.c
     Uses object.c to load
  Object Structure
   typedef struct {
     char       name[256];
     int        id;
     int        visible;
     double     marker_coord[4][2];
     double     trans[3][4];
     double     marker_width;
     double     marker_center[2];
   } ObjectData_T;
Finding Multiple Transforms
  Create object list
ObjectData_T        *object;

  Read in objects - in init( )
read_ObjData( char *name, int *objectnum );

  Find Transform – in mainLoop( )
for( i = 0; i < objectnum; i++ ) {
    ..Check patterns
    ..Find transforms for each marker
  }
Drawing Multiple Objects
  Send the object list to the draw function
draw( object, objectnum );
  Draw each object individually
for( i = 0; i < objectnum; i++ ) {
   if( object[i].visible == 0 ) continue;
   argConvGlpara(object[i].trans, gl_para);
   draw_object( object[i].id, gl_para);
}
Proximity Based Interaction

  Sample File – CollideTest.c
  Detect distance between markers
  checkCollisions(object[0],object[1], DIST)
  If distance < collide distance
  Then change the model/perform interaction
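
The distance test can be sketched like this (hypothetical stand-ins for ObjectData_T and checkCollisions; the real CollideTest.c reads positions from the tracked marker transforms):

```cpp
#include <cmath>

// Each object's position is the translation column of its 3x4 marker
// transform, mirroring the trans[3][4] field of ObjectData_T.
struct MarkerObject { double trans[3][4]; };

// Return true when the two markers are closer than collideDist
// (in the same units as the marker transforms, typically mm).
bool checkCollisions(const MarkerObject& a, const MarkerObject& b,
                     double collideDist) {
    double dx = a.trans[0][3] - b.trans[0][3];
    double dy = a.trans[1][3] - b.trans[1][3];
    double dz = a.trans[2][3] - b.trans[2][3];
    return std::sqrt(dx*dx + dy*dy + dz*dz) < collideDist;
}
```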
Multi-marker Tracking
  Sample File – multiTest.c
  Multiple markers to establish a
   single coordinate frame
    Reading in a configuration file
    Tracking from sets of markers
    Careful camera calibration
MultiMarker Configuration File
  Sample File - Data/multi/marker.dat
  Contains a list of all the patterns and their exact
   positions

   #the number of patterns to be recognized
   6

   #marker 1
   Data/multi/patt.a                  <- pattern file
   40.0                               <- pattern width
   0.0 0.0                            <- coordinate origin
   1.0000 0.0000 0.0000 -100.0000     <- pattern transform
   0.0000 1.0000 0.0000 50.0000          relative to global
   0.0000 0.0000 1.0000 0.0000           origin
   …
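
Each marker's 3x4 matrix maps points from that marker's local frame into the global frame. That mapping can be sketched as follows (`markerToGlobal` is a hypothetical helper; libARMulti performs this internally):

```cpp
// Apply one marker's 3x4 configuration transform [R | t] (the three
// numeric rows in marker.dat) to a point in that marker's local frame,
// producing global coordinates.
void markerToGlobal(const double trans[3][4], const double local[3],
                    double global[3]) {
    for (int i = 0; i < 3; ++i) {
        global[i] = trans[i][3];                 // translation column
        for (int j = 0; j < 3; ++j)
            global[i] += trans[i][j] * local[j]; // rotation part
    }
}
```

With the identity rotation from the marker 1 entry above, the marker's origin maps to (-100, 50, 0) in the global frame.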
Camera Transform Calculation
  Include <AR/arMulti.h>
  Link to libARMulti.lib
  In mainLoop()
     Detect markers as usual
      arDetectMarkerLite(dataPtr, thresh,
              &marker_info, &marker_num)
     Use MultiMarker Function
       if( (err = arMultiGetTransMat(marker_info,
                       marker_num, config)) < 0 ) {
           argSwapBuffers();
           return;
       }
Paddle-based Interaction




Tracking single marker relative to multi-marker set
  - paddle contains single marker
Paddle Interaction Code
  Sample File – PaddleDemo.c
  Get paddle marker location + draw paddle before drawing
   background model
   paddleGetTrans(paddleInfo, marker_info,
       marker_flag, marker_num, &cparam);

  /* draw the paddle */
  if( paddleInfo->active ){
      draw_paddle( paddleInfo);
  }
draw_paddle uses a Stencil Buffer to increase realism
Paddle Interaction Code II
  Sample File – paddleDrawDemo.c
  Finds the paddle position relative to global coordinate frame:
   setBlobTrans(Num,paddle_trans[3][4],base_trans[3][4])
  Sample File – paddleTouch.c
  Finds the paddle position:
   findPaddlePos(&curPadPos,paddleInfo->trans,config->trans);
  Checks for collisions:
   checkCollision(&curPaddlePos,myTarget[i].pos,20.0)
General Tangible AR Library
  command_sub.c, command_sub.h
  Contains functions for recognizing a range of
   different paddle motions:
   int   check_shake( );
   int   check_punch( );
   int   check_incline( );
   int   check_pickup( );
   int   check_push( );
  E.g. to check the angle between paddle and base:
   check_incline(paddle->trans, base->trans, &ang)

Surfacecomputerppt 130813063644-phpapp02Surfacecomputerppt 130813063644-phpapp02
Surfacecomputerppt 130813063644-phpapp02
 
Microsoft surface
Microsoft surfaceMicrosoft surface
Microsoft surface
 
2016 AR Summer School Lecture3
2016 AR Summer School Lecture32016 AR Summer School Lecture3
2016 AR Summer School Lecture3
 
SVR2011 Keynote
SVR2011 KeynoteSVR2011 Keynote
SVR2011 Keynote
 
Abhishek meena
Abhishek meenaAbhishek meena
Abhishek meena
 
microsoft-surface-ppt From EnggRoom.ppt
microsoft-surface-ppt From EnggRoom.pptmicrosoft-surface-ppt From EnggRoom.ppt
microsoft-surface-ppt From EnggRoom.ppt
 
microsoft-surface-ppt From EnggRoom.ppt
microsoft-surface-ppt From EnggRoom.pptmicrosoft-surface-ppt From EnggRoom.ppt
microsoft-surface-ppt From EnggRoom.ppt
 
Bally Sloane
Bally SloaneBally Sloane
Bally Sloane
 
Human Computer Interaction
Human Computer InteractionHuman Computer Interaction
Human Computer Interaction
 
Natural Interaction for Augmented Reality Applications
Natural Interaction for Augmented Reality ApplicationsNatural Interaction for Augmented Reality Applications
Natural Interaction for Augmented Reality Applications
 
surface computing
surface computingsurface computing
surface computing
 
Papaer4 ea
Papaer4 eaPapaer4 ea
Papaer4 ea
 
Multi touch table by vinay jain
Multi touch table by vinay jainMulti touch table by vinay jain
Multi touch table by vinay jain
 
Surface computer ppt
Surface computer pptSurface computer ppt
Surface computer ppt
 
Surfacecomputerppt 130813063644-phpapp02
Surfacecomputerppt 130813063644-phpapp02Surfacecomputerppt 130813063644-phpapp02
Surfacecomputerppt 130813063644-phpapp02
 
Nzgdc2004 Argaming Seminar
Nzgdc2004 Argaming SeminarNzgdc2004 Argaming Seminar
Nzgdc2004 Argaming Seminar
 
COSC 426 Lect. 8: AR Research Directions
COSC 426 Lect. 8: AR Research DirectionsCOSC 426 Lect. 8: AR Research Directions
COSC 426 Lect. 8: AR Research Directions
 

More from Mark Billinghurst

The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?
Mark Billinghurst
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
Mark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
Mark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
Mark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
Mark Billinghurst
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
Mark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Mark Billinghurst
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
Mark Billinghurst
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
Mark Billinghurst
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
Mark Billinghurst
 
ISS2022 Keynote
ISS2022 KeynoteISS2022 Keynote
ISS2022 Keynote
Mark Billinghurst
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
Mark Billinghurst
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping
Mark Billinghurst
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
Mark Billinghurst
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology
Mark Billinghurst
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
Mark Billinghurst
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR
Mark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
Mark Billinghurst
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
Mark Billinghurst
 

More from Mark Billinghurst (20)

The Metaverse: Are We There Yet?
The  Metaverse:    Are   We  There  Yet?The  Metaverse:    Are   We  There  Yet?
The Metaverse: Are We There Yet?
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
 
ISS2022 Keynote
ISS2022 KeynoteISS2022 Keynote
ISS2022 Keynote
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 

Recently uploaded

Accelerate your Kubernetes clusters with Varnish Caching
Accelerate your Kubernetes clusters with Varnish CachingAccelerate your Kubernetes clusters with Varnish Caching
Accelerate your Kubernetes clusters with Varnish Caching
Thijs Feryn
 
Key Trends Shaping the Future of Infrastructure.pdf
Key Trends Shaping the Future of Infrastructure.pdfKey Trends Shaping the Future of Infrastructure.pdf
Key Trends Shaping the Future of Infrastructure.pdf
Cheryl Hung
 
Connector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a buttonConnector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a button
DianaGray10
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
KatiaHIMEUR1
 
DevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA ConnectDevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA Connect
Kari Kakkonen
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyes
ThousandEyes
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
Ana-Maria Mihalceanu
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
UiPathCommunity
 
State of ICS and IoT Cyber Threat Landscape Report 2024 preview
State of ICS and IoT Cyber Threat Landscape Report 2024 previewState of ICS and IoT Cyber Threat Landscape Report 2024 preview
State of ICS and IoT Cyber Threat Landscape Report 2024 preview
Prayukth K V
 
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
DanBrown980551
 
Essentials of Automations: Optimizing FME Workflows with Parameters
Essentials of Automations: Optimizing FME Workflows with ParametersEssentials of Automations: Optimizing FME Workflows with Parameters
Essentials of Automations: Optimizing FME Workflows with Parameters
Safe Software
 
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdfFIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance
 
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdfFIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance
 
Generating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using SmithyGenerating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using Smithy
g2nightmarescribd
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3
DianaGray10
 
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Jeffrey Haguewood
 
Designing Great Products: The Power of Design and Leadership by Chief Designe...
Designing Great Products: The Power of Design and Leadership by Chief Designe...Designing Great Products: The Power of Design and Leadership by Chief Designe...
Designing Great Products: The Power of Design and Leadership by Chief Designe...
Product School
 
To Graph or Not to Graph Knowledge Graph Architectures and LLMs
To Graph or Not to Graph Knowledge Graph Architectures and LLMsTo Graph or Not to Graph Knowledge Graph Architectures and LLMs
To Graph or Not to Graph Knowledge Graph Architectures and LLMs
Paul Groth
 
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
Product School
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
Product School
 

Recently uploaded (20)

Accelerate your Kubernetes clusters with Varnish Caching
Accelerate your Kubernetes clusters with Varnish CachingAccelerate your Kubernetes clusters with Varnish Caching
Accelerate your Kubernetes clusters with Varnish Caching
 
Key Trends Shaping the Future of Infrastructure.pdf
Key Trends Shaping the Future of Infrastructure.pdfKey Trends Shaping the Future of Infrastructure.pdf
Key Trends Shaping the Future of Infrastructure.pdf
 
Connector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a buttonConnector Corner: Automate dynamic content and events by pushing a button
Connector Corner: Automate dynamic content and events by pushing a button
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
 
DevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA ConnectDevOps and Testing slides at DASA Connect
DevOps and Testing slides at DASA Connect
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyes
 
Monitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR EventsMonitoring Java Application Security with JDK Tools and JFR Events
Monitoring Java Application Security with JDK Tools and JFR Events
 
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...
 
State of ICS and IoT Cyber Threat Landscape Report 2024 preview
State of ICS and IoT Cyber Threat Landscape Report 2024 previewState of ICS and IoT Cyber Threat Landscape Report 2024 preview
State of ICS and IoT Cyber Threat Landscape Report 2024 preview
 
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
 
Essentials of Automations: Optimizing FME Workflows with Parameters
Essentials of Automations: Optimizing FME Workflows with ParametersEssentials of Automations: Optimizing FME Workflows with Parameters
Essentials of Automations: Optimizing FME Workflows with Parameters
 
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdfFIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
 
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdfFIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
 
Generating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using SmithyGenerating a custom Ruby SDK for your web service or Rails API using Smithy
Generating a custom Ruby SDK for your web service or Rails API using Smithy
 
UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3UiPath Test Automation using UiPath Test Suite series, part 3
UiPath Test Automation using UiPath Test Suite series, part 3
 
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
 
Designing Great Products: The Power of Design and Leadership by Chief Designe...
Designing Great Products: The Power of Design and Leadership by Chief Designe...Designing Great Products: The Power of Design and Leadership by Chief Designe...
Designing Great Products: The Power of Design and Leadership by Chief Designe...
 
To Graph or Not to Graph Knowledge Graph Architectures and LLMs
To Graph or Not to Graph Knowledge Graph Architectures and LLMsTo Graph or Not to Graph Knowledge Graph Architectures and LLMs
To Graph or Not to Graph Knowledge Graph Architectures and LLMs
 
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
From Daily Decisions to Bottom Line: Connecting Product Work to Revenue by VP...
 
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
AI for Every Business: Unlocking Your Product's Universal Potential by VP of ...
 

426 lecture6b: AR Interaction

  • 1. COSC 426: Augmented Reality Mark Billinghurst mark.billinghurst@hitlabnz.org August 14th 2012 Lecture 6: AR Interaction
  • 2. Building Compelling AR Experiences experiences applications Interaction tools Authoring components Tracking, Display Sony CSL © 2004
  • 3. AR Interaction   Designing AR System = Interface Design   Using different input and output technologies   Objective is a high quality of user experience   Ease of use and learning   Performance and satisfaction
  • 4. User Interface and Tool   Human → User Interface/Tool → Machine/Object   Human Machine Interface   (diagram: tools vs. user interface; © Andreas Dünser)
  • 5. User Interface: Characteristics   Input: mono or multimodal   Output: mono or multisensorial (e.g. a sensation of movement)   Technique/Metaphor/Paradigm (e.g. the metaphor "push" to accelerate, "turn" to rotate) © Andreas Dünser
  • 6. Human Computer Interface   Human → User Interface → Computer System   Human Computer Interface = Hardware + Software   Computers are everywhere now → HCI extends to electronic devices, home automation, transport vehicles, etc. © Andreas Dünser
  • 7. More terminology   Interaction Device = input/output of the user interface   Interaction Style = a category of similar interaction techniques   Interaction Paradigm   Modality (human sense)   Usability
  • 8. Back to AR   You can see spatially registered AR… but how can you interact with it?
  • 9. Interaction Tasks   2D (from [Foley]): selection, text entry, quantify, position   3D (from [Bowman]): navigation (travel/wayfinding), selection, manipulation, system control/data input   AR: 2D + 3D tasks and… more specific tasks?   [Foley] Foley, J. D., V. Wallace & P. Chan. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.   [Bowman] Bowman, D., E. Kruijff, J. LaViola & I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison Wesley, 2005.
  • 10. AR Interfaces as Data Browsers   2D/3D virtual objects are registered in 3D   “VR in Real World”   Interaction   2D/3D virtual viewpoint control   Applications   Visualization, training
  • 11. AR Information Browsers   Information is registered to real-world context   Hand held AR displays   Interaction   Manipulation of a window into information space   Applications   Context-aware information displays Rekimoto, et al. 1997
  • 13. Current AR Information Browsers   Mobile AR   GPS + compass   Many Applications   Layar   Wikitude   Acrossair   PressLite   Yelp   AR Car Finder   …
  • 14. Junaio   AR Browser from Metaio   http://www.junaio.com/   AR browsing   GPS + compass   2D/3D object placement   Photos/live video   Community viewing
  • 15.
  • 17. Adding Models in Web Interface
  • 18. Advantages and Disadvantages   Important class of AR interfaces   Wearable computers   AR simulation, training   Limited interactivity   Modification of virtual content is difficult Rekimoto, et al. 1997
  • 19. 3D AR Interfaces   Virtual objects displayed in 3D physical space and manipulated   HMDs and 6DOF head-tracking   6DOF hand trackers for input   Interaction   Viewpoint control   Traditional 3D user interface interaction: manipulation, selection, etc. (Kiyokawa, et al. 2000)
  • 22. Advantages and Disadvantages   Important class of AR interfaces   Entertainment, design, training   Advantages   User can interact with 3D virtual object everywhere in space   Natural, familiar interaction   Disadvantages   Usually no tactile feedback   User has to use different devices for virtual and physical objects Oshima, et al. 2000
  • 23. Augmented Surfaces and Tangible Interfaces   Basic principles   Virtual objects are projected on a surface   Physical objects are used as controls for virtual objects   Support for collaboration
  • 24. Augmented Surfaces   Rekimoto, et al. 1998   Front projection   Marker-based tracking   Multiple projection surfaces
  • 25. Tangible User Interfaces (Ishii 97)   Create digital shadows for physical objects   Foreground: graspable UI   Background: ambient interfaces
  • 26. Tangible Interfaces - Ambient   Dangling String   Jeremijenko 1995   Ambient ethernet monitor   Relies on peripheral cues   Ambient Fixtures   Dahley, Wisneski, Ishii 1998   Use natural material qualities for information display
  • 27. Tangible Interface: ARgroove   Collaborative Instrument   Exploring Physically Based Interaction   Map physical actions to MIDI output -  Translation, rotation -  Tilt, shake
  • 29. Visual Feedback   Continuous Visual Feedback is Key   Single Virtual Image Provides:   Rotation   Tilt   Height
  • 30. i/O Brush (Ryokai, Marti, Ishii)
  • 31. Other Examples   Triangles (Gorbert 1998)   Triangular based story telling   ActiveCube (Kitamura 2000-)   Cubes with sensors
  • 32. Lessons from Tangible Interfaces   Physical objects make us smart   Norman’s “Things that Make Us Smart”   encode affordances, constraints   Objects aid collaboration   establish shared meaning   Objects increase understanding   serve as cognitive artifacts
  • 33. TUI Limitations   Difficult to change object properties   can’t tell state of digital data   Limited display capabilities   projection screen = 2D   dependent on physical display surface   Separation between object and display   ARgroove
  • 34. Advantages and Disadvantages   Advantages   Natural - users' hands are used for interacting with both virtual and real objects -  No need for special purpose input devices   Disadvantages   Interaction is limited to the 2D surface -  Full 3D interaction and manipulation is difficult
  • 35. Orthogonal Nature of AR Interfaces
  • 36. Back to the Real World   AR overcomes limitation of TUIs   enhance display possibilities   merge task/display space   provide public and private views   TUI + AR = Tangible AR   Apply TUI methods to AR interface design
  • 37.   Space-multiplexed   Many devices, each with one function -  Quicker to use, more intuitive, but more clutter -  e.g. a real toolbox   Time-multiplexed   One device with many functions -  Space efficient -  e.g. the mouse
  • 38. Tangible AR: Tiles (Space Multiplexed)   Tiles semantics   data tiles   operation tiles   Operation on tiles   proximity   spatial arrangements   space-multiplexed
  • 39. Space-multiplexed Interface Data authoring in Tiles
  • 41. Object Based Interaction: MagicCup   Intuitive Virtual Object Manipulation on a Table-Top Workspace   Time multiplexed   Multiple Markers -  Robust Tracking   Tangible User Interface -  Intuitive Manipulation   Stereo Display -  Good Presence
  • 42. Our system   Main table, Menu table, Cup interface
  • 43.
  • 44. Tangible AR: Time-multiplexed Interaction   Use of natural physical object manipulations to control virtual objects   VOMAR Demo   Catalog book: -  Turn over the page   Paddle operation: -  Push, shake, incline, hit, scoop
  • 46. Advantages and Disadvantages   Advantages   Natural interaction with virtual and physical tools -  No need for special purpose input devices   Spatial interaction with virtual objects -  3D manipulation with virtual objects anywhere in physical space   Disadvantages   Requires Head Mounted Display
  • 47. Wrap-up   Browsing Interfaces   simple (conceptually!), unobtrusive   3D AR Interfaces   expressive, creative, require attention   Tangible Interfaces   Embedded into conventional environments   Tangible AR   Combines TUI input + AR display
  • 48. AR User Interface: Categorization   Traditional Desktop: keyboard, mouse, joystick (with or without 2D/3D GUI)   Specialized/VR Device: 3D VR devices, specially designed devices
  • 49. AR User Interface: Categorization   Tangible Interface: using physical objects   Hand/Touch Interface: using pose and gesture of hands and fingers   Body Interface: using movement of the body
  • 50. AR User Interface: Categorization   Speech Interface: voice, speech control   Multimodal Interface: gesture + speech   Haptic Interface: haptic feedback   Eye Tracking, Physiological, Brain Computer Interface…
  • 52. Keyboard and Mouse Interaction
  Traditional input techniques. OSG provides a framework for handling keyboard and mouse input events (osgGA):
    1. Subclass osgGA::GUIEventHandler
    2. Handle events:
       •  Mouse up / down / move / drag / scroll-wheel
       •  Key up / down
    3. Add an instance of the new handler to the viewer
  • 53. Keyboard and Mouse Interaction
  Create your own event handler class:

      class KeyboardMouseEventHandler : public osgGA::GUIEventHandler {
      public:
          KeyboardMouseEventHandler() : osgGA::GUIEventHandler() { }

          virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& aa,
                              osg::Object* obj, osg::NodeVisitor* nv) {
              switch (ea.getEventType()) {
                  // Possible events we can handle
                  case osgGA::GUIEventAdapter::PUSH: break;
                  case osgGA::GUIEventAdapter::RELEASE: break;
                  case osgGA::GUIEventAdapter::MOVE: break;
                  case osgGA::GUIEventAdapter::DRAG: break;
                  case osgGA::GUIEventAdapter::SCROLL: break;
                  case osgGA::GUIEventAdapter::KEYUP: break;
                  case osgGA::GUIEventAdapter::KEYDOWN: break;
              }
              return false; // return true only for events we consume
          }
      };

  Add it to the viewer to receive events:

      viewer.addEventHandler(new KeyboardMouseEventHandler());
• 54. Keyboard Interaction
  Set up a transform for the object to be moved

    localTransform = new osg::MatrixTransform();
    localTransform->addChild(osgDB::readNodeFile("media/car.ive"));
    arTransform->addChild(localTransform.get());

  Handle the W,A,S,D keys to move the object

    case osgGA::GUIEventAdapter::KEYDOWN: {
        switch (ea.getKey()) {
            case 'w': // Move forward 5mm
                localTransform->preMult(osg::Matrix::translate(0, -5, 0));
                return true;
            case 's': // Move back 5mm
                localTransform->preMult(osg::Matrix::translate(0, 5, 0));
                return true;
            case 'a': // Rotate 10 degrees left
                localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(10.0f), osg::Z_AXIS));
                return true;
            case 'd': // Rotate 10 degrees right
                localTransform->preMult(osg::Matrix::rotate(osg::DegreesToRadians(-10.0f), osg::Z_AXIS));
                return true;
            case ' ': // Reset the transformation
                localTransform->setMatrix(osg::Matrix::identity());
                return true;
        }
        break;
    }
• 56. Mouse Interaction
  The mouse is a pointing device: use it to select objects in an AR scene
  OSG provides methods for ray-casting and intersection testing
  These return an osg::NodePath (the path from the hit node all the way back to the root)
  (diagram: ray cast from the projection plane (screen) into the scene)
• 57. Mouse Interaction
  Compute the list of nodes under the clicked position
  Invoke an action on the nodes that are hit, e.g. select, delete

    case osgGA::GUIEventAdapter::PUSH: {
        osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
        osgUtil::LineSegmentIntersector::Intersections intersections;

        // Clear previous selections
        for (unsigned int i = 0; i < targets.size(); i++) {
            targets[i]->setSelected(false);
        }

        // Find new selection based on click position
        if (view && view->computeIntersections(ea.getX(), ea.getY(), intersections)) {
            for (osgUtil::LineSegmentIntersector::Intersections::iterator iter = intersections.begin();
                 iter != intersections.end(); iter++) {
                if (Target* target = dynamic_cast<Target*>(iter->nodePath.back())) {
                    std::cout << "HIT!" << std::endl;
                    target->setSelected(true);
                    return true;
                }
            }
        }
        break;
    }
• 59. Proximity Techniques
  Interaction based on:
    the distance between a marker and the camera
    the distance between multiple markers
• 60. Single Marker Techniques: Proximity
  Use the distance from the camera to the marker as an input parameter
    e.g. lean in close to examine an object
  Can use the osg::LOD class to show different content at different depth ranges
  Image: OpenSG Consortium
• 61. Single Marker Techniques: Proximity

    // Load some models
    osg::ref_ptr<osg::Node> farNode = osgDB::readNodeFile("media/far.osg");
    osg::ref_ptr<osg::Node> closerNode = osgDB::readNodeFile("media/closer.osg");
    osg::ref_ptr<osg::Node> nearNode = osgDB::readNodeFile("media/near.osg");

    // Use a Level-Of-Detail node to show each model at different distance ranges.
    osg::ref_ptr<osg::LOD> lod = new osg::LOD();
    lod->addChild(farNode.get(), 500.0f, 10000.0f); // Show the "far" node from 50cm to 10m away
    lod->addChild(closerNode.get(), 200.0f, 500.0f); // Show the "closer" node from 20cm to 50cm away
    lod->addChild(nearNode.get(), 0.0f, 200.0f);     // Show the "near" node from 0cm to 20cm away

    arTransform->addChild(lod.get());

  Define depth ranges for each node
  Add as many as you want
  Ranges can overlap
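Besides switching models with osg::LOD, the camera-to-marker distance can be used directly as a continuous input parameter. Below is a minimal, self-contained sketch of that idea, independent of osgART (MarkerTrans, markerRange and rangeToParameter are illustrative names, not library API); it assumes an ARToolKit-style 3x4 pose matrix whose last column holds the marker translation in camera coordinates, in millimetres.

```cpp
#include <cmath>

// Illustrative type: a 3x4 marker pose, as arGetTransMat() fills in.
// The last column is the translation of the marker in camera coordinates.
typedef double MarkerTrans[3][4];

// Distance from the camera (at the origin) to the marker, in mm.
double markerRange(const MarkerTrans t) {
    double x = t[0][3], y = t[1][3], z = t[2][3];
    return std::sqrt(x * x + y * y + z * z);
}

// Map the range onto a normalized 0..1 interaction parameter, clamped
// between a near and far distance (the clamping scheme is an assumption).
double rangeToParameter(double range, double nearD, double farD) {
    if (range <= nearD) return 0.0;
    if (range >= farD)  return 1.0;
    return (range - nearD) / (farD - nearD);
}
```

The normalized parameter could then drive, for example, a zoom level or the transparency of the virtual model.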
• 63. Multiple Marker Concepts
  Interaction based on the relationship between markers
    e.g. when the distance between two markers falls below a threshold, invoke an action
  Tangible User Interface
  Applications:
    Memory card games
    File operations
• 64. Multiple Marker Proximity
  (scene graph diagram, distance > threshold)
  Virtual Camera → Transform A → Switch A → Models A1, A2
  Virtual Camera → Transform B → Switch B → Models B1, B2
• 65. Multiple Marker Proximity
  (scene graph diagram, distance <= threshold)
  Virtual Camera → Transform A → Switch A → Models A1, A2
  Virtual Camera → Transform B → Switch B → Models B1, B2
• 66. Multiple Marker Proximity
  Use a node callback to test for proximity and update the relevant nodes

    virtual void operator()(osg::Node* node, osg::NodeVisitor* nv) {
        if (mMarkerA != NULL && mMarkerB != NULL && mSwitchA != NULL && mSwitchB != NULL) {
            if (mMarkerA->valid() && mMarkerB->valid()) {
                osg::Vec3 posA = mMarkerA->getTransform().getTrans();
                osg::Vec3 posB = mMarkerB->getTransform().getTrans();
                osg::Vec3 offset = posA - posB;
                float distance = offset.length();

                if (distance <= mThreshold) {
                    if (mSwitchA->getNumChildren() > 1) mSwitchA->setSingleChildOn(1);
                    if (mSwitchB->getNumChildren() > 1) mSwitchB->setSingleChildOn(1);
                } else {
                    if (mSwitchA->getNumChildren() > 0) mSwitchA->setSingleChildOn(0);
                    if (mSwitchB->getNumChildren() > 0) mSwitchB->setSingleChildOn(0);
                }
            }
        }
        traverse(node, nv);
    }
• 68. Paddle Interaction
  Use one marker as a tool for selecting and manipulating objects (tangible user interface)
  Another marker provides a frame of reference
  A grid of markers can alleviate problems with occlusion
  Images: MagicCup (Kato et al.), VOMAR (Kato et al.)
• 69. Paddle Interaction
  Often useful to adopt a local coordinate system
    Allows the camera to move without disrupting Tlocal
    Places the paddle in the same coordinate system as the content on the grid
    Simplifies interaction
  osgART computes Tlocal using the osgART::LocalTransformationCallback
• 70. Tilt and Shake Interaction
  Detect types of paddle movement:
    Tilt - gradual change in orientation
    Shake - short, sudden changes in translation
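A simple way to separate the two motion types is to compare successive paddle poses: a large frame-to-frame jump in position suggests a shake, while a steady change in angle with little translation suggests a tilt. The sketch below illustrates this; PaddlePose, classifyMotion and the threshold values are assumptions for illustration, not osgART or ARToolKit API.

```cpp
#include <cmath>

// Illustrative pose of the paddle in one video frame.
struct PaddlePose {
    double tiltDeg;   // inclination of the paddle, in degrees
    double pos[3];    // position, in millimetres
};

enum PaddleMotion { MOTION_NONE, MOTION_TILT, MOTION_SHAKE };

// Classify the movement between two successive frames.
// The 30mm and 2-degree thresholds are illustrative assumptions.
PaddleMotion classifyMotion(const PaddlePose& prev, const PaddlePose& cur) {
    double dx = cur.pos[0] - prev.pos[0];
    double dy = cur.pos[1] - prev.pos[1];
    double dz = cur.pos[2] - prev.pos[2];
    double translation = std::sqrt(dx * dx + dy * dy + dz * dz);
    double rotation = std::fabs(cur.tiltDeg - prev.tiltDeg);

    if (translation > 30.0) return MOTION_SHAKE; // sudden jump between frames
    if (rotation > 2.0)     return MOTION_TILT;  // gradual change in angle
    return MOTION_NONE;
}
```

In practice one would smooth the pose over several frames and require a shake to reverse direction, but the frame-to-frame comparison is the core of the technique.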
  • 71. Building Tangible AR Interfaces with ARToolKit
• 72. Required Code
  Calculating camera position
    Range to marker
  Loading multiple patterns/models
  Interaction between objects
    Proximity
    Relative position/orientation
  Occlusion
    Stencil buffering
  Multi-marker tracking
• 74. Local vs. Global Interactions
  Local
    Actions determined from a single camera-to-marker transform
      - shaking, appearance, relative position, range
  Global
    Actions determined from two relationships
      - marker-to-camera and world-to-camera coordinates
      - marker transform determined in world coordinates
        • object tilt, absolute position, absolute rotation, hitting
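The global case composes the two relationships: inverting the world-to-camera transform and multiplying by the marker-to-camera transform gives the marker pose in world coordinates. A self-contained sketch with plain 4x4 row-major matrices (Mat4, mul, invertRigid and markerInWorld are illustrative names, not ARToolKit API):

```cpp
// Homogeneous 4x4 transform, row-major, acting on column vectors.
struct Mat4 { double m[4][4]; };

// Multiply two transforms: (a * b) applies b first, then a.
Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 r = {};
    for (int i = 0; i < 4; i++)
        for (int j = 0; j < 4; j++)
            for (int k = 0; k < 4; k++)
                r.m[i][j] += a.m[i][k] * b.m[k][j];
    return r;
}

// Invert a rigid transform (rotation R, translation t):
// inverse = (R^T | -R^T t), since R is orthonormal.
Mat4 invertRigid(const Mat4& a) {
    Mat4 r = {};
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            r.m[i][j] = a.m[j][i];          // transpose the rotation
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            r.m[i][3] -= a.m[j][i] * a.m[j][3]; // -R^T t
    r.m[3][3] = 1.0;
    return r;
}

// Marker pose in world (base) coordinates, from the world-to-camera
// transform (multi-marker tracking) and the marker-to-camera transform.
Mat4 markerInWorld(const Mat4& worldToCamera, const Mat4& markerToCamera) {
    return mul(invertRigid(worldToCamera), markerToCamera);
}
```

The resulting transform is what global interactions such as absolute position, absolute rotation, or hit tests against world-anchored targets operate on.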
• 75. Range-based Interaction
  Sample File: RangeTest.c

    /* get the camera transformation */
    arGetTransMat(&marker_info[k], marker_center, marker_width, marker_trans);

    /* find the range */
    Xpos = marker_trans[0][3];
    Ypos = marker_trans[1][3];
    Zpos = marker_trans[2][3];
    range = sqrt(Xpos*Xpos + Ypos*Ypos + Zpos*Zpos);
• 76. Loading Multiple Patterns
  Sample File: LoadMulti.c
  Uses object.c to load
  Object structure:

    typedef struct {
        char   name[256];
        int    id;
        int    visible;
        double marker_coord[4][2];
        double trans[3][4];
        double marker_width;
        double marker_center[2];
    } ObjectData_T;
• 77. Finding Multiple Transforms
  Create the object list
    ObjectData_T *object;
  Read in objects - in init()
    read_ObjData( char *name, int *objectnum );
  Find transforms - in mainLoop()
    for( i = 0; i < objectnum; i++ ) {
        ..Check patterns
        ..Find transforms for each marker
    }
• 78. Drawing Multiple Objects
  Send the object list to the draw function
    draw( object, objectnum );
  Draw each object individually
    for( i = 0; i < objectnum; i++ ) {
        if( object[i].visible == 0 ) continue;
        argConvGlpara(object[i].trans, gl_para);
        draw_object( object[i].id, gl_para);
    }
• 79. Proximity-Based Interaction
  Sample File – CollideTest.c
  Detect the distance between markers:
    checkCollisions(object[0], object[1], DIST)
  If distance < collide distance, then change the model / perform the interaction
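The distance test behind this kind of collision check can be sketched as follows; markersCollide is an illustrative stand-in, not the actual checkCollisions() from CollideTest.c. It takes the translation components of two marker transforms and compares their Euclidean distance against the collide threshold.

```cpp
#include <cmath>

// Two markers "collide" when the distance between their positions
// (in camera or world coordinates, in mm) falls below a threshold.
bool markersCollide(const double posA[3], const double posB[3],
                    double collideDist) {
    double dx = posA[0] - posB[0];
    double dy = posA[1] - posB[1];
    double dz = posA[2] - posB[2];
    return std::sqrt(dx * dx + dy * dy + dz * dz) < collideDist;
}
```

When the test succeeds, the application swaps the displayed model or triggers whatever interaction the collision represents.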
• 80. Multi-marker Tracking
  Sample File – multiTest.c
  Uses multiple markers to establish a single coordinate frame
    Reading in a configuration file
    Tracking from sets of markers
    Requires careful camera calibration
• 81. MultiMarker Configuration File
  Sample File - Data/multi/marker.dat
  Contains a list of all the patterns and their exact positions

    #the number of patterns to be recognized
    6

    #marker 1: pattern file, then pattern width + coordinate origin,
    #          then the pattern transform relative to the global origin
    Data/multi/patt.a
    40.0
    0.0 0.0
    1.0000 0.0000 0.0000 -100.0000
    0.0000 1.0000 0.0000   50.0000
    0.0000 0.0000 1.0000    0.0000
    …
• 82. Camera Transform Calculation
  Include <AR/arMulti.h>
  Link to libARMulti.lib
  In mainLoop():
    Detect markers as usual
      arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num)
    Use the MultiMarker function
      if( (err = arMultiGetTransMat(marker_info, marker_num, config)) < 0 ) {
          argSwapBuffers();
          return;
      }
• 83. Paddle-based Interaction
  Tracking a single marker relative to a multi-marker set
    - the paddle contains a single marker
• 84. Paddle Interaction Code
  Sample File – PaddleDemo.c
  Get the paddle marker location + draw the paddle before drawing the background model

    paddleGetTrans(paddleInfo, marker_info, marker_flag, marker_num, &cparam);

    /* draw the paddle */
    if( paddleInfo->active ){
        draw_paddle( paddleInfo );
    }

  draw_paddle uses a stencil buffer to increase realism
• 85. Paddle Interaction Code II
  Sample File – paddleDrawDemo.c
    Finds the paddle position relative to the global coordinate frame:
      setBlobTrans(Num, paddle_trans[3][4], base_trans[3][4])
  Sample File – paddleTouch.c
    Finds the paddle position:
      findPaddlePos(&curPadPos, paddleInfo->trans, config->trans);
    Checks for collisions:
      checkCollision(&curPaddlePos, myTarget[i].pos, 20.0)
• 86. General Tangible AR Library
  command_sub.c, command_sub.h
  Contains functions for recognizing a range of different paddle motions:

    int check_shake( );
    int check_punch( );
    int check_incline( );
    int check_pickup( );
    int check_push( );

  e.g. to check the angle between paddle and base:
    check_incline(paddle->trans, base->trans, &ang)
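One plausible way to compute such an incline angle, shown here as a sketch (inclineAngleDeg is an illustrative function, not the actual check_incline() implementation): take the angle between the paddle's and the base's plane normals, i.e. the third columns of the rotation parts of their 3x4 transforms.

```cpp
#include <cmath>

// Angle in degrees between the plane normals of two ARToolKit-style
// 3x4 transforms (the normal is the third column of the rotation part).
double inclineAngleDeg(const double paddleTrans[3][4],
                       const double baseTrans[3][4]) {
    const double kPi = 3.14159265358979323846;
    double dot = 0.0;
    for (int i = 0; i < 3; i++)
        dot += paddleTrans[i][2] * baseTrans[i][2];
    // Clamp against rounding error before acos.
    if (dot > 1.0)  dot = 1.0;
    if (dot < -1.0) dot = -1.0;
    return std::acos(dot) * 180.0 / kPi;
}
```

An application would then compare the returned angle against a threshold to decide whether the paddle is being deliberately tilted, e.g. to pour a virtual object off the paddle.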