This document proposes a method for real-time hand tracking based on skin-color detection and blob matching. Hands are detected by identifying skin-color pixels and regions that move between frames; tracking is then performed by matching skin-color blobs across frames using their color distributions and a set of predefined tracking rules. The method aims to provide fast, stable hand tracking for human-computer interaction applications, and experimental results showed that the algorithm runs reliably in real time.
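The summary above describes a two-stage pipeline: classify skin-color pixels, then track the resulting blobs. As a rough illustration (not the paper's actual method), the sketch below uses a classic rule-based RGB skin classifier and a simple blob centroid; the specific thresholds and helper names are assumptions for demonstration only.

```python
import numpy as np

def skin_mask(rgb):
    """Rule-based skin-color classifier (illustrative thresholds,
    not those used in the paper)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return ((r > 95) & (g > 40) & (b > 20) &
            (r > g) & (r > b) &
            ((r - np.minimum(g, b)) > 15))

def blob_centroid(mask):
    """Centroid of skin pixels -- a stand-in for the paper's
    blob matching across frames."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None  # no skin blob found in this frame
    return (xs.mean(), ys.mean())

# Example: a 4x4 frame with a skin-colored 2x2 patch.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[1:3, 1:3] = [200, 120, 80]
centroid = blob_centroid(skin_mask(frame))  # (1.5, 1.5)
```

A full tracker would compare blob color histograms between consecutive frames and apply the paper's matching rules; the centroid shown here is only the simplest per-frame feature one could track.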
The document describes experiments conducted to develop new interfaces for networked games that make virtual and real spaces seem seamless. It discusses two experiments: 1) A marker-based interface for an online dice game that uses physical dice with markers tracked by a camera. 2) A marker-less interface for an online trading card game that uses real trading cards instead of virtual ones. The goal is for the interfaces to better reflect natural human behaviors and increase immersion in the games.
The document discusses augmented reality (AR) and its potential applications. It defines AR as using transparent head-mounted displays to overlay computer-generated images onto the physical environment. It mentions how AR could be used to overlay additional information and knowledge to enhance learning. The document also discusses different types of AR including wearable, spatial, hand-held, and projection-based AR and some examples of technologies in each category. It raises issues about using AR on mobile devices and discusses how AR could enable new forms of collaboration and interaction between people.
The document describes a system for 3D modeling using hand gestures as input. It uses a vision-based tracking system to recognize hand gestures without any instruments attached to the hands. The system supports basic modeling tasks like selection, translation, rotation, and scaling of 3D objects using just five static hand gestures. Visual feedback is provided to help users perceive interactions. The goal is to provide an intuitive interface for 3D modeling that requires little or no training.
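The modeling tasks listed above (selection, translation, rotation, scaling) map naturally onto linear transforms of a vertex array. The sketch below illustrates that mapping with plain NumPy; the gesture names and dispatch are hypothetical, since the summary does not name the five static gestures.

```python
import numpy as np

def translate(verts, offset):
    """Move all vertices by a fixed offset."""
    return verts + np.asarray(offset)

def scale(verts, factor, center=0.0):
    """Scale vertices about a center point."""
    return center + (verts - center) * factor

def rotate_z(verts, theta):
    """Rotate vertices about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    return verts @ rot.T

# Hypothetical gesture-to-operation dispatch table; the real system
# would bind its five recognized static gestures to these operations.
GESTURE_OPS = {
    "grab":   translate,
    "pinch":  scale,
    "twist":  rotate_z,
}

tri = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
moved = GESTURE_OPS["grab"](tri, [1.0, 2.0, 3.0])
```

The point of the table is the design the summary implies: recognition produces a discrete gesture label, and the modeling layer only needs a label-to-transform lookup, which is why so few static gestures suffice.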
This paper advocates for a new type of augmented reality (AR) interface called Tangible AR. Tangible AR interfaces combine the enhanced display possibilities of AR with intuitive physical manipulation from tangible user interfaces (TUIs). Specifically, 1) each virtual object is registered to a corresponding physical object, and 2) users interact with virtual objects by manipulating the physical objects. The paper presents some prototype Tangible AR interfaces and argues they support seamless interaction between real and virtual worlds through natural physical manipulations.
The document discusses social networking with robot pets. It describes the functionality of R.Society, including caring for robot pets offline, interacting with them online through games and communities, and conversing with them via mobile devices. It also mentions potential future uses, such as home security or serving as a personal secretary that learns and evolves over time. Examples of existing robot pets are provided, such as NEC's PaPeRo, along with the sensors and controllers it uses. Current research areas involving robot pets are outlined, including computer vision for human detection and face recognition, conveying emotions through sensors, augmented reality games, and modeling emotions and feelings.
This document discusses augmented reality (AR) interfaces for visual analytics. It provides an overview of AR, including its key characteristics of combining real and virtual images in an interactive and registered 3D space. Examples are given of medical imaging applications and trials of AR interfaces. Design principles for AR user interfaces are outlined, including using physical objects, appropriate interaction metaphors, and principles from tangible interfaces. Case studies demonstrate AR lenses and collaborative AR interfaces. The document concludes with a discussion of future directions such as mobile phone AR, collaborative AR, and ubiquitous VR (UbiVR).