High Fidelity Telepresence Systems: Design, Control, and Evaluation

Martin Buss, Martin Kuschel, Kwang-Kyu Lee, Angelika Peer, Bartlomiej Stanczyk, Marc Ueberle, Ulrich Unterhinninghofen
Institute of Automatic Control Engineering (LSR) — Technische Universität München, D-80290 Munich, Germany
www.lsr.ei.tum.de  www.sfb453.de

Abstract. An overview of subjectively selected recent topics and research trends in the area of modern telepresence and teleaction is given. Multi-modal telepresence systems are intended to enable human operators to become present and active in remote or inaccessible environments through a communication network and a suitable telerobotic system. The major challenges of such a multi-modal telepresence system, such as stabilizing measures and transparency, e.g. in the case of time delay (latency) in the communication network, are discussed. A practical implementation of an integrated mobile and bimanual multi-user telepresence and teleaction system as well as some exemplary experiments are presented. Furthermore, it is discussed how the performance of telepresence systems can be improved by using psychophysical knowledge of human perception.

Keywords. Telepresence, Multimodality, Psychophysics

I. Introduction

Multi-modal telepresence and teleaction systems include classical teleoperation and telemanipulation systems. An important issue is the combination of telepresence and teleaction, allowing the human operator to act in remote environments. Here, remote environments include possibly distant and/or scaled physical environments, virtual environments (VEs), and augmented realities.

One of the central issues in modern telepresence systems is multi-modality in the human-system interface (HSI), accompanied by appropriate sensing techniques at the teleoperator site, comprising theoretically all the human senses. In current technical applications the most important and only partly realized are the visual, auditory, and haptic — i.e. kinesthetic, tactile, and temperature — senses.

Application areas of telepresence and teleaction systems are countless, to name only a few: tele-programming, tele-installation, tele-diagnosis, tele-service, tele-maintenance, tele-assembly, tele-manufacturing, miniature or micro mechatronics, inner and outer space operations, tele-teaching, tele-medicine, tele-surgery, tele-shopping, etc.

Haptic (force and tactile) feedback systems are one of the key elements in modern telepresence and virtual environment systems. Telepresence systems are most often operated over Internet communication, which means that the haptic control loop is closed over an unreliable communication channel, posing additional challenges for control architectures.

Related overview articles addressing telepresence, haptics, and Internet control are [15, 16, 29, 36, 72]; see also a special section in Presence [10, 41, 58, 65–67] and a forthcoming book about telepresence [3]. Sections II, III, and IV discuss the general structure of multi-modal telepresence systems, the design of an integrated mobile and bimanual multi-user telepresence/teleaction system, and psychophysical aspects, respectively.

II. Multi-Modal Telepresence Systems

The structure of a multi-modal telepresence system is shown in Fig. 1. On the operator site the human operator gives multi-modal command inputs to the human system interface (HSI) using motion, voice, or symbolic input devices. The commands are transmitted to the executing teleoperator on the remote site across (communication or scale) barriers. The teleoperator is an executing robotic system, such as a mobile service robot, and is controlled according to the commands received from the human operator. Sensors mounted on the teleoperator measure the interaction between the teleoperator and the environment. Typically visual, acoustic, force, and tactile sensors are used. Measured data is transmitted back to the human operator and displayed using modality-dependent hardware in the multi-modal HSI comprising multimedia and haptics.

Fig. 1: Multi-modal telepresence system

One of the key issues in telepresence system design and operation is the degree of coupling between the human operator and the teleoperator. If the operator gives symbolic commands to the teleoperator by pushing buttons and watching the resulting action in the remote environment, the coupling is weak. The coupling is strong for the haptic modality in a bilateral teleoperation scenario. Commonly, the motion (force) of the human operator is measured, communicated, and used as the set-point for the teleoperator motion (force) controller. On the remote site the resulting forces (motion) of the teleoperator in the environment are sensed, communicated, and fed back to the human operator through the force feedback channel of the multi-modal HSI.
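The strong bilateral coupling just described can be made concrete with a minimal discrete-time sketch: the operator position is forwarded as the slave set-point, and the measured environment force is returned to the haptic display. This is a 1-DOF illustration only; the gains, slave mass, and spring-like wall are assumed values, not parameters of any system discussed in this paper.

```python
# Minimal 1-DOF bilateral teleoperation loop (illustrative sketch).
# Assumptions: ideal communication (no delay), a PD-controlled slave,
# and a spring-like wall as the remote environment.

DT = 0.001            # control cycle [s]
KP, KD = 400.0, 40.0  # slave PD tracking gains (assumed)
K_ENV = 2000.0        # stiffness of the virtual wall [N/m] (assumed)
WALL = 0.05           # wall location [m]

def env_force(x):
    """Spring-like environment: pushes back once the slave penetrates the wall."""
    return -K_ENV * (x - WALL) if x > WALL else 0.0

def simulate(master_positions, m_slave=1.0):
    """Track the communicated master positions; return the fed-back forces."""
    x_s = v_s = 0.0
    feedback = []
    for x_m in master_positions:
        f_env = env_force(x_s)
        f_ctrl = KP * (x_m - x_s) - KD * v_s    # set-point tracking controller
        v_s += (f_ctrl + f_env) / m_slave * DT  # slave dynamics (explicit Euler)
        x_s += v_s * DT
        feedback.append(-f_env)                 # force shown to the operator
    return feedback

# The operator slowly pushes 0.1 m into the scene and feels the wall stiffen.
forces = simulate([i * 1e-5 for i in range(10000)])
print(f"peak reflected force: {max(forces):.1f} N")
```

In a real system the two loops run on separate sites with the communication channel in between, which is exactly where the stability issues discussed below arise.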
The literature on telepresence systems distinguishes between remote, shared, cooperative, assisting, semi-autonomous, symbolic, and trading control; see e.g. [2, 11, 12, 25, 74] and [15] for a detailed discussion.

Fig. 2: Multi-modal control loops in telepresence systems

The multi-modal control loop of telepresence systems is shown in Fig. 2. From the human operator the multi-modal command, consisting of voice, haptic (force/motion), and symbolic input, is transmitted across the (communication) barrier to the teleoperator. Simultaneously, the command is also input to a model of the teleoperator and remote environment implemented in a virtual environment for local feedback. Data measured by teleoperator sensors results in multi-modal feedback to the human operator across the barrier. Multi-modal feedback consists of 2D or 3D visual, mono/stereo acoustic, haptic (force, motion, tactile, temperature), and symbolic information. Also shown is the remote local control loop, which uses a human operator model to increase the autonomy of the teleoperator by implementing human task skill.

Multi-modal human system interaction with purely virtual environments has various important applications. Operator training is possible without risk for the humans involved. A classical training application is flight simulators for pilot training, where the supported modalities have mainly been visual feedback and acceleration emulation. Medical applications, e.g. practice runs of complicated surgical procedures, are being developed for surgeon training [73]. A system for multi-modal interaction with a virtual (possibly injured) human knee joint is described in [68]. Virtual environments are also being used to extract operator expertise and to transfer and implement this knowledge for semi-autonomous local teleoperator control, see Fig. 2 and [11, 13, 20, 26, 48, 85].

Feedback to the operator through the human system interface is often augmented, i.e. remote data is fused with supplemental data from a remote environment model. Augmentation on the remote site uses human control expertise from a human operator model to support local control of the teleoperator. Augmentation is possible for all mentioned human modalities, but the most established are kinesthetic and (predictive) visual augmentation. In a bilateral kinesthetic teleoperation scenario, local feedback has been used successfully to achieve stability (in fact passivity) of the communication link by interpreting it as a lossless two-port [1, 5, 44, 50]. A visual predictive display has been proposed using first a wire-frame and later a solid model image generated by computer graphics hardware, which is then blended or overlaid with the camera image from the remote site [8, 35, 43]. Work by our colleagues in Munich aims at photorealistic predictive displays [9, 10, 24, 62].

III. Integrated Mobile and Bimanual Multi-User Telepresence and Teleaction

Existing telepresence systems show deficits with respect to workspace, manipulability, and performance. These deficits can partly be ascribed to a limited workspace of the haptic display or telemanipulator. Moreover, most telepresence systems are limited to the few degrees of freedom necessary for a specific task, which makes intuitive manipulation very difficult.

The demand for intuitive manageability of a telemanipulation system results inevitably in a bimanual, mobile, and kinematically redundant system. Enhancing such a system by further adding a multi-user mode enables collaborative telemanipulation tasks. Our institute aims at developing and implementing such a high-definition telepresence system, see Fig. 3. In order to realize such a system, the following research areas can be distinguished: A) stationary bimanual telepresence/teleaction in full 6 DOF, B) integrated mobile, bimanual telepresence/teleaction, C) multi-user mobile telepresence/teleaction. In the following, these research topics and the current state of our research are briefly discussed.

Fig. 3: Mobile and bimanual haptic tele-collaboration

A. Stationary bimanual telepresence and teleaction in full 6 DOF

In order to enable intuitive telemanipulation, the hyper-redundant haptic display ViSHaRD10 (Virtual Scenario Haptic Rendering Device with 10 actuated DOF) is used as a human system interface, see Fig. 4. Its main characteristics are a very large workspace free of singularities, a high payload capability to accommodate various application-specific end-effectors, e.g. surgical tools like drills [27] or scissors, redundancy foreseen to avoid kinematic singularities and user interference, and the possibility of dual-arm haptic interaction with full 6 DOF (again, redundancy facilitates collision avoidance between the two arms).
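The lossless two-port interpretation of the communication link mentioned in Section II is classically obtained with wave (scattering) variables: velocity and force are encoded as waves before transmission, so a constant delay can only store wave energy, never generate it. The sketch below illustrates this energy bookkeeping; the wave impedance, delay, and spring environment are assumed values for illustration, not the implementation of the cited works.

```python
import math
from collections import deque

# Wave-variable (scattering) sketch of a constant-delay channel as a lossless
# two-port. Illustrative only: B, T, and the spring environment are assumptions.

B = 10.0                   # assumed wave impedance [N s/m]
SQ = math.sqrt(2 * B)
T = 40                     # one-way delay in samples
DT = 0.001                 # sample time [s]
K_ENV = 500.0              # slave-side spring environment [N/m]

u_line = deque([0.0] * T)  # waves travelling master -> slave
v_line = deque([0.0] * T)  # waves travelling slave -> master

x_s = 0.0                  # slave position
net_energy_in = 0.0        # energy the two ports inject into the channel

for k in range(2000):
    v_m = 0.05 * math.sin(2 * math.pi * k * DT)  # operator hand velocity
    # master port: decode the returning wave into a force, encode a new wave
    v_wave = v_line.popleft()
    f_m = B * v_m - SQ * v_wave       # force displayed to the operator
    u_line.append(SQ * v_m - v_wave)  # outgoing wave
    # slave port: decode the arriving wave against the environment force
    u_arr = u_line.popleft()
    f_s = K_ENV * x_s
    v_sd = (SQ * u_arr - f_s) / B     # velocity command for the slave
    x_s += v_sd * DT
    v_line.append(u_arr - math.sqrt(2 / B) * f_s)
    # power bookkeeping: flows in at the master port, out at the slave port
    net_energy_in += (f_m * v_m - f_s * v_sd) * DT

# energy still travelling inside the delay lines
stored = 0.5 * DT * (sum(w * w for w in u_line) + sum(w * w for w in v_line))
print(net_energy_in, stored)
```

In exact arithmetic the energy injected at the two ports equals the wave energy still in transit, which is non-negative by construction; this is the passivity argument in a nutshell, and it holds for any constant delay.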
In order to provide effective compensation of disturbances due to friction and to be able to render inertia and mass, admittance control has been implemented for this device. An appropriate inverse kinematics algorithm enables a reasonable redundancy resolution. Further details about the design concept, the kinematic model, and the control of ViSHaRD10 can be found in [79–83].

Fig. 4: Bimanual haptic display ViSHaRD10
Fig. 5: Dual-arm telemanipulator

The superior manipulation dexterity of humans is a result of the kinematic redundancy of the human arms and the ability to adapt their compliance to the current task. Like many technical design solutions inspired by nature, an anthropomorphic bimanual redundant telemanipulator has been designed, see Fig. 5. The telemanipulator consists of two identical, human-scaled arms. Each arm consists of two spherical joints with 3 DOF each at shoulder and wrist, and one revolute joint at the elbow, which results in 7 DOF, see [75, 76, 78]. The redundancy of the slave is efficiently utilized to fulfill additional kinematic or dynamic tasks, e.g. to avoid singularities or joint limits and to increase the structural stiffness of the arm in contact situations [14]. During telemanipulation, the telemanipulator has to handle interactions with unstructured rigid environments. For this reason, a control algorithm that guarantees compliant behavior during contact is applied, see [14, 63, 64, 77].

In order to combine these two devices into a bimanual telemanipulation system, a coupling method for devices with different kinematic structures has been developed. In addition, the implemented control algorithms for the haptic display and the telemanipulator ensure a stable interaction with the environment. In several experiments, tracking of free-space motion, haptic exploration of different materials, as well as fixing a screw by telepresence have been successfully demonstrated, see Fig. 6 and [14, 63, 64]. The extension of this system to bimanual manipulation requires further analysis of possible contact situations and the investigation of new stable control algorithms.

Fig. 6: Experimental setup: tele-screw-tightening (stereo camera head and HMD for vision; ViSHaRD10 master exchanging position and force signals with a 7-DOF telerobot slave)

B. Integrated mobile, bimanual telepresence/teleaction

In order to enable telepresence in arbitrarily large remote environments, the telemanipulator is mounted on a mobile base which can freely move around in the remote environment [14, 28, 64]. For maintaining a high degree of immersion in wide-area telepresence, it is crucial to convey a natural feeling of locomotion. This is achieved by also placing the haptic interface on a mobile base, which allows operator motions to be tracked and forces to be reflected at the same time, see Fig. 3. The mobile haptic interface (MHI) can be used in wide-area telepresence as well as in extensive virtual environments [51, 54, 59]. Related work can be found in [4, 7, 17, 19, 22, 37, 61, 70, 84].

A problem common to both applications of an MHI is the limited workspace at the operator site. Techniques like scaling or indexing have been shown to considerably reduce the feeling of presence in the target environment (see [6, 18, 66, 71]). Using the concept of motion compression [52, 53, 55–58, 60, 69], the path in the remote environment is transformed in such a way that it fits into the available operator space, see Fig. 7. As long as the curvature deviation between the original and the transformed path is kept below a certain threshold, the operator cannot perceive compression artifacts.

Fig. 7: Trajectories of a test run in user environment (left) and target environment (right) [57]
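The curvature criterion above can be checked with a simple discrete-curvature computation over sampled paths. The following sketch is illustrative only: the threshold value and the example paths are assumptions, whereas the actual perceptual thresholds are determined psychophysically in the cited work.

```python
import math

# Discrete curvature check in the spirit of motion compression: a transformed
# path is deemed acceptable if its curvature nowhere deviates from the
# original path's curvature by more than a (here assumed) perceptual threshold.

def curvature(path):
    """Discrete curvature at interior samples via the circumcircle of three
    consecutive points: k = 4 * area / (a * b * c)."""
    ks = []
    for (x1, y1), (x2, y2), (x3, y3) in zip(path, path[1:], path[2:]):
        a = math.dist((x1, y1), (x2, y2))
        b = math.dist((x2, y2), (x3, y3))
        c = math.dist((x1, y1), (x3, y3))
        area2 = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))  # 2 * area
        ks.append(2.0 * area2 / (a * b * c) if a * b * c > 1e-12 else 0.0)
    return ks

def imperceptible(original, transformed, threshold=0.1):
    """True if the maximum curvature deviation stays below the threshold [1/m]."""
    dev = [abs(ko - kt)
           for ko, kt in zip(curvature(original), curvature(transformed))]
    return max(dev) < threshold

# a straight corridor vs. a gently curved version that fits a smaller room
line = [(0.1 * i, 0.0) for i in range(50)]
bent = [(0.1 * i, 0.005 * (0.1 * i) ** 2) for i in range(50)]
print(imperceptible(line, bent))  # → True
```

A sharp corner, by contrast, produces a large curvature deviation and would be flagged as perceivable; the real motion compression algorithms optimize the transformation to stay below threshold rather than merely test it.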
C. Multi-user mobile telepresence/teleaction

Finally, the collaboration of multiple human operators in a telepresence and teleaction scenario is currently being investigated. The human operators interact with mobile haptic interfaces and control mobile teleoperators located at the remote site. The main research topics in this field are the development of control algorithms for collaborative telemanipulation and task sharing, as well as automatic collision avoidance between the teleoperators.

IV. Telepresence and Psychophysics

A. Dynamical model of human perception

Another way to improve telepresence systems is to take into account psychophysical aspects of human perception. To this end, multimodal processes are described quantitatively by a systems-theoretical model providing statical, dynamical, and statistical information. On the basis of a structural description [39], we investigate multimodal processes normally elicited within telepresence (haptic, visual, and auditory). Thereby, we concentrate on crossmodal interactions and sensory processes [42, 45]. We use psychophysical models to develop data reduction algorithms and new kinds of transparency measures to be used in haptic telepresence [30–34, 49]. High-fidelity telepresence systems, such as a multimodal bimanual human system interface or several tactile displays (shear force, thermal), serve as experimental testbeds [21, 23, 38, 40, 46, 47].

V. Conclusion

An overview of the general structure of multi-modal telepresence and teleaction systems has been given. Typical control modes in multi-modal telepresence systems, such as remote, shared, cooperative, assisting, trading, symbolic, and semi-autonomous control, were briefly discussed. A mobile and bimanual multi-user telepresence and teleaction system was presented, and some of the challenging open research problems related to bimanual, mobile, and collaborative telemanipulation were discussed. An interdisciplinary approach using psychophysical aspects of human perception to improve telepresence systems has been discussed.

ACKNOWLEDGMENTS

This work is supported in part by the German Research Foundation (DFG) within the collaborative research center SFB453. Additional research team members: Prof. G. Schmidt; Dr. F. Freyberger; S. Hirche, A. Kron, N. Nitzsche. Technical staff: J. Gradl, W. Jaschik, H. Kubick, T. Lowitz, T. Stoeber. ViSHaRD10 has been developed as part of the TOUCH-HapSys project financially supported by the 5th Framework IST Programme of the European Union, action line IST-2002-6.1.1, contract number IST-2001-38040. The authors are solely responsible for the content of this paper; it does not necessarily represent the opinion of the European Community.

REFERENCES

[1] M. Buss and G. Schmidt, "Control Problems in Multi-Modal Telepresence Systems," in Advances in Control: Highlights of the 5th European Control Conference ECC'99 in Karlsruhe, Germany (P. Frank, ed.), pp. 65–101, Springer, 1999.
[2] M. Buss and G. Schmidt, "Multi-Modal Telepresence," in Proceedings of the 17th International Mechatronics Forum, Science and Technology Forum 2000, Plenary, (Kagawa University, Kagawa, Japan), pp. 24–33, 2000.
[3] S. E. Salcudean, "Control for Teleoperation and Haptic Interfaces," in Control Problems in Robotics and Automation, no. 230 in Lecture Notes, pp. 51–66, Berlin: Springer Verlag, 1998.
[4] V. Hayward, O. Astley, M. Cruz-Hernandez, D. Grant, and G. Robles-De-La-Torre, "Haptic interfaces and devices," Sensor Review, vol. 24, no. 1, pp. 16–29, 2004.
[5] J. M. Hollerbach, "Some current issues in haptics research," in Proc. of the IEEE International Conference on Robotics & Automation, (San Francisco, California), pp. 757–762, 2000.
[6] P. Kammermeier, A. Kron, J. Hoogen, and G. Schmidt, "Display of holistic haptic sensations by combined tactile and kinesthetic feedback," Presence: Teleoperators and Virtual Environments, vol. 13, no. 1, pp. 1–15, 2004.
[7] B. Petzold, M. Zaeh, B. Faerber, B. Deml, H. Egermeier, J. Schilp, and S. Clarke, "A study on visual, auditory, and haptic feedback for assembly tasks," Presence: Teleoperators and Virtual Environments, vol. 13, no. 1, pp. 16–21, 2004.
[8] T. Burkert, J. Leupold, and G. Passig, "A photorealistic predictive display," Presence: Teleoperators and Virtual Environments, vol. 13, no. 1, pp. 22–43, 2004.
[9] N. Nitzsche, U. Hanebeck, and G. Schmidt, "Motion compression for telepresent walking in large target environments," Presence: Teleoperators and Virtual Environments, vol. 13, no. 1, pp. 44–60, 2004.
[10] M. Popp, E. Platzer, M. Eichner, and M. Schade, "Walking with and without walking: Perception of distance in large-scale urban areas in reality and in virtual reality," Presence: Teleoperators and Virtual Environments, vol. 13, no. 1, pp. 61–76, 2004.
[11] D. Reintsema, C. Preusche, T. Ortmaier, and G. Hirzinger, "Toward high-fidelity telepresence in space and surgery robotics," Presence: Teleoperators and Virtual Environments, vol. 13, no. 1, pp. 77–98, 2004.
[12] R. Aracil, C. Balaguer, M. Buss, M. Ferre, and C. Melchiorri, eds., Advances in Telerobotics: Human Interfaces, Control, and Applications. Springer, STAR series, 2006, to appear.
[13] R. Anderson, "Autonomous, Teleoperated, and Shared Control of Robot Systems," in Proceedings of the IEEE International Conference on Robotics and Automation, (Minneapolis, Minnesota), pp. 2025–2032, 1996.
[14] M. Buss, Study on Intelligent Cooperative Manipulation. PhD thesis, University of Tokyo, Tokyo, June 1994.
[15] M. Buss and H. Hashimoto, "Skill Acquisition and Transfer System as Approach to the Intelligent Assisting System—IAS," in Proceedings of the 2nd IEEE Conference on Control Applications, (Vancouver, British Columbia, Canada), pp. 451–456, 1993.
[16] C. Fischer, M. Buss, and G. Schmidt, "HuMan-Robot-Interface for Intelligent Service Robot Assistance," in Proceedings of the IEEE International Workshop on Robot and Human Communication (ROMAN), (Tsukuba, Japan), pp. 177–182, 1996.
[17] T. B. Sheridan, Telerobotics, Automation, and Human Supervisory Control. Cambridge, Massachusetts: MIT Press, 1992.
[18] R. Satava and S. Jones, "Virtual Environments for Medical Training and Education," Presence, vol. 6, pp. 139–146, 1997.
[19] R. Riener, J. Hoogen, G. Schmidt, M. Buss, and R. Burgkart, "Knee Joint Simulator Based on Haptic, Visual, and Acoustic Feedback," in Preprints of the 1st IFAC Conference on Mechatronic Systems, (Darmstadt, Germany), pp. 579–583, 2000.
[20] M. Buss and H. Hashimoto, "Motion Scheme for Dextrous Manipulation in the Intelligent Cooperative Manipulation System—ICMS," in Intelligent Robots and Systems (V. Graefe, ed.), pp. 279–294, Amsterdam: Elsevier Science, 1995.
[21] N. Delson and H. West, "Robot programming by human demonstration: Subtask compliance controller identification," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, pp. 33–41, 1993.
[22] H. Friedrich, J. Holle, and R. Dillmann, "Interactive Generation of Flexible Robot Programs," in Proceedings of the IEEE International Conference on Robotics and Automation, (Leuven, Belgium), pp. 538–543, 1998.
[23] Y. Kunii and H. Hashimoto, "Tele-teaching by human demonstration in virtual environment for robotic network system," in Proceedings of the IEEE International Conference on Robotics and Automation, (Albuquerque, New Mexico), pp. 405–410, 1997.
[24] J. Yang, Y. Xu, and C. Chen, "Hidden Markov model approach to skill learning and its application in telerobotics," in Proceedings of the IEEE International Conference on Robotics and Automation, (Atlanta, Georgia), pp. 396–402, 1993.
[25] R. J. Anderson and M. Spong, "Bilateral control of operators with time delay," IEEE Trans. on Automatic Control, no. 34, pp. 494–501, 1989.
[26] H. Baier, M. Buss, and G. Schmidt, "Control Mode Switching for Teledrilling Based on a Hybrid System Model," in Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics AIM'97, (Tokyo, Japan), Paper No. 50, 1997.
[27] K. Kosuge, H. Murayama, and K. Takeo, "Bilateral Feedback Control of Telemanipulators via Computer Network," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, (Osaka, Japan), pp. 1380–1385, 1996.
[28] G. Niemeyer and J. Slotine, "Towards Force-Reflecting Teleoperation Over the Internet," in Proceedings of the IEEE International Conference on Robotics and Automation, (Leuven, Belgium), pp. 1909–1915, 1998.
[29] A. Bejczy, W. Kim, and S. Venema, "The phantom robot: Predictive displays for teleoperation with time delay," in Proceedings of the IEEE International Conference on Robotics and Automation, (Cincinnati, Ohio), pp. 546–551, 1990.
[30] G. Hirzinger, "Rotex – the first robot in space," in Proceedings of the ICAR International Conference on Advanced Robotics, pp. 9–33, 1993.
[31] W. Kim, "Virtual Reality Calibration and Preview/Predictive Displays for Telerobotics," Presence, vol. 5, pp. 173–190, June 1996.
[32] T. Burkert and G. Färber, "Photo-realistic scene prediction," in Proceedings of the IEEE International Conference on Mechatronics and Robotics, MechRob2004, (Aachen, Germany), Special Session on Telepresence and Teleaction, 2004.
[33] C. Eberst, N. Stöffler, M. Barth, and G. Färber, "Compensation of time delays in telepresence applications by photorealistic scene prediction of partially unknown environments," in Proceedings of the IASTED International Conference on Robotics and Applications RA'99, (Santa Barbara, CA), pp. 163–168, 1999.
[34] G. Passig, T. Burkert, and J. Leupold, "Scene model acquisition for a photo-realistic predictive display and its application to endoscopic surgery," in MIRAGE 2005: International Conference on Computer Vision / Computer Graphics Collaboration Techniques and Applications, (INRIA Rocquencourt, France), pp. 89–97, March 2005.
[35] K. Y. H. Esen and M. Buss, "A Control Algorithm and Preliminary User Studies For A Bone Drilling Medical Training System," in Proceedings of RO-MAN, (California, USA), 2003.
[36] M. Ueberle, N. Mock, and M. Buss, "Towards a Hyper-Redundant Haptic Display," in Proceedings of the International Workshop on High-Fidelity Telepresence and Teleaction, jointly with the IEEE conference HUMANOIDS'2003, (Munich, Germany), 2003.
[37] M. Ueberle, N. Mock, A. Peer, C. Michas, and M. Buss, "Design and Control Concepts of a Hyper Redundant Haptic Interface for Interaction with Virtual Environments," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, Workshop on Touch and Haptics, (Sendai, Japan), 2004.
[38] M. Ueberle, N. Mock, and M. Buss, "ViSHaRD10, a Novel Hyper-Redundant Haptic Interface," in Proceedings of the 12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, in conjunction with the IEEE Virtual Reality 2004 conference, (Chicago, IL), pp. 58–65, 2004.
[39] M. Ueberle and M. Buss, "Design and control of a hyperredundant haptic interface," in Proceedings of the 9th International Symposium on Experimental Robotics 2004 ISER2004, (Singapore), Paper ID 130, 2004.
[40] M. Ueberle, N. Mock, and M. Buss, "Design, control, and evaluation of a hyper-redundant haptic interface," in Advances in Telerobotics: Human Interfaces, Control, and Applications (R. Aracil, C. Balaguer, M. Buss, M. Ferre, and C. Melchiorri, eds.), Springer, STAR series, 2006, to appear.
[41] B. Stanczyk and M. Buss, "Development of a telerobotic system for exploration of hazardous environments," in Proceedings of the IEEE International Conference on Intelligent Robots and Systems (IROS), (Sendai, Japan), 2004.
[42] B. Stanczyk, S. Hirche, and M. Buss, "Telemanipulation over the internet: a tele-assembly experiment," in Proceedings of the IEEE International Conference on Mechatronics and Robotics, MechRob2004, (Aachen, Germany), 2004.
[43] B. Stanczyk and M. Buss, "Towards teleoperated exploration of hazardous environments: Control and experimental study of a kinematically dissimilar master-slave structure," in ROBOTIK 2004, June 17–18, 2004.
[44] M. Buss, K. Lee, N. Nitzsche, A. Peer, B. Stanczyk, M. Ueberle, and U. Unterhinninghofen, "Advanced telerobotics: Dual-handed and mobile remote manipulation," in Advances in Telerobotics: Human Interfaces, Control, and Applications (R. Aracil, C. Balaguer, M. Buss, M. Ferre, and C. Melchiorri, eds.), Springer, STAR series, 2006, to appear.
[45] B. Stanczyk and M. Buss, "Experimental comparison of interaction control methods for a redundant telemanipulator," in Proceedings of the International Symposium on Methods and Models in Automation and Robotics MMAR'2005, (Międzyzdroje, Poland), pp. 677–682, 2005.
[46] A. Peer, B. Stanczyk, and M. Buss, "Haptic telemanipulation with dissimilar kinematics," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, (Edmonton, Canada), pp. 2483–2487, 2005.
[47] A. Peer, U. Unterhinninghofen, K. Lee, B. Stanczyk, and M. Buss, "Haptic telemanipulation in extensive remote environments," in Proceedings of the Joint International COE/HAM SFB-453 Workshop on Human Adaptive Mechatronics and High-Fidelity Telepresence, (Tokyo, Japan), pp. 57–62, 2005.
[48] U. D. Hanebeck and N. Saldić, "A modular wheel system for mobile robot applications," in Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), (Kjongju, Korea), pp. 17–23, 1999.
[49] N. Nitzsche, U. D. Hanebeck, and G. Schmidt, "Design issues of mobile haptic interfaces," Journal of Robotic Systems, vol. 20, pp. 549–556, June 2003.
[50] N. Nitzsche and G. Schmidt, "Force-reflecting mobile teleoperation," in Proc. of RAAD'04, (Brno, ČR), 2004.
[51] N. Nitzsche, Weiträumige Telepräsenz: Unbeschränkte Fortbewegung und haptische Interaktion. PhD thesis, Munich, Germany, 2005.
[52] F. Barbagli, A. Formaglio, M. Franzini, A. Giannitrapani, and D. Prattichizzo, "An experimental study of the limitations of mobile haptic interfaces," in Proc. Int. Symp. on Experimental Robotics, (Singapore), 2004.
[53] J. Park and O. Khatib, "Robust haptic teleoperation of a mobile manipulation platform," in Proc. Int. Symp. on Experimental Robotics, (Singapore), 2004.
[54] H. Roth, K. Schilling, and O. J. Rösch, "Haptic interfaces for remote control of mobile robots," in Proc. of the IFAC 15th Triennial World Congress, (Barcelona, Spain), 2002.
[55] N. Diolaiti and C. Melchiorri, "Haptic tele-operation of a mobile robot," in Proc. of the IFAC Symposium on Robot Control, (Wroclaw, Poland), 2003.
[56] H. Iwata, "Walking about virtual environments on an infinite floor," in Proc. of the IEEE Virtual Reality Conference, (Houston, Texas), pp. 286–293, 1999.
[57] R. P. Darken, W. R. Cockayne, and D. Carmein, "The omni-directional treadmill: A locomotion device for virtual worlds," in Proc. of the ACM UIST, (Alberta, Canada), pp. 213–222, 1997.
[58] H. Baier, M. Buss, F. Freyberger, and G. Schmidt, "Interactive stereo vision telepresence for correct communication of spatial geometry," Advanced Robotics, The International Journal of the Robotics Society of Japan, vol. 17, no. 3, pp. 219–233, 2003.
[59] C. Cruz-Neira, D. Sandin, and T. A. DeFanti, "Surround-screen projection-based virtual reality: The design and implementation of the CAVE," in Proc. of the ACM SIGGRAPH, (Anaheim, California), pp. 135–142, 1993.
[60] G. Welch, G. Bishop, L. Vicci, S. Brumback, K. Keller, and D. Colucci, "High-performance wide-area optical tracking: the HiBall tracking system," Presence, vol. 10, pp. 1–21, February 2001.
[61] R. P. Darken, "Spatial orientation and wayfinding in large-scale virtual spaces II," Presence, vol. 8, pp. iii–vi, December 1999.
[62] R. A. Ruddle and D. M. Jones, "Movement in cluttered virtual environments," Presence, vol. 10, pp. 511–524, October 2001.
[63] N. H. Bakker, P. J. Werkhoven, and P. O. Passenier, "The effect of proprioceptive and visual feedback on geographical orientation in virtual environments," Presence, vol. 8, pp. 36–53, February 1999.
[64] N. Nitzsche, U. D. Hanebeck, and G. Schmidt, "Mobile haptic interaction with extended real or virtual environments," in Proc. of the IEEE International Workshop on Robot-Human Interactive Communication, (Bordeaux/Paris, France), pp. 313–318, 2001.
[65] N. Nitzsche, U. D. Hanebeck, and G. Schmidt, "Extending telepresent walking by motion compression," in Tagungsbeiträge, 1. SFB-Aussprachetag, Human Centered Robotic Systems, HCRS, (Karlsruhe, Germany), pp. 83–90, 2002.
[66] N. Nitzsche, U. D. Hanebeck, and G. Schmidt, "Mobile Haptische Schnittstellen für Weiträumige Telepräsenz: Idee und Methodik," at Automatisierungstechnik, vol. 51, pp. 5–12, January 2003.
[67] N. Nitzsche, U. D. Hanebeck, and G. Schmidt, "Motion compression for telepresent walking in large-scale remote environments," in Proc. of SPIE, AeroSense Symposium, (Orlando, Florida), 2003.
[68] N. Nitzsche, U. D. Hanebeck, and G. Schmidt, "Motion compression for telepresent walking in large target environments," Presence, vol. 13, February 2004.
[69] N. Nitzsche and G. Schmidt, "A mobile haptic interface mastering a mobile teleoperator," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, (Sendai, Japan), 2004.
[70] P. Rössler, U. D. Hanebeck, and N. Nitzsche, "Feedback controlled motion compression for extended range telepresence," in Proc. of the IEEE International Conference on Mechatronics & Robotics, (Aachen), 2004.
[71] P. Kammermeier, M. Buss, and G. Schmidt, "A systems theoretical model for human perception in multimodal presence systems," IEEE/ASME Transactions on Mechatronics, pp. 234–244, 2001.
[72] P. Kammermeier, Verteilte Taktile Stimulation zur Vermittlung Mechanischer Berührungsinformation in Telepräsenzanwendungen. PhD thesis, Technische Universität München, LSR, 2003.
[73] A. Kron, Beiträge zur bimanuellen und mehrfingrigen haptischen Informationsvermittlung in Telepräsenzsystemen. PhD thesis, Technische Universität München, LSR, 2005.
[74] S. Hirche, Haptic Telepresence in Packet Switched Networks. PhD thesis, Technical University Munich, 2005.
[75] M. Kuschel, P. Kremer, and M. Buss, "Kinaesthetic-haptic data reduction for telepresence systems," in Proceedings of the International Conference on Robotics and Automation, (Orlando, USA), 2006.
[76] S. Hirche, P. Hinterseer, E. Steinbach, and M. Buss, "Network traffic reduction in haptic telepresence systems by deadband control," in Proceedings of the IFAC World Congress, (Prague, Czech Republic), 2005.
[77] P. Hinterseer, S. Hirche, E. Steinbach, and M. Buss, "A novel, psychophysically motivated transmission approach for haptic data streams in telepresence and teleaction systems," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP'2005, (Philadelphia, PA, USA), pp. 1097–1100, 2005.
[78] S. Hirche, P. Hinterseer, E. Steinbach, and M. Buss, "Towards deadband control in networked teleoperation systems," in Proceedings of the IFAC World Congress, (Prague, Czech Republic), 2005.
[79] S. Hirche, A. Bauer, and M. Buss, "Transparency of haptic telepresence systems with constant time delay," in Proceedings of the IEEE Conference on Control Applications, (Toronto, Canada), pp. 328–333, 2005.
[80] P. Kammermeier, A. Kron, M. Buss, and G. Schmidt, "Towards Intuitive Multi-fingered Haptic Exploration and Manipulation," in Proceedings of the International Workshop on Advances in Interactive Multimodal Telepresence Systems, (Technische Universität München, München, Germany), pp. 57–70, 2001.
[81] P. Kammermeier, M. Buss, and G. Schmidt, "Dynamic Display of Distributed Tactile Shape Information by a Prototypical Actuator Array," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, (Takamatsu, Japan), pp. 1119–1124, 2000.
[82] B. Deml, A. Kron, M. Kuschel, and M. Buss, "Actuating a data-glove with thermal-tactile feedback - human factors considerations," in Proceedings of the Joint International COE/HAM SFB-453 Workshop on Human Adaptive Mechatronics and High-Fidelity Telepresence, (Tokyo, Japan), 2005.
[83] A. Kron and G. Schmidt, "Bimanual haptic telepresence technology employed to demining operations," in Proceedings of the EuroHaptics'2004 Conference, (Munich, Germany), pp. 490–493, 2004.
[84] A. Kron and G. Schmidt, "A Bimanual Haptic Telepresence System – Design Issues and Experimental Results," in Proceedings of the International Workshop on High-Fidelity Telepresence and Teleaction, jointly with the IEEE conference HUMANOIDS'2003, (Munich, Germany), 2003.
[85] K. Drewing, M. Fritschi, R. Zopf, M. Ernst, and M. Buss, "First evaluation of a novel tactile display exerting shear force via lateral displacement," ACM Transactions on Applied Perception, vol. 2, no. 2, pp. 118–131, 2005.
