
Journal of International Council on Electrical Engineering Vol. 1, No. 2, pp. 151~155, 2011

A Study on Robot Motion Authoring System using Finger-Robot Interaction

Kwang-Ho Seok*, Junho Ko** and Yoon Sang Kim†

Abstract – This paper proposes a robot motion authoring system using finger-robot interaction. The proposed robot motion authoring tool is a user-friendly method that easily creates and controls robot motion according to the number of fingers. Furthermore, the system can be used to simulate user-created robot contents in a 3D virtual environment. This allows the user not only to view the authoring process in real time but also to transmit the final authored contents. The effectiveness of the proposed motion authoring system was verified based on motion authoring simulation of an industrial robot.

Keywords: Finger-robot interaction, Industrial robot, Robot motion authoring system

1. Introduction

With rapid developments in today's robot technology, robots have come to be used in a wide range of fields. As an increasing number of applications adopt intelligence, robots require a new type of software environment that allows users to author various commands for them. Moreover, robots must become easier to use for general users in order to expand the scope of applications of intelligent robots in our everyday life. In response, a number of studies are being conducted on new types of user-friendly and intuitive robot motion authoring that can render various motions [1]-[4]. There are also active research projects in human-robot interaction, thanks to the progress that has been made in vision technology. One study is developing a finger pad that can replace the computer mouse [5], and other studies propose algorithms to improve hand gesture recognition [6]-[8]. However, user-intuitive or user-friendly robot command authoring tools that focus on facilitating general users are still rare. The lack of such tools is likely to create a barrier to providing robot services (commands/motions) that correspond to the preferences and demands of general users, including children, the elderly and housewives, who are expected to be the major clients in the service robot industry.

This paper proposes a robot motion authoring system using finger-robot interaction. The proposed system is capable of authoring (creating and controlling) robot motion according to the number of fingers. It is an intuitive robot motion authoring method that allows the user to easily author robot motions. The effectiveness of the proposed method is verified based on motion authoring simulation of an actual industrial robot.

2. Proposed Robot Motion Authoring System using Finger-Robot Interaction

With the exception of language, the hand is the part of the body most frequently used for human communication, among parts such as the hands, eyes, mouth, arms and legs. This chapter explains how industrial robot motions can be authored according to the number of fingers and verifies the effectiveness of the proposed approach based on simulation. Fig. 1 illustrates the overall authoring process of the robot motion authoring system implemented for this study.

Fig. 1. Robot motion authoring system.

† Corresponding Author: School of Computer Science and Engineering, Korea University of Technology and Education, Korea (yoonsang@kut.ac.kr)
* Dept. of Computer Science and Engineering, Korea University of Technology and Education, Korea (mono@kut.ac.kr)
** Dept. of Computer Science and Engineering, Korea University of Technology and Education, Korea (lich126@kut.ac.kr)
Received: January 17, 2011; Accepted: March 23, 2011
2.1 Finger recognition

There are a number of techniques for finger recognition, which is used in various fields. In this section, the finger recognition unit is implemented using a color-value scheme [8], [9]. The finger recognition unit first converts RGB colors into gray scale and YCrCb for binary representation. Then the region inside the hand is filled by masking and noise is removed. The binary image is examined in 25-pixel units, as shown in Fig. 2: if the sum of the 1's examined is greater than ten, every digit in the unit is set to 1. With the image obtained by this masking, the center of the hand can be calculated to identify the hand's location as well as the hand region furthest from the center. The center of the hand is marked with a red dot, the palm region with a red circle and the hand region with a white circle. If the number of fingers is 0 or 5, a circle image is displayed. For 1, 2 and 3 fingers, the image is shown in blue, green and red, respectively. If there are 4 fingers, the hand is shown in white on a black background (Fig. 2).

Fig. 2. Finger recognition process.
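The paper does not publish code; as a rough illustration of the Section 2.1 pipeline, the following is a minimal Python sketch assuming OpenCV (cv2) and NumPy. The skin-color thresholds and function names are assumptions of this rewrite, not the authors' implementation.

```python
# Minimal sketch of the Section 2.1 finger-recognition steps; the
# Cr/Cb thresholds and helper names are illustrative assumptions.
import cv2
import numpy as np

def binarize_hand(frame_bgr):
    """Convert a camera frame to YCrCb and threshold the skin region."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    # Commonly used Cr/Cb skin bounds; the paper gives no exact values.
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Fill the region inside the hand and remove noise by masking,
    # approximated here with morphological close/open operations.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    return mask // 255  # binary image of 0s and 1s

def coarsen_blocks(binary, block=5):
    """Examine the binary image in 25-pixel units (treated here as 5x5
    blocks): if more than ten of the 25 digits are 1, set all to 1."""
    h, w = binary.shape
    out = np.zeros_like(binary)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            if binary[y:y+block, x:x+block].sum() > 10:
                out[y:y+block, x:x+block] = 1
    return out

def hand_center(binary):
    """Estimate the hand center (red dot in Fig. 2) from image moments."""
    m = cv2.moments(binary.astype(np.uint8), binaryImage=True)
    if m["m00"] == 0:
        return None  # no hand region found
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
```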
2.2 Robot Motion Authoring Tool

In this section, a motion authoring unit capable of creating and controlling robot motions (of an industrial robot in this paper) is implemented using finger-robot interaction based on the finger recognition unit explained in the previous section. The motion authoring unit (editor) consists of four panels: the finger interface panel, mode panel, simulation panel and data panel (Fig. 3). The finger interface panel is the finger recognition unit that recognizes the number of fingers (from 0 to 5) in the finger images taken by the camera. The number of fingers recognized by the finger interface panel allows the user to control the industrial robots (such as SCARA, articulated robots, etc.) displayed on the simulation panel. The mode panel provides three modes – WIRE, SOLID and ZOOM/VIEW – as well as the TRANSMISSION and STOP buttons. The WIRE mode allows viewing of the robot's internal structure, while the SOLID mode provides only the robot's external view. In the ZOOM/VIEW mode, the view can be zoomed in/out and rotated, allowing the user to view the robot's entire structure in detail. The simulation panel provides the user with real-time 3D viewing of the robot motion being authored through finger-robot interaction. The authored contents (motions) can be transmitted to control the robot by clicking the TRANSMISSION button, and the actual robot motion can be stopped using the STOP button. The data panel provides the virtual robot motion degrees and PWMs to the actual industrial robot in real time. The data panel also provides RESET, PAUSE and RESTART buttons. If the RESET button is clicked, the robot is initialized to the default value; if the RESTART button is clicked, the robot is restarted.

Fig. 3. Robot motion authoring tool.

2.3 Virtual Robot Motion Authoring Simulation

This section evaluates the effectiveness of the proposed authoring method based on robot motion authoring simulation. The virtual industrial robot used for the simulation is an articulated robot with 6 DOF, and its motion ranges and definition are given in Table 1 and Fig. 4.

Table 1. Motion Range

  AXIS     ANGLE (degree)
  1-Axis   0 ~ -360
  2-Axis   -60 ~ 60
  3-Axis   -90 ~ 90
  4-Axis   0 ~ -60
  5-Axis   0 ~ -360
  6-Axis   0 ~ 90

Fig. 4 depicts the virtual and the actual articulated robot used for the motion authoring simulation, respectively. The motion ranges and directions of the virtual articulated robot closely match those of the actual robot.
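As an illustration of how the Table 1 motion ranges might be enforced on an authored pose, here is a minimal sketch. The ranges are taken from Table 1; the function and data layout are hypothetical, not the authors' code.

```python
# Minimal sketch of clamping authored joint angles to the Table 1
# motion ranges; each entry is (low, high) in degrees, so the paper's
# "0 ~ -360" is stored as (-360, 0). The function is illustrative only.
MOTION_RANGE = {
    1: (-360, 0),
    2: (-60, 60),
    3: (-90, 90),
    4: (-60, 0),
    5: (-360, 0),
    6: (0, 90),
}

def clamp_pose(angles_deg):
    """Clamp a 6-axis pose (axes 1..6, degrees) into the Table 1 ranges."""
    clamped = []
    for axis, angle in enumerate(angles_deg, start=1):
        low, high = MOTION_RANGE[axis]
        clamped.append(min(max(angle, low), high))
    return clamped

# Example: an out-of-range 2-Axis value of 75 degrees is clamped to 60.
assert clamp_pose([0, 75, 0, 0, 0, 0]) == [0, 60, 0, 0, 0, 0]
```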
Fig. 4. Definition of the articulated robot used for the motion authoring simulation.

In the WIRE and SOLID modes, the internal and external structures of the robot can be viewed, respectively. In either mode, if the number of fingers recognized by the finger interface is 0 or 4, the 1-Axis or 5-Axis rotates around the Y axis. For 1, 2 and 3 fingers, the 2-Axis, 3-Axis and 4-Axis rotate around the Z axis, respectively (Fig. 5). The ZOOM/VIEW mode allows the user to zoom in and out of the robot structure and vary the camera angle for a detailed view. Similar to the WIRE mode, the camera is rotated clockwise and counter-clockwise for 3 and 4 fingers, respectively, according to the definitions of Table 3, so that the user can view the robot from a desired angle. Accordingly, the user is able not only to create but also to control motions for a part of the robot based on the number of fingers recognized. Tables 2 and 3 list the definitions of the finger-value events for the WIRE, SOLID and ZOOM/VIEW modes.

Table 2. Definitions of Finger-value Events for WIRE and SOLID Modes

  MODE          FINGER   EVENT
  A. WIRE       0        1-Axis rotation (base Y)
  B. SOLID      1        2-Axis rotation (base Z)
                2        3-Axis rotation (base Z)
                3        4-Axis rotation (base Z)
                4        5-Axis rotation (base Y)
                5        6-Axis rotation (Gripper ON/OFF)

Table 3. Definitions of Finger-value Events for ZOOM/VIEW Mode

  MODE          FINGER   EVENT
  C. ZOOM/VIEW  0        default
                1        ZOOM OUT
                2        ZOOM IN
                3        VIEW rotation (CW)
                4        VIEW rotation (CCW)

Fig. 5. The motion authoring simulation in WIRE and SOLID modes.

Fig. 6(a) depicts how a robot rotation part is enlarged in the ZOOM/VIEW mode when the number of fingers recognized is 2 or 3. Fig. 6(b) depicts how a robot part is downsized in the ZOOM/VIEW mode when the number of fingers recognized is 1.

Fig. 6. (a) Enlarged robot rotation part in ZOOM/VIEW mode with 2 and 3 recognized; (b) downsized robot in ZOOM/VIEW mode with 1 recognized.

3. Experimental Results and Discussions

3.1 Virtual Robot Experimental Result

The robot motion authoring system proposed in this paper was implemented on a personal computer with 1 GByte of memory, a 1.80 GHz AMD Athlon 64-bit processor and an ATI Radeon X1600 graphics card supporting DirectX. A Logitech Webcam Pro 9000 was also used. The robot motion authoring tool was implemented with OpenGL, the open graphics programming library.

Authoring was effective because different colors were displayed according to the number of fingers recognized, allowing the user to immediately discern the part of the robot being authored. Fig. 7 is a captured image of the robot motion authoring process using the proposed method. The experimental results confirmed that the robot motion authoring system proposed by this study provides general users with an intuitive, fun and easy way to create motions.
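To make the finger-value events of Tables 2 and 3 above concrete, the following sketch dispatches a recognized finger count to the corresponding event. The table entries are the paper's; the dispatch code itself is an illustrative assumption.

```python
# Hypothetical dispatch of a recognized finger count (0-5) to the
# events of Tables 2 and 3; event names mirror the tables.
WIRE_SOLID_EVENTS = {
    0: ("1-Axis rotation", "Y"),
    1: ("2-Axis rotation", "Z"),
    2: ("3-Axis rotation", "Z"),
    3: ("4-Axis rotation", "Z"),
    4: ("5-Axis rotation", "Y"),
    5: ("6-Axis rotation (Gripper ON/OFF)", None),
}

ZOOM_VIEW_EVENTS = {
    0: "default",
    1: "ZOOM OUT",
    2: "ZOOM IN",
    3: "VIEW rotation (CW)",
    4: "VIEW rotation (CCW)",
}

def dispatch(mode: str, fingers: int):
    """Return the event triggered by `fingers` in the given mode."""
    if mode in ("WIRE", "SOLID"):
        return WIRE_SOLID_EVENTS.get(fingers)
    if mode == "ZOOM/VIEW":
        return ZOOM_VIEW_EVENTS.get(fingers, "default")
    raise ValueError(f"unknown mode: {mode}")

# Example: three fingers in WIRE mode rotate the 4-Axis around Z.
assert dispatch("WIRE", 3) == ("4-Axis rotation", "Z")
```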
Fig. 7. Robot motion authoring process using the proposed method.

3.2 Actual Robot Experimental Result

The actual articulated robot is a miniature robot (NT-Robot Arm, a product of NTREX CO., LTD). Seven RC servomotors were used, and five ball casters were mounted on the BASE part for smooth turning. The gripper can seize an object of up to 80 mm. The robot arm can be easily controlled by a computer or an embedded board using the NT-SERVO-16CH-1 and NT-USB2UART (Fig. 8). Table 4 gives the NT-Robot Arm specification.

Fig. 8. Actual articulated robot structure.

Table 4. NT-Robot Arm Specification [10]

  Speed            6V servomotor
  Voltage          DC 6V
  Size             100 × ø550
  Power            16.1 Kg-cm (max)
  Weight           1 Kg
  Structure        6 axes + gripper
  Center distance  130 m/m
  Current          12.5 A

The purpose of this experiment was to confirm whether robot motions authored by the proposed tool work well with an actual robot. The motions authored by the tool were applied to an actual robot (Fig. 9). The motion data was transmitted to the robot using serial communication, and it was examined how the contents produced by the robot motion authoring tool controlled the robot according to the user's intention.

Fig. 9. Experiment results using the actual robot.
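The paper states only that the motion data was sent over serial communication. The sketch below, using pyserial, is a hypothetical illustration of such a link; the port name, baud rate and packet format are assumptions of this rewrite, not the authors' protocol.

```python
# Hedged sketch of sending an authored 6-axis pose to the arm over a
# serial link with pyserial; the port, baud rate and packet layout
# are assumptions -- the paper does not specify its protocol.
import serial

def send_motion(angles_deg, port="COM3", baud=9600):
    """Transmit one 6-axis pose as a comma-separated ASCII line."""
    assert len(angles_deg) == 6, "articulated robot has 6 axes"
    packet = ",".join(f"{a:.1f}" for a in angles_deg) + "\n"
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(packet.encode("ascii"))

# Example (with the arm attached): move every axis to 0 degrees.
# send_motion([0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
```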
4. Conclusion

This paper proposed an authoring system capable of creating and controlling motions of industrial robots based on finger-robot interaction. The proposed system is user-friendly and intuitive and facilitates motion authoring of industrial robots using the fingers, which are second only to language as a means of communication. The proposed authoring system also uses graphic motion data to simulate user-created robot contents in a 3D virtual environment, so that the user can view the authoring process in real time and transmit the authored robot contents to control the robot. The validity of the proposed robot motion authoring system was examined based on simulations using the authored robot motion contents as well as experiments with actual robot motions. The proposed robot motion authoring method is expected to provide user-friendly and intuitive solutions not only for various industrial robots, but also for other types of robots, including humanoids.

References

[1] http://www.cyberbotics.com/ (Cyberbotics Webots)
[2] http://www.microsoft.com/korea/robotics/studio.mspx
[3] Sangyep Nam, Hyoyoung Lee, Sukjoong Kim, Yichul Kang and Keuneun Kim, "A study on design and implementation of the intelligent robot simulator which is connected to an URC system", IE, vol. 44, no. 4, pp. 157-164, 2007.
[4] Kwangho Seok and Yoonsang Kim, "A new robot motion authoring method using HTM", International Conference on Control, Automation and Systems (ICCAS), pp. 2058-2061, Oct. 2008.
[5] Hyun Kim, Myeongyun Ock, Taewook Kang and Sangwan Kim, "Development of Finger Pad by Webcam", The Korea Society of Marine Engineering, pp. 397-398, June 2008.
[6] Kangho Lee and Jongho Choi, "Hand Gesture Sequence Recognition using Morphological Chain Code Edge Vector", Korea Society of Computer Information, vol. 9, pp. 85-91.
[7] Attila Licsar and Tamas Sziranyi, "User-adaptive hand gesture recognition system with interactive training", Image and Vision Computing, vol. 23, pp. 1102-1114, 2005.
[8] Kunwoo Kim, Wonjoo Lee and Changho Jeon, "A Hand Gesture Recognition Scheme using WebCAM", The Institute of Electronics Engineers of Korea, pp. 619-620, June 2008.
[9] Mark W. Spong, Seth Hutchinson and Mathukumalli Vidyasagar, Robot Modeling and Control, Wiley, 2006.
[10] http://www.ntrex.co.kr/

Kwang-Ho Seok received the B.S. degree in Internet-media engineering from Korea University of Technology and Education (KUT), Korea, in 2007 and the M.S. degree in Information media engineering from KUT in 2009. Since 2009, he has been a Ph.D. student in the Humane Interaction Lab (HILab), KUT, Cheonan, Korea. His current research interests include immersive virtual reality, human-robot interaction, and power systems.

Junho Ko received the B.S. degree in Internet-software engineering from Korea University of Technology and Education (KUT), Korea, in 2009 and the M.S. degree in Information media engineering from KUT in 2011. Since 2011, he has been a Ph.D. student in the Humane Interaction Lab (HILab), KUT, Cheonan, Korea. His research fields are artificial intelligence and vehicle interaction.

Yoon Sang Kim received the B.S., M.S., and Ph.D. degrees in electrical engineering from Sungkyunkwan University, Seoul, Korea, in 1993, 1995, and 1999, respectively. He was a member of the postdoctoral research staff of the Korea Institute of Science and Technology (KIST), Seoul, Korea, a faculty research associate in the Department of Electrical Engineering, University of Washington, Seattle, and a member of the senior research staff at the Samsung Advanced Institute of Technology (SAIT), Suwon, Korea. Since March 2005, he has been an Associate Professor at the School of Computer Science and Engineering, Korea University of Technology and Education (KUT), Cheonan, Korea. His current research interests include virtual simulation, Power-IT technology, and device-based interactive applications. Dr. Kim was awarded a Korea Science and Engineering Foundation (KOSEF) Overseas Postdoctoral Fellowship in 2000. He is a member of IEEE, IEICE, ICASE, KIPS and KIEE.
