RoboCup 2013 Rescue Robot - Chadormalu Mining and Industrial Co. Team



RoboCupRescue 2013 - Robot League Team
Chadormalu Rescue Robot (IRAN)

Pouya Mansournia, Mona Yeganedoust, Amir H. Soltanzadeh, Behzad Peykari, Seyed Mohammad Hadi Sadati, Ali Torabi Parizi

Chadormalu Research Center, Chadormalu Mining & Industrial Co.
No 56, Esfandyar Blvd., Vali-e-Asr, Tehran, Iran
Post Code: 19686-53647
Tel: +98 21 88882858, +98 21 88883102
Fax: +98 21 88775935
P.mansournia@gmail.com
http://www.chadormalu.com/Robotech

Abstract. This paper describes the Chadormalu Research Center (CRC) rescue robot team from Chadormalu Mining & Industrial Company and its robots. We have designed and built our first new autonomous robot, further developed our tele-operated robot, and built a new autonomous flying robot for different situations, arenas, and local mapping; attached devices include an LRF sensor, a Kinect camera, and a GPS sensor for flying-robot mapping. Our main goal is a practical rescue robot for real situations such as earthquakes and mine accidents, together with a flying robot used to map and tour the mine for the Chadormalu holding. Ongoing research programs on autonomous mobile robots include landing the autonomous flying robot on the rescue robot, automatic victim detection with a neural-network algorithm, and simultaneous localization and mapping (SLAM). This is the first RoboCup event at which a fully autonomous quadcopter robot is teamed with the rescue robot.

Introduction

The Chadormalu rescue robot team represents CRC (Chadormalu Research Center) and develops the mechatronic layers of a high-mobility rescue robot. The team members' activity began at the Iran Open 2009 competition in the junior rescue league, where they took first place; over five years the members have competed and won awards in both junior and senior RoboCup leagues. The team now intends to enter the Rescue league with a combined system: a fully autonomous flying robot working with the rescue robot, including landing the flying robot on the rescue robot. Team members worked for more than seven months on the flying robot's simultaneous localization and mapping system, and the rescue mechanism was designed from scratch by the team. As the photo below shows, Iran is located on faults, and earthquakes have cost many lives; in this regard, the CRC team is trying to help save even one of them [Figure 1].
1. Team Members and Their Contributions

- Pouya Mansournia - Team Leader, Electronics
- Mona Yeganedoust - Mechanical design
- Hadi Sadati - Mechanical design
- Ali Torabi Parizi - Mechanical design
- Chadormalu Mining & Industrials Company - Sponsor

2. Operator Station Set-up and Break-Down (10 minutes)

In a rescue operation it is essential to set up and break down as quickly as possible, within 10 minutes. We have designed a mobile control pack including a notebook, a newly designed joystick, an access point, an antenna, an I/O extension board, a UPS, and connectors, so the operator can set up and drive the robot easily.
3. Communications

All wireless communication is implemented in the 5 GHz range using IEEE 802.11a wireless LAN cards and access points; other frequency ranges are not occupied. We use PCMCIA LAN cards and a Planet DWA-P2100AP access point. Video, audio, and data are transmitted as packets over the wireless network. The system can be fixed to a single channel so that it does not disturb other channels and frequencies, but we can change to almost any 802.11a channel if asked. The signal strength is less than 60 dB. The software for video, audio, and data transmission is written using C#.NET 2011 as our high-level programming language.

Rescue Robot League - CHADORMALU (IRAN)
Frequency: 5.0 GHz (802.11a)
Channel/Band: one, fixed on any channel asked
Power: less than 60 dB

4. Map Generation/Printing

We generate a 2D map of the surface. The main sensor is a Hokuyo URG-04LX [see Figure 2]. We have completed laser-based SLAM and have added gyro sensors to keep the laser horizontal while the robot moves over the step-fields and ramps. [Figure 3] shows a sample map generated by our autonomous robot just by processing the laser scanner. The mapping is based on PCM.
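The laser-based 2D mapping step can be illustrated with a minimal grid update that marks the cells hit by laser returns, assuming a known robot pose. This is a sketch in Python for illustration only (the team's software is written in C#.NET); the function name and cell size are hypothetical, and free-space ray tracing is omitted.

```python
import math

def update_grid(grid, pose, scan, cell=0.05):
    """Mark grid cells hit by laser beams as occupied.
    grid: dict mapping (i, j) cell indices to True (occupied).
    pose: (x, y, theta) of the robot in metres/radians, world frame.
    scan: list of (angle, range) pairs in the scanner frame."""
    x, y, th = pose
    for ang, rng in scan:
        # project the beam endpoint into the world frame
        gx = x + rng * math.cos(th + ang)
        gy = y + rng * math.sin(th + ang)
        # quantize to grid indices
        grid[(int(round(gx / cell)), int(round(gy / cell)))] = True
    return grid
```

Repeated over successive corrected poses, such updates accumulate into the 2D map that is printed for the judges.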
5. Control Method and Human-Robot Interface

The control scheme is partially autonomous. The CRC user interface is a simple, user-friendly environment from which the operator controls the robot; commands are sent to and processed by the on-board computer to navigate the robot, and all sensor information is returned to the operator to investigate the arena and detect all possible victims. To avoid colliding with obstacles or victims, an ultrasonic obstacle-avoidance algorithm has higher priority than the operator's commands. Although navigation is autonomous, when a victim is located the operator has to define the victim's condition based on the sensor data. To save time, the GUI provides several push-button keys so the victim's condition can be defined with a mouse click and a spoken note from the operator. In case the operator loses control of the robot, a program returns the robot to the starting point autonomously using the map stored on the robot. All sensor data are collected in a data bank so they can be used off-line after the operation. It should be mentioned that the servomechanisms are digitally controlled by a PID algorithm implemented on an ARM microcontroller system, with an optical shaft encoder feeding back the position and speed of the robot. All embedded microcontroller systems for servomechanisms, instrumentation, and drivers are modular for easy repair and maintenance.

6. Sensors for Victim Identification

Our main sensors for victim identification are heat and sound sensors. We use a pyrometer and eight TPA81 thermal arrays (see Figure 4), plus a sound sensor built from a microphone and a module designed by team members. Victims are identified by characteristics such as face and hand shape, body, heat, CO2, and motion. Another project now in progress uses vision for victim identification.
[Figure 4: IR temperature sensor, camera, and CO2 sensor]
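The PID servo loop described in Section 5 can be sketched as a minimal discrete-time controller. This is illustrative only: the team's controller runs on an ARM microcontroller, and the gains and timestep here are hypothetical, not the team's tuned values.

```python
class PID:
    """Discrete PID controller: output = kp*e + ki*integral(e) + kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt           # accumulate integral term
        deriv = (err - self.prev_err) / self.dt  # backward-difference derivative
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

In the real system the measured value would come from the optical shaft encoder and the output would drive the motor PWM.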
7. Stereo Vision

Stereo vision is not used in isolation: it can be integrated with any technology in which cameras provide video and moving images, and it increases the safety factor of the system. Using genetic algorithms and neural networks, two cameras can be combined like the pair of human eyes: when both cameras are focused on the same point, the intersection angle of their lines of sight yields more detail, including the actual size of the object, which can then be processed with high precision. To monitor the environment, sound and temperature data are also measured under certain conditions by an ARM (series 11) microcontroller and sent wirelessly to the control center as safety-critical information.
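The size-from-two-cameras idea reduces, for a calibrated and rectified stereo pair, to triangulation from disparity. A minimal sketch (the paper does not state the team's camera model, so the pinhole formula and all values here are assumptions for illustration):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by a rectified stereo pair: Z = f * B / d,
    where f is the focal length in pixels, B the baseline in metres,
    and d the horizontal disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

With depth known, the metric size of an object follows from its pixel extent scaled by Z / f.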
8. Robot Locomotion

The CRC team will compete with a tele-operated robot in this competition. Since tracked robots perform better than wheeled ones on difficult, uneven terrain, tracks are the natural choice. The tracks are manufactured from 3/8" flanged chain on which small cubic prisms, cast from a relatively hard rubber with walls perpendicular to the chain line, are mounted as tracking profiles, using sealing nuts to keep the bolt-and-nut connections from loosening. The profiles are short in length, tall in height, and mounted closely along the chain: around the curved roller sprockets the gaps between profiles open up, giving good grip on uneven terrain and stair edges, while on a flat surface the belt flattens under the length of the robot and forms a continuous hard adhesive rubber surface with good grip on steep slopes and ramps. Idle rollers are placed along the body and arms to ensure good contact force and a good path between the profiles and the terrain all along the robot body (Fig. 1).

Fig. 1. Tall, narrow hard-rubber profiles form a continuous contact surface, preferable on smooth slopes and ramps, and the gaps that open between them around the sprocket curvature are ideal for gripping step and stair edges.

In addition, a secondary chain roll around the middle of the robot, with shorter rubber profiles than the side profiles, covers the full width of the robot, prevents stalling on high narrow steps, and ensures continuous movement with minimum use of the arms. The robot is designed with four arms; both arms on each side are actuated by a 150 W, 480 g brush DC motor manufactured by Maxon Motor Co. (serial no. 148877; nominal speed 7000 rpm, nominal torque 187 mNm), coupled to a three-stage, 460 g planetary gearhead (serial no. 203120; reduction 1:43, max. continuous torque 15 Nm, efficiency 72%). This is combined with a standard 0.55 kW NMRV50 worm gearbox with reduction 1:30 and permissible output torque and radial load of 81 Nm and 3.15 kN. The combination yields a power pack rotating at 5.4 rpm with torque up to 241 Nm. Dynamic analysis of the robot with an estimated mass of 60 kg in its worst case, lifting itself on one of the arms, shows a required maximum output power and torque of 130 W and 80 Nm, which the chosen set of motor, planetary, and worm gearboxes satisfies comfortably. The motor and gearbox packs are placed at the middle of the robot (Fig. 2). The arms on each side move together, while the arms at each end of the robot move independently. To keep the inner area of the robot body free for electrical devices and to obtain a simple, flat, continuous, flawless track locomotion system, the arms are placed along the body tracks rather than on the robot sides, as is conventional on rescue robots.
This arrangement eliminates the hollow-shaft mechanism, as the arms are actuated from their ends at the middle of the robot. A small 19-tooth sprocket near the arm gearbox on each side and end transmits the track rolling of the arms to the body tracks.
Fig. 2. Arm actuators are placed at the middle of the robot, and the arms lie along the robot body tracks.

Furthermore, to save space in the robot body for electrical devices and to use long arms for overcoming high obstacles, four track motor drives (two parallel motor drives per track side) have been chosen and placed in the arms (Fig. 3). The same brush DC motor as in the arm actuator pack is coupled with a one-stage, 460 g planetary gearhead (serial no. 223080; reduction 1:3.5, max. continuous torque 4 Nm, efficiency 90%), combined with a standard 0.18 kW NMRV30 worm gearbox with reduction 1:30 and permissible output torque and radial load of 24 Nm and 1.08 kN. The combination yields a power pack rotating at 66.6 rpm with torque up to 19.6 Nm. Dynamic analysis of the robot with its estimated mass in the worst case, climbing a 45-degree slope with an acceleration of 0.16 m/s^2, shows a required maximum output power and torque of 150 W and 12 Nm, which the chosen set of motor, planetary, and worm gearboxes satisfies. The wheel radius is 15 cm (a 50-tooth sprocket for the main drives and a 35-tooth sprocket for each arm's top drive), and the linear speed of the robot is about 0.48 m/s.

Fig. 3. Main track actuators are placed in each arm and work in parallel to drive the robot.
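The quoted output speeds follow directly from the stated reduction ratios; a quick check (a sketch only, using the motor and gearbox figures given in the text):

```python
def output_speed(motor_rpm, *ratios):
    """Output shaft speed after a chain of gear reductions."""
    speed = motor_rpm
    for r in ratios:
        speed /= r
    return speed

# Arm actuator: 7000 rpm motor, 1:43 planetary, 1:30 worm gearbox
arm_rpm = output_speed(7000, 43, 30)      # ~5.4 rpm, as stated
# Track drive: 7000 rpm motor, 1:3.5 planetary, 1:30 worm gearbox
track_rpm = output_speed(7000, 3.5, 30)   # ~66.6 rpm, as stated
```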
Figure 4 shows a section view of the robot's hollow-shaft mechanism and bearing set.

Fig. 4. Section view of the robot's hollow-shaft mechanism and bearing set.

The chain stiffener mechanism on top of each side of the robot transmits the rotation of the side tracks to the inner tracks. The chains on the body and arms are fastened precisely using screws (Fig. 5).

Fig. 5. Precise chain stiffeners using two 6 mm screws.

The main and arm bodies are designed and manufactured from hollow 20x20 and 10x20 standard steel profiles, welded together precisely to form a stiff, light chassis. The robot is modeled in SolidWorks and analyzed with SolidWorks Simulation and ANSYS using the Finite Element Method (FEM) (Fig. 6).
Fig. 6. Arm and main-body welded chassis using standard steel profiles, and its FEM analysis.

In addition, a 6 kg, 1.5 m, 7-DOF arm manipulator is designed with five rotary joints, one prismatic joint, and one gripping DOF, as in Fig. 7. The main drives are two Buehler motors (serial no. 1.61.077) with continuous torque and speed of 1.5 Nm and 75 rpm, combined with a handmade worm gearbox with reduction 1:30. The prismatic joint uses a 40 cm ball-screw mechanism, and the end effector is driven by three RX-32 Dynamixel servo motors.

Fig. 7. 7-DOF arm manipulator with one prismatic joint.
The gripper is driven by a geared mechanism actuated by an RX-24 Dynamixel, and is designed so that all the sensors and cameras are placed within the jaws of the gripper. A puzzle-like design makes this part very simple to assemble and cheap to manufacture, combining a large volume for sensors with the ability to grip small objects (Fig. 8).

Fig. 8. End-effector gripper and sensor box.

The robot body and arm are under manufacture now and will be tested at the IranOpen 2013 robotics festival.

9. Team Training for Operation (Human Factors)

The communication process and the decision-making process based on it:
The software controller runs on Windows 7. Our system includes two computers: a notebook and a main computer on the robot. The notebook runs the monitoring software; this user interface is used to remotely control the robot and communicates with the robot's controller over a wireless TCP/IP connection. A joystick is connected for better control of the robot and easier implementation of decisions.

The main (robot) computer calculates the robot's position and the state of the individual sensors, and performs noise filtering, localization and mapping, navigation and obstacle avoidance, 2D map generation, mission-report generation, and video compression and streaming, sending the processed data to the notebook. The notebook handles tasks such as joystick configuration and transmitting commands to the robot.

Laser Scan

The robot also needs information about its position in the world. This information can be expressed in different ways: relative to some global coordinate system, or relative to some object. A combination is likely needed, because every physical contact requires the robot to position itself relative to the object, whereas the robot needs its global position when reasoning about how to travel from one place to another. Localization of each robot is therefore unavoidable and very important. We use a laser scanner to improve robot localization; our laser scanner model is the URG-04LX-UG01. To process the laser data and obtain the best result, we implemented an Iterative Closest Point (ICP) and Trimmed ICP algorithm based on point-to-point matching. The problem is to find the Euclidean transformation that brings an N_po-point subset of the data set P into the best possible alignment with the model set M.
For a Euclidean motion with rotation matrix R and translation vector t, denote the transformed points of the data set by p_i(R, t) = R p_i + t. Define the individual distance from a data point p_i(R, t) to the model set M as the distance to the closest point of M: d_i = min_{m in M} ||m - p_i(R, t)||.
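The point-to-point matching just described can be sketched in 2D: pair each data point with its nearest model point, fit the rigid transform in closed form, and iterate. This is an illustrative Python sketch, not the team's C# implementation, and the trimming step of Trimmed ICP is omitted.

```python
import math

def best_fit_transform(P, M):
    """Closed-form 2D rigid transform (theta, t) aligning P to M,
    assuming P[i] corresponds to M[i]."""
    n = len(P)
    cpx = sum(p[0] for p in P) / n; cpy = sum(p[1] for p in P) / n
    cmx = sum(m[0] for m in M) / n; cmy = sum(m[1] for m in M) / n
    s_cross = s_dot = 0.0
    for (px, py), (mx, my) in zip(P, M):
        ax, ay = px - cpx, py - cpy   # data point, centred
        bx, by = mx - cmx, my - cmy   # model point, centred
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)
    tx = cmx - (cpx * math.cos(theta) - cpy * math.sin(theta))
    ty = cmy - (cpx * math.sin(theta) + cpy * math.cos(theta))
    return theta, (tx, ty)

def icp(P, M, iters=20):
    """ICP: re-pair each data point with its nearest model point,
    then re-fit the transform, repeatedly."""
    theta, (tx, ty) = 0.0, (0.0, 0.0)
    Q = list(P)
    for _ in range(iters):
        pairs = [min(M, key=lambda m: (m[0] - qx) ** 2 + (m[1] - qy) ** 2)
                 for qx, qy in Q]
        theta, (tx, ty) = best_fit_transform(P, pairs)
        c, s = math.cos(theta), math.sin(theta)
        Q = [(c * px - s * py + tx, s * px + c * py + ty) for px, py in P]
    return theta, (tx, ty)
```

A real scan matcher would add the Trimmed-ICP step of discarding the worst-matching fraction of pairs before fitting.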
Therefore we use the Rapidly-exploring Random Tree (RRT) algorithm for avoiding obstacles and finding the best path.

Algorithm BuildRRT
  Input: initial configuration q_init, number of RRT vertices K, incremental distance Δq
  Output: RRT graph G

  G.init(q_init)
  for k = 1 to K
      q_rand ← RAND_CONF()
      q_near ← NEAREST_VERTEX(q_rand, G)
      q_new ← NEW_CONF(q_near, q_rand, Δq)
      G.add_vertex(q_new)
      G.add_edge(q_near, q_new)
  return G

Path planning using RRT: [Figure]
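The BuildRRT pseudocode can be made concrete; a minimal runnable sketch in Python (illustrative, not the team's C# implementation; here NEW_CONF steps a distance Δq from q_near toward q_rand, and obstacle checking is omitted):

```python
import math
import random

def build_rrt(q_init, K, dq, rand_conf):
    """Grow a rapidly-exploring random tree from q_init in the plane.
    rand_conf() samples a random configuration; returns {child: parent}."""
    tree = {q_init: None}
    for _ in range(K):
        q_rand = rand_conf()
        # NEAREST_VERTEX: closest existing tree node to the sample
        q_near = min(tree, key=lambda q: math.dist(q, q_rand))
        d = math.dist(q_near, q_rand)
        if d == 0:
            continue
        # NEW_CONF: step at most dq from q_near toward q_rand
        step = min(dq, d) / d
        q_new = (q_near[0] + step * (q_rand[0] - q_near[0]),
                 q_near[1] + step * (q_rand[1] - q_near[1]))
        tree[q_new] = q_near  # add_vertex + add_edge
    return tree
```

A planner would then walk parent links back from the node nearest the goal to recover a path, rejecting q_new candidates that collide with obstacles.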
Eye Chart

Image processing refers to the processing of a 2D picture by a computer. Before an image is processed it is converted into digital form; digitization includes sampling the image and quantizing the sampled values. Once the image is converted to bit information, processing such as image enhancement, image reconstruction, or image compression is performed. We process images to find eye charts in the following steps. First, the white background is removed. Then what remains is compared to the character 'E' at different sizes; if more than four are found, we have an eye chart. After cleaning the background, lines remain, some of them parallel. Among those parallel lines there should be three whose lengths are almost equal and whose spacing is quite short; if another line meets these three parallel lines at a right angle, we take them together as an 'E' character.

10. Flying Robot Overview

Flying robots play an important role in research and development because humans cannot access every location on Earth; with a flying robot, information about the environment and location details can be gathered. We control the flying robot from a computer via a radio module designed by the team, with a comfortable user interface. Unlike most aerial robots it is fully automatic and, depending on the environment, it is assisted by the laser scanner.
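The 'E'-counting steps described under Eye Chart can be sketched as template matching on a binarized image. This is a toy illustration under strong assumptions (exact matching of a single fixed-size template on a clean binary image); the real pipeline would match at multiple scales and tolerate noise.

```python
# Hypothetical 5x3 binary template for the character 'E'
E = ["111",
     "100",
     "111",
     "100",
     "111"]

def count_matches(image, template=E):
    """Count exact occurrences of the template bitmap in a binary image
    given as a list of equal-length strings of '0'/'1'."""
    th, tw = len(template), len(template[0])
    hits = 0
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            if all(image[r + i][c:c + tw] == template[i] for i in range(th)):
                hits += 1
    return hits

def looks_like_eye_chart(image):
    # the text's rule: more than four 'E' characters found
    return count_matches(image) > 4
```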
A. Main Processor

The robot's central board and its relationship with the drivers: the robot has a metallized PCB on which the 6-DOF IMU sensor values are read in analog form using an ADC. The circuit has two Kalman filters, one in hardware and one in software, which improve the IMU outputs. A PID controller and a PD controller are responsible for keeping the robot balanced. The CMPS03 I2C compass module output is also read by the ATmega328. The main processor supports Windows 7, with a 1.6 GHz clock frequency and 254 KB RAM.

B. Hardware Architecture
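The Kalman filtering of the IMU channels mentioned under Main Processor can be illustrated with a one-dimensional Kalman filter smoothing a single noisy sensor channel. This is a sketch only: the team's filter design (state model, noise parameters, hardware stage) is not given, and the values below are hypothetical.

```python
class ScalarKalman:
    """1-D Kalman filter for smoothing a noisy sensor reading.
    q: process noise variance, r: measurement noise variance."""
    def __init__(self, q, r, x0=0.0, p0=1.0):
        self.q, self.r = q, r
        self.x, self.p = x0, p0   # state estimate and its variance

    def update(self, z):
        # predict: state unchanged, uncertainty grows by process noise
        self.p += self.q
        # correct: blend prediction and measurement by the Kalman gain
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Fed a stream of raw accelerometer or gyro samples, the estimate converges to the underlying signal while suppressing measurement noise.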
11. System Cost

Most of the mechanical parts were designed and built by our team members. Depending on the type of motor, we bought our DC servo motors from Maxon and Faulhaber; the ultrasonic sensors, IMU, and digital compass were bought from robot-electronics, and the other sensors, wireless equipment, LAN hardware, and modem from local shops. The cost of each robot differs with its size and complexity, but each robot cost us approximately US$45,000.

12. Lessons Learned

An Urban Search and Rescue (USAR) robot requires capabilities in mobility, sensory perception, planning, mapping, and practical operator interfaces while searching for victims in unstructured environments. The robot therefore needs a high-power, flexible mechanism to overcome hard obstacles, and it should be intelligent in control, map generation, and victim detection as well. The team's goal now is to join the autonomous flying robot with the rescue robot: mapping and searching with the flying robot, and traveling and rescuing with the rescue robot.
