Abstract — The new-generation humanoid robots, such as the NAO robot made by Aldebaran Robotics, are characterized by autonomous learning and close interaction with the environment, including humans. While equipped with advanced vision and audio features including object and face recognition, speech recognition, sound source recognition, speech synthesis, etc., the NAO robot’s tactile sensing is limited to several buttons that can be used to trigger associated actions. This paper presents the wireless integration of tactile sensing on the hand of a NAO robot. Without any replacement or modification of its existing hardware, the add-on allows the NAO robot to differentiate objects with similar size, color and shape but different weight, stiffness, or texture.

I. INTRODUCTION

The “sense of touch” is particularly important among the various modalities that are needed to perceive and react to the dynamics of the real world. It allows the assessment of object properties such as size, shape, weight, stiffness, and texture. The new-generation humanoid robots are characterized by autonomous learning and close interaction with their environments, including humans. Tactile sensing, combined with vision and/or audio in many cases, enhances the multisensory perception of humanoid robots. Considerable research has been conducted on robot tactile sensing [1]-[7], especially on humanoid robots [5]-[7]. The advancements in intelligent humanoid robots provide new opportunities and challenges for tactile sensing: for example, the smaller finger area compared to robot manipulators, the coordination between vision, touch, and other senses, and requirements on the touch sensory system to avoid interference with mobility. The humanoid robot NAO (shown in Fig.
1), made by Aldebaran Robotics, has been used by more than 350 universities and research labs around the world for a wide variety of research topics in robotics as well as computer science, human-machine interaction, and social sciences. While equipped with advanced vision and audio features including object and face recognition, speech recognition, sound source recognition, speech synthesis, etc., the NAO robot’s tactile sensing is limited to several buttons that can be used to trigger associated actions. The main objective of our project is to enhance the NAO robot’s perception and intelligence by giving it the capability of identifying different objects by their weight, stiffness and texture.

L. G. Ni is an associate professor at the Electrical and Computer Engineering Department, College of Engineering, California Baptist University, Riverside, CA 92504 USA (corresponding author, phone: 951-343-4470; fax: 951-343-4782; e-mail: gni@calbaptist.edu). D. P. Kari recently received his BS ECE degree from California Baptist University. He is currently a Master EE degree candidate at the Bourns College of Engineering, University of California - Riverside, CA 92521 USA (e-mail: David.Kari@calbaptist.edu). A. Muganza, B. Dushime, and A. Zebaze are recent BS ECE, BS ECE and BS ME graduates from California Baptist University (e-mail: Alex.Muganza@calbaptist.edu, Bertrand.Dushime@calbaptist.edu, Andre.Zebaze@calbaptist.edu).

Figure 1. The humanoid robot NAO.

Several features of the NAO robot make it appealing to our research. With its intuitive graphical interface, the programming software developed by Aldebaran Robotics for the NAO robot, called Choregraphe, allows easy programming of the behaviors required for active exploration of objects using the touch sense. For instance, the robot needs to grasp the object with one hand and stroke the object surface with the other hand in order to identify the surface texture.
The built-in speech recognition and speech synthesis capabilities of the NAO robot make the development more interactive, as the robot asks questions about the object it is touching or tells the name of the object based on tactile properties. The built-in visual object recognition capability provides an opportunity for fusion of vision and touch in future research.

This paper is organized as follows: The system configuration, including hardware setup and software components, is presented in Section II. The calibration and testing of tactile sensors are discussed in Section III. The experimental results of active exploration and identification of objects are presented in Section IV. Conclusions and directions for future work are summarized in Section V.

Wireless Integration of Tactile Sensing on the Hand of a Humanoid Robot NAO
Liya Grace Ni, Senior Member, IEEE, David P. Kari, Member, IEEE, Alex Muganza, Member, IEEE, Bertrand Dushime, Member, IEEE, and Andre N. Zebaze
2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication. September 9-13, 2012. Paris, France.
978-1-4673-4606-1/12/$31.00 ©2012 IEEE

II. SYSTEM CONFIGURATION

From the system integration point of view, the framework of our wireless integration of tactile sensing on the NAO robot can be applied to other commercially
available humanoid robots, or to adding other external sensors to the NAO robot.

A. Hardware Setup

In order to avoid violation of the warranty, we have the constraint of not replacing or modifying any existing hardware on the NAO robot. The hardware components we used in the system, besides the NAO robot itself, are listed in Table I, along with brief descriptions of their functionalities, their locations, and mounting methods. Fig. 2 shows how the hardware components are physically mounted on the NAO robot.

TABLE I. HARDWARE COMPONENTS

Component Name                 Functionality                            Location          Mounting Method
FlexiForce sensors             Tactile sensing                          Fingers           Double sticky tape
Printed Circuit Board (PCB)    Auxiliary circuit for sensors            Upper arm         Velcro strap & tape
RF Link transmitter            Transmits sensor data                    Back              Enclosure box
Arduino Mega 2560 w. battery   Interface between PCB and                Back              Enclosure box
                               RF transmitter
RF Link receiver               Receives sensor data                     Not on the robot  N/A
Arduino Uno                    Interface between RF receiver            Not on the robot  N/A
                               and computer
Computer                       Processes sensor data; delivers          Not on the robot  N/A
                               behavioral commands to the NAO robot

Figure 2. Hardware mounted on the NAO robot.

The configuration of the five FlexiForce sensors on the NAO robot’s three-fingered right hand is shown in Fig. 3. On both the left and right fingers, one sensor is mounted at the tip and one at the center. The fifth one is at the tip of the thumb. The same sensor labeling as shown in Fig. 3 will be used later in Section IV.

Figure 3. Configuration of the five sensors on NAO’s hand.

The data flow in the system is illustrated in Fig. 4. The tactile sensor measurements are sent to the computer wirelessly through the RF module and microcontrollers. The computer analyzes and logs the sensor measurements, and sends speech and behavior commands to the CPU on the NAO robot itself wirelessly. The connection between the NAO robot and the tactile sensors is only a physical attachment, without data flow in between.

Figure 4. Data flow chart.

B.
Software Components

The software we developed for this project contains several components, as listed in Table II.

Choregraphe allows easy capture of the joint angles for the starting and ending positions of each motion we later implemented on the NAO robot for the integration of touch sensing. Fig. 5 shows how the arm angles were captured in Choregraphe.

The cross-platform Arduino IDE is used to program the microcontrollers that interface with the RF transmitter and receiver, which communicate with 434 MHz radio frequency signals. With the restriction of a maximum 4800 bits per second (BPS) data rate of the RF module, currently the data packets containing measurements of all five sensors are transmitted at a frequency of 25 Hz.
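As a quick feasibility check on those figures: on a UART-style serial link, each byte costs roughly 10 bits on the wire (8 data bits plus start/stop framing). The 7-byte packet layout sketched below (a start byte, five sensor bytes, and a checksum) is purely an illustrative assumption — the paper fixes only the 4800 BPS rate and the 25 Hz packet frequency — but it shows the link has ample headroom:

```python
# Back-of-the-envelope check of the RF link budget. The 7-byte packet
# layout (1 start byte + 5 sensor bytes + 1 checksum byte) is an
# illustrative assumption; only the 4800 bps data rate and the 25 Hz
# packet frequency come from the system described here.

BITS_PER_BYTE_ON_WIRE = 10  # 8 data bits + start/stop framing per byte

def max_packet_rate(baud_bps, packet_bytes):
    """Upper bound on packets per second over a UART-style serial link."""
    return baud_bps / (packet_bytes * BITS_PER_BYTE_ON_WIRE)

rate = max_packet_rate(4800, 7)
print(f"max packet rate: {rate:.1f} packets/s")  # ~68.6, so 25 Hz fits
```

Even with this pessimistic framing overhead, the link supports more than double the 25 Hz rate actually used.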
TABLE II. SOFTWARE COMPONENTS

Component Description          Development Language  Development Stage  Location of Execution
Motion recording               Choregraphe           Preparation        Computer
Wireless communication         C                     Integration        Microcontrollers
Main application               C#                    Integration        Computer
Speech and behavioral modules  Python                Integration        The NAO robot
Display of measurements        C#                    Testing            Computer

Figure 5. Arm angles obtained in Choregraphe.

The main application involves a learning process for the NAO robot based on tactile information including weight, stiffness and roughness. Just like how a toddler learns about objects in his/her surroundings, the NAO robot will go through the following steps during its learning process:

Step 1. Pick up an object and learn how heavy/light, how hard/soft, and how rough/smooth it is, with measurements from tactile sensors and associated actions.

Step 2. Characteristics extracted from the measurements are compared with the corresponding features of objects in the database. Decisions are made as follows:
- If the actual features of weight, stiffness and roughness are close to the features of the current object in the database, in other words, the absolute values of the differences are below predefined thresholds, say the name of the current object.
- If the actual features do not match the features of the current object, and the current object is not the last one in the database, move on to the next object.
- If the actual features do not match the features of the current object, and the current object is the last one in the database, go to Step 3.

Step 3. Ask the name of the object and add it to the database.

The software flow of the main application is shown in Fig. 6. Although the main application was developed in C#, in order to use the NAO SDK to send speech and behavioral commands to the NAO robot, a Python script was written for each action and was invoked in the C# program.

Figure 6.
The Unified Modeling Language (UML) diagram of software flow in the C# application.

A graphical user interface programmed in C# is embedded in the main application to display the weight, stiffness and roughness data during testing and demonstrations.

III. SENSOR CALIBRATION AND TESTING

A. Design of Printed Circuit Board (PCB)

The FlexiForce sensor is an ultra-thin and flexible printed circuit that uses a resistive-based technology. The application of a force to the active sensing area of the sensor results in a change in the resistance of the sensing element in inverse proportion to the force applied. A modified version of the recommended amplifier circuit in the user manual [8] is shown in Fig. 7.
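For a sense of scale, the ideal transfer characteristic of this standard inverting configuration is Vout = -VT * (RF / RS), where RS is the force-dependent sensor resistance. Since RS falls as force rises, a negative VT gives an output that grows with force. The sketch below uses the RF and VT values selected in this design; the RS values themselves are illustrative assumptions, not measured sensor data:

```python
# Hedged sketch: ideal transfer characteristic of the inverting amplifier
# used to read a FlexiForce sensor, Vout = -VT * (RF / RS). RF = 100 kOhm
# and VT = -1.5 V are the values chosen in this design; the sensor
# resistances RS below are illustrative assumptions only.

RF = 100e3   # feedback resistance (ohms)
VT = -1.5    # drive voltage (volts)

def amp_output(rs_ohms):
    """Ideal op-amp output voltage for a sensor resistance rs_ohms."""
    return -VT * (RF / rs_ohms)

# Lower RS corresponds to higher applied force, so Vout rises with force.
for rs in (1e6, 300e3, 100e3):
    print(f"RS = {rs / 1e3:.0f} kOhm -> Vout = {amp_output(rs):.2f} V")
```

This is why RF and VT set the sensitivity: both scale the slope of the voltage-versus-conductance line.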
Figure 7. Amplifier circuit for FlexiForce sensors [8].

The feedback resistance RF as well as the drive voltage VT can be used to adjust the sensitivity of the sensor. A feedback resistance value of 100 kΩ and a drive voltage of -1.5 V were selected in our design. A two-step process was implemented to supply the -1.5 V to the sensors. First, a voltage regulator consisting of two 1N914 diodes connected in series and a 240 Ω resistor provides +1.5 V from a 5 V supply from the microcontroller. Next, an ADM660 Switched Capacitor Voltage Converter was used to convert it to -1.5 V. Considering the constraint on the size of the PCB in order to mount it on the robot’s upper arm, we chose a quad Op-Amp chip MCP6004 and a dual Op-Amp chip MCP6002 to provide all the Op-Amps needed in the circuit. The layout of the custom PCB is shown in Fig. 8.

Figure 8. PCB layout.

B. Sensor Calibration

Two different methods were used to calibrate the sensor. First, a flat load was applied to the sensor, with its weight changed by adding additional mass on top of it. Second, a plastic ball, or spherical load, was used as the test object. The weight was again changed by adding additional mass on top of the ball. The sensor was calibrated over a range of 0 to approximately 2 N. The results are shown in Fig. 9. There is a linear relationship for a flat load between the voltage output from the sensor and the force applied to the sensor. The linear equation fits the data with an R² value (a statistical metric that indicates how closely the curve fits the data points) of 0.9875. The values obtained for the spherical loads are higher than the values obtained for the flat loads, likely because of the smaller contact area for the spherical loads. The results indicate that at higher values of weight, the spherical load more closely approximates a flat load because of more even distribution of the load due to compression of the spherical object.

Figure 9.
Experimental results of FlexiForce sensor measurements (square: flat loads, circle: spherical loads).

Next, the FlexiForce sensors were attached to the fingers of the NAO robot and calibration was performed with the following configuration: the robot’s right arm and hand were kept still and a plastic ball was placed in its hand at a fixed position. Measurements were taken from the two sensors mounted at the centers of the left and right fingers. Then the sensors were replaced by two other sensors, and so on. The weight of the plastic ball was adjusted by adding water through a hole on its top. The voltage for a particular mass was obtained by averaging the voltages measured by the sensors. Fig. 10 shows the experimental results of the ranges and average voltages of the five sensors versus different weight inputs.

Figure 10. Calibration results with sensors on the NAO robot (bar: range of voltages, square: average of voltages).

IV. EXPERIMENTAL RESULTS

In this preliminary study, three objects, namely a golf ball, a ping pong ball and a cotton ball, which are of similar size, shape and color but different weight, stiffness and roughness, were chosen for our experiments.
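The Step 2/Step 3 decision logic of Section II-B, which these experiments exercise, can be sketched as a thresholded linear search over the object database. All feature values and threshold values below are illustrative assumptions; the actual application is the C# program with Python action scripts described earlier:

```python
# Hedged sketch of the learning loop from Section II-B. Each database
# entry stores the three indicators (weight, stiffness, roughness).
# The feature values and thresholds here are illustrative assumptions,
# not the measurements logged by the real C# application.

THRESHOLDS = {"weight": 0.05, "stiffness": 0.5, "roughness": 2.0}

def identify(features, database):
    """Step 2: return the first database entry matching within thresholds."""
    for name, stored in database.items():
        if all(abs(features[k] - stored[k]) < t for k, t in THRESHOLDS.items()):
            return name
    return None  # reached the last object without a match

def learn(features, database, ask_name):
    """Steps 2-3: identify the object, or ask its name and add it."""
    name = identify(features, database)
    if name is None:
        name = ask_name()            # Step 3: ask the user for the name
        database[name] = dict(features)
    return name

db = {}
learn({"weight": 0.16, "stiffness": 1.54, "roughness": 12.8}, db, lambda: "golf ball")
learn({"weight": 0.00, "stiffness": 1.20, "roughness": 1.0}, db, lambda: "ping pong ball")
# A later golf-ball-like reading now falls within the thresholds:
print(learn({"weight": 0.15, "stiffness": 1.60, "roughness": 12.0}, db, lambda: "?"))
```

With only three well-separated objects, fixed per-feature thresholds suffice; a larger object set would call for tuned thresholds or a proper nearest-neighbor rule.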
A. Comparison of Weight

The NAO robot was programmed to reach out with its right forearm and open its right hand. At the same time, it asked, “Give me the ball.” The golf ball was then placed in its right hand. The measurements from the two sensors mounted at the centers of the fingers are shown in Fig. 11. The sensors at the finger tips and on the thumb were not pressed, due to the size and shape of the objects in our experiments; therefore their readings were discarded. A period of five seconds was allotted to complete the test, and the voltage samples from each of the above two sensors throughout the testing period were averaged and logged in the database. A progress bar on the display shows how much of the five-second period has elapsed. As can be observed from Fig. 11, the center of gravity is closer to one of the two fingers during that particular test. Therefore, the average of the sensor #3 and sensor #4 voltages is recorded as the indicator of object weight.

Figure 11. Display of weight test result for a golf ball.

This process was repeated for a ping pong ball and a cotton ball. The average voltages over a period of five seconds for sensors #3 and #4 are shown in Table III for all three objects. It can easily be observed from the sensor data that the weight of the golf ball is much higher compared to either the ping pong ball or the cotton ball.

B. Comparison of Stiffness

Stiffness is an important property of an object that can be obtained using the sense of touch. Stiffness is defined as the extent to which an object resists deformation in response to an applied force.
Ideally, the stiffness of the object being identified by the NAO robot should be calculated as

k = F/x,

where F is the force applied on the object and x is the amount of deformation of the object surface at the contact point. Unfortunately, the positions of the fingertips of the NAO robot cannot be obtained programmatically, which means the measurement of object deformation using the penetration of the fingertip into the object is very difficult to implement, if not impossible. Therefore, the sensor measurements were utilized to characterize stiffness.

The NAO robot’s right hand was open for the weight test. When the stiffness test started immediately after the weight data were logged, the robot was commanded to secure the ball using its left hand from above and then close its right hand slowly. The assistance of the left hand was necessary, especially for the ping pong ball, which could easily have slipped out of the NAO robot’s hand. The display of sensor voltage readings and corresponding forces for a ping pong ball is shown in Fig. 12.

Figure 12. Display of stiffness test result for a ping pong ball.

As can be observed from Fig. 12, the sensor on the thumb (labelled as sensor #5) and the one at the center of the left finger (labelled as sensor #4) showed high voltages, but the others showed zero voltages. This can be explained by how the ping pong ball was grasped by the NAO robot’s right hand. The ping pong ball was held mainly between the thumb and the left finger, while the right finger was almost simply resting on the surface of the ball. Due to the relative size of the ball versus the hand, the finger tips were slightly above the ball.

TABLE III. WEIGHT MEASUREMENTS

Object          Sensor #3 Average Voltage (V)  Sensor #4 Average Voltage (V)
Golf Ball       0.08                           0.24
Ping Pong Ball  0.00                           0.00
Cotton Ball     0.08                           0.07
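As a concrete illustration of substituting grasp-phase voltages for k = F/x, the stiffness indicator used here reduces to a mean over the five sensors. The readings below are the stiffness-test voltages reported in Table IV; the helper function name is our own:

```python
# Sketch of the stiffness indicator: since fingertip deformation x is not
# measurable on the NAO, the mean of the five FlexiForce voltages during
# the grasp stands in for k = F/x. Readings below are from Table IV.

def stiffness_indicator(voltages):
    """Average of the five sensor voltages recorded during the grasp."""
    return sum(voltages) / len(voltages)

readings = {
    "golf ball":      [0.03, 0.61, 0.01, 2.05, 4.99],
    "ping pong ball": [0.00, 0.00, 0.00, 2.12, 3.89],
    "cotton ball":    [0.02, 0.02, 0.02, 0.02, 0.02],
}

for name, v in readings.items():
    print(f"{name}: {stiffness_indicator(v):.3f} V")
# The resulting ordering, golf > ping pong > cotton, matches Section IV-B.
```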
Sensor voltages from the stiffness test are shown in Table IV for all three objects. Although both the ping pong ball and the cotton ball are very light, as mentioned in Section IV-A, the sensor voltages in Table IV show their difference in stiffness clearly, which was used for object identification later. The sensor voltages for the golf ball were even higher than those for the ping pong ball, which met our expectation.

In our experiment with only three objects, a golf ball, a ping pong ball, and a cotton ball, the average of all sensor measurements is sufficient to serve as an indicator of stiffness. However, for objects with less difference in stiffness, the sensor measurements should be analyzed more selectively.

C. Comparison of Roughness

Because currently only the right hand of the NAO robot is equipped with the FlexiForce sensors, we programmed the robot to put the ball in its left hand immediately after the stiffness data were logged in the database. When the left hand grasped the ball firmly, the right hand started stroking the surface of the ball with the tip of one finger.

The interpretation of tactile sensor measurements for the roughness of an object surface is more challenging than for weight and stiffness. Research in surface texture discrimination by robots has been advanced by both the development of tactile sensing arrays and algorithms for temporal or spatiotemporal analysis of the sensor data [9][10]. The Fast Fourier Transform (FFT) was performed on the voltage data collected from the sensor mounted on the tip of the finger that stroked the object surface. By comparing the spectrum of the golf ball data and that of the ping pong ball data shown in Fig. 13, we noticed a significant difference at the high end of the frequency range. The sampling rate is limited to 25 Hz by the data rate of the RF module.
The magnitude at the frequencies close to 12.5 Hz (half of the sampling frequency) for the golf ball, which has a rough surface, is obviously higher than that of the ping pong ball, which has a smooth surface. Therefore, the magnitude of the FFT at the highest frequency, 12.5 Hz, was logged in the database for each object as the indicator of roughness.

Figure 13. Roughness test results: (a) golf ball; (b) ping pong ball.

D. Object Identification

The NAO robot asked for the name of the object after the weight, stiffness and roughness data were all logged in the database. The learning process was repeated for all three objects. The main application continued with object identification following the learning process. The NAO robot was programmed to ask the user to give it a ball. After a ball was randomly selected and placed in its right hand, it was able to identify whether it was a golf ball, a ping pong ball, or a cotton ball.

V. CONCLUSIONS AND FUTURE RESEARCH

A tactile sensing system with five FlexiForce sensors and wireless communication was successfully integrated with the NAO humanoid robot. Active exploration behaviors were programmed on the NAO robot, and software interpretation of the sensor voltages was implemented for weight, stiffness, and roughness respectively. The NAO robot was able to learn these properties of a golf ball, a ping pong ball and a cotton ball, and to identify them based on their differences.

Future research includes investigation of more advanced tactile sensing technology such as MEMS tactile sensor arrays; improvement of hardware integration, for example, using wearable LilyPad microcontrollers; integration of tactile sensing with the NAO robot’s existing visual object recognition capability; and, last but not least, evaluation of the accuracy of tactile-sensing-based object identification with testing on a large variety of objects with different weight, stiffness and roughness.

REFERENCES

[1] R. D. Howe, “Tactile sensing and control of robotic manipulation,” J. Adv.
Robot., vol. 8, no. 3, pp. 245-261, 1994.

TABLE IV. STIFFNESS MEASUREMENTS

Object          Sensor Voltages (V)
                #1     #2     #3     #4     #5
Golf Ball       0.03   0.61   0.01   2.05   4.99
Ping Pong Ball  0.00   0.00   0.00   2.12   3.89
Cotton Ball     0.02   0.02   0.02   0.02   0.02
[2] J. S. Son, “Integration of Tactile Sensing and Robot Hand Control,” Ph.D. dissertation, School of Engineering and Applied Sciences, Harvard Univ., Cambridge, MA, 1996.
[3] K. Suwanratchatamanee, M. Matsumoto, and S. Hashimoto, “Human-machine interaction through object using robot arm with tactile sensors,” in Proc. 17th IEEE Int. Symp. Robot Human Interactive Commun., Munich, Germany, 2008, pp. 683-688.
[4] M. Ohka, H. Kobayashi, J. Takata, and Y. Mitsuya, “Sensing precision of an optical three-axis tactile sensor for a robotic finger,” in Proc. 15th IEEE Int. Symp. Robot Human Interactive Commun., Hatfield, U.K., 2006, pp. 214-219.
[5] R. Kageyama, S. Kagami, M. Inaba, and H. Inoue, “Development of soft and distributed tactile sensors and the application to a humanoid robot,” in Proc. IEEE Int. Conf. Systems, Man, and Cybernetics, Tokyo, Japan, 1999, pp. 981-986.
[6] P. Mittendorfer and G. Cheng, “Humanoid multimodal tactile-sensing modules,” IEEE Trans. Robotics, vol. 27, no. 3, pp. 401-410, 2011.
[7] R. S. Dahiya, G. Metta, M. Valle, and G. Sandini, “Tactile sensing – from humans to humanoids,” IEEE Trans. Robotics, vol. 26, no. 1, pp. 1-20, Feb. 2010.
[8] Tekscan Inc., FlexiForce Sensors User Manual, 2008. http://www.tekscan.com/pdf/FlexiForce-Sensors-Manual.pdf
[9] H. B. Muhammad, C. Recchiuto, C. M. Oddo, L. Beccai, C. J. Anthony, M. J. Adams, M. C. Carrozza, and M. C. L. Ward, “A capacitive tactile sensor array for surface texture discrimination,” Microelectronic Engineering, vol. 88, Jan. 2011, pp. 1811-1813.
[10] C. J. Cascio and K. Sathian, “Temporal cues contribute to tactile perception of roughness,” J. Neurosci., vol. 21, no. 14, pp. 5289-5296, 2001.
