Student Research Paper Conference Vol-2, No-15, July 2015
Design and Development of Human Following Robot
Muhammad Sarmad Hassan, Mafaz Wali Khan, Ali Fahim Khan
Department of Electrical Engineering
Institute of Space Technology, Pakistan
sarmadhassan@hotmail.com, mafazwali01@yahoo.com, alifahimkhan@gmail.com
Abstract—For a robot that operates autonomously, communication between the person and the robot is the most important factor, and significant interest has been observed in the use of such technology. This research makes a modest contribution to the development of such robots. A fully autonomous robot should not only complete the tasks required of it but also establish a connection with the person operating it. A great deal of research has been done on these kinds of robots, and much work remains. For a robot to communicate and interact with a person, it must be capable of following that particular person; it therefore needs the capacity to gather information from its surroundings while pursuing the required target. The primary goal of our work was to design and fabricate a robot that not only tracks the target but also moves toward it while tracking. To simplify detection, a unique handmade tag is placed on the person the robot is to follow. The main difficulty in this kind of work is that detection of the target is sensitive to carry out: the target has to be unique for the robot to recognize it and carry out its objective. The simple tag removes this uniqueness problem and makes the task considerably easier. A small camera records the video, and the processor analyzes it to extract the desired information. Protecting the robot from collision with the target is another problem to be tackled, and a distance sensor is used for this purpose. All the processing is carried out by the microprocessor, while control of the motors is handled by the controller.
Keywords — Human following, human tracking, visual imaging,
human robot interaction, laser range scanners, image tag,
envelope detection, service robots, depth image mapping, thermal
conductivity.
I. INTRODUCTION
Robotic technology has advanced appreciably in the past couple of years. Such innovations were only a dream for some people only a few years back. But in this fast-moving world there is now a need for robots, such as a human following robot, that can interact and co-exist with people [1].
To perform this task accurately, the robot needs a mechanism that enables it to visualize the person and act accordingly [2][3]. The robot must be intelligent enough to follow a person in crowded areas, visually cluttered environments, and both indoor and outdoor settings [4].
The image processing carried out to obtain visual information about the surroundings is critical. The following points should be observed during processing:
 Lighting conditions should be stable and should not fluctuate.
 Color ranges should be tuned for the environment in which tracking is performed.
 The target should not be too far from the visual sensor, since distance strongly affects detection.
 Colors matching those of the target should be avoided around the robot; otherwise the robot may confuse them with the tag.
Typically, human following robots are equipped with a diverse combination of sensors, i.e. light detection and ranging (LIDAR) sensors, radio frequency identification (RFID) modules, laser range finders (LRF), infrared (IR) sensing modules, thermal imaging sensors, cameras, and wireless transmitters/receivers, for recognizing and locating the target. All the sensors and modules work in unison to detect and follow the target.
The capability of a robot to track and follow a moving object can be used for several purposes:
 to assist humans;
 to make everyday tasks easier for people;
 for defense applications.
In this paper, we present a method for a human following robot based on tag identification and detection using a camera. Intelligent tracking of the specified target is carried out using different sensors and modules [5], i.e. an ultrasonic sensor, a magnetometer, infrared sensors, and a camera. An intelligent decision is made by the robot control unit based on the information obtained from these sensors and modules, finding and tracking the particular target while avoiding obstacles and without colliding with the target.
II. RELATED WORK
So far, a great deal of research has been done on robots that fall into the category of “assisting robots”. Researchers have used different approaches and algorithms to implement their designs, with their primary focus on robots that follow a target.
A laser sensor was used by Burgard in his tour-guide robot for human tracking [6]. An LRF was incorporated by D. Schulz to perform the following task, using data association for detection [7]. Bellotto and Hu used an LRF to identify different styles of leg movement and fused this information with the information obtained from a camera [8]. Depth imaging was used by Songmin Jia to carry out the detection, with a model of the person determined from the depth data [9]. Mehrez Kristou used the person's particular style of clothing, employing an omnidirectional camera and also incorporating an LRF into the design [10]. Wilhelm conducted research focused on the color of the particular person's skin, also using information from different sensors [11].
Other research work has also been conducted in this area. Depth imaging was used by Calisi, who pursued the target with a specially designed algorithm [12]. Ess and Leibe carried out extensive work on object tracking and detection; the biggest advantage of their method was that the algorithm worked in complex environments as well [13][14]. Stereo vision was used by Y. Salih to perform the detection [15], enabling the required target to be pursued effectively. A combination of different sensors was used by R. Munoz to obtain information about the target to be tracked; in addition, he used stereo vision for accurate information [16]. The sensor data combined with the information from the camera proved very helpful in carrying out the task [17].
Different algorithms continue to be developed by researchers for detection purposes. In one study, a laser was used to identify the motion pattern of a walking person's legs [18][19][20], while a camera was used to detect a particular object or person [3][21]. A very simple technique was also reported in which distance sensors were mounted on both the robot and the person: the sensors on the robot emitted radio waves that were detected by the sensors on the person to be followed, allowing the robot to follow the required target [22].
III. SYSTEM COMPONENTS
Our system consists of a three-wheel robotic vehicle fitted with a separate microprocessor and control unit along with different sensors and modules, i.e. an ultrasonic sensor, a magnetometer, infrared sensors, and a camera.
The camera height is vertically adjustable, and the camera is initially mounted on the robot at a height of 4 ft from the ground to enhance its visual capability and effectiveness.
The robotic vehicle is guided by the user by means of an identification tag.
IV. METHODOLOGY
A systematic research methodology is adopted keeping in
mind the ultimate goal of a fully functional and autonomous
human following robot.
A decentralized top-down approach is used for this project. The project is divided into five modules, each independent of the others. The different phases were carried out step by step, starting from basic sensor testing and proceeding to obstacle avoidance, object detection, object tracking, and data transmission.
Due to the decentralized approach, all modules and sensors act independently. Data obtained from the different sensors and modules is analyzed collectively, and an intelligent decision is made on the basis of this information that instructs the robot to move in a particular direction. Two separate units are used: a microprocessor and a controller. The processing is carried out by the microprocessor, while the information obtained from the sensors is handled by the controller, i.e. an Arduino board. Serial communication between the microprocessor and the controller is established to exchange the visual sensing information.
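The serial exchange described above can be sketched as a message format. The paper does not specify the framing, so the comma-separated "x,y" line below (with empty fields signalling "tag not visible") is an assumption; on the Raspberry Pi side a library such as pySerial would carry these bytes over the UART.

```python
# Hypothetical framing for the processor-to-controller serial link.
# The "x,y\n" layout is an assumption, not the paper's actual protocol.

def encode_center(x, y):
    """Pack the tag's center coordinates into one newline-terminated
    message; None becomes an empty field meaning 'tag not visible'."""
    fx = "" if x is None else str(int(x))
    fy = "" if y is None else str(int(y))
    return f"{fx},{fy}\n".encode("ascii")

def decode_center(msg):
    """Parse one message back into (x, y); empty fields decode to None."""
    fx, fy = msg.decode("ascii").strip().split(",")
    return (int(fx) if fx else None, int(fy) if fy else None)
```

A round trip preserves both valid coordinates and the null case, matching the behavior seen in the serial-communication test: a null value when the tag is out of view, coordinates otherwise.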
This approach is the most suitable because a fault in any one module does not affect the entire system; hence it provides the best possible results while maintaining accuracy.
Human tracking, obstacle avoidance, maintaining a specific distance from the target, and establishing a communication link between the microprocessor and the controller are the main aspects of this project.
V. IMPLEMENTATION
The implementation of a human-interactive robot may seem an easy task, but several problems arise in practice, for example the detection of a particular tag. The tag needs to be unique so that it does not blend into the colors of a vivid environment. We have therefore designed a dedicated algorithm to overcome this problem.
The project has several facets: the mechanical structure, the circuit design, tag detection, and the intelligent tracking system.
The implementation of the human-interactive robot is as follows.
A. Design of a Custom Tag:
To make the target distinctive and unambiguous, a tag was specially designed. The tag has four different colors in a box arrangement. The whole approach relies on the target being specific to the robot: the robot should detect only this target and no other. The four colors enable the robot to detect the person by tracking the tag, which is placed on the person to be pursued. Once the tag is worn, the robot detects the colors and thereby recognizes the target.
Fig. 1. Custom Tag
B. Design of the Mechanical Structure:
The mechanical structure of the robot comprises a two-layer base with a two-wheel differential drive system and a free wheel. It is designed so that the camera can be mounted at a certain height above the ground. The camera height is adjustable to the height of the person so that better visual information can be obtained; initially it is set to 4 ft.
The software design of the mechanical structure is shown below.
Fig. 2. Software Design of Mechanical Structure
C. Design of a Relay-Based Motor Driver Module:
A relay-based motor driver module was designed to control the differential drive of the robotic platform. The advantage of this relay-based module is that it can easily drive high-torque motors that draw large currents; hence it can handle heavy loads.
D. Image Processing Algorithm:
A dedicated algorithm is used to process the real-time video and to detect and follow the unique tag. We used a computer vision camera to recognize the tag on the back of the person, and developed the algorithm on the OpenCV Python platform.
The algorithm uses a tag with four colors and is designed so that the tag is detected only when all four colors appear together and in that particular order. Once the tag is detected, the centroid of the common region of the four colors is computed to obtain the center coordinates of the tag. These center coordinates are then transmitted serially to the control unit, where they are fused with the information obtained from the other sensors and modules to make an intelligent decision.
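The centroid step of this algorithm can be sketched as follows. In the real system, OpenCV's cv2.inRange would threshold the HSV frame into one binary mask per tag color; here small NumPy arrays stand in for those masks so the sketch stays self-contained, and the "all four colors present" rule is reduced to a non-empty check (the color-ordering test is omitted).

```python
import numpy as np

def tag_center(masks):
    """Given one binary mask per tag color, return the (x, y) centroid of
    the combined tag region, or None if any color is missing."""
    # The tag only counts as detected when every color contributes pixels.
    if any(int(m.sum()) == 0 for m in masks):
        return None
    combined = np.logical_or.reduce(masks)
    ys, xs = np.nonzero(combined)          # row/column indices of tag pixels
    return (float(xs.mean()), float(ys.mean()))
```

Returning None when a color is absent is what lets the downstream serial link send a null value whenever the tag leaves the frame.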
VI. SYSTEM DESIGN
The system design consists of separate processing and control units. The processing unit makes use only of a camera and is linked to the control unit to transmit the visual information serially after the bulk of the processing. The control unit is serially linked with the processor and makes use of several sensors and modules, i.e. an ultrasonic sensor, a magnetometer, and infrared sensors.
These sensors and the camera work in unison, helping the robot to operate and to navigate its path while avoiding obstacles and maintaining a specific distance from the target. The decision is made on the basis of the information obtained from all of the above sensors.
The generalized system design incorporating the different sensors and modules is shown below.
Fig. 3. System Design
Looking at the operation of the system above, the first phase is the detection of the tag by the camera and the substantial processing carried out in the processing unit; the processor used is a Raspberry Pi single-board computer. After tag detection, the next phase is to establish serial communication between the processor and the control unit; we used an Arduino as the control unit. In this phase the center-point coordinates of the tag are transmitted serially to the Arduino for further processing.
The next phase is to interface the necessary modules and sensors with the control unit. For this purpose we used an ultrasonic sensor, a magnetometer, and IR sensors for the proper functioning of the robot.
We used the ultrasonic sensor for obstacle avoidance and to maintain a specific distance from the target. The ultrasonic sensor works accurately within a range of 4 meters. Ultrasonic sensors operate by measuring the time difference between an emitted pulse and its returning echo [23].
Fig. 4. Ultrasonic Sensor Principle
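The time-of-flight principle in Fig. 4 reduces to a one-line calculation: sound travels to the obstacle and back, so the distance is half of the speed of sound multiplied by the echo time. The sketch below assumes an echo time measured in microseconds and applies the 4-meter limit quoted above as a validity check.

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343   # ~343 m/s at room temperature
MAX_RANGE_CM = 400                  # the sensor is accurate within 4 meters

def echo_to_distance_cm(echo_us):
    """Convert an echo time in microseconds to a distance in centimeters,
    or None when the reading falls outside the usable range."""
    d = echo_us * SPEED_OF_SOUND_CM_PER_US / 2  # out-and-back path, halved
    return d if 0 < d <= MAX_RANGE_CM else None
```

For example, an echo of about 1166 microseconds corresponds to roughly 20 cm, while readings beyond the 4 m limit are discarded.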
This ultrasonic sensor is placed at the top of the robot, along with the camera module, to maintain accuracy in measuring the distance between the robot and the target.
The flow chart for maintaining a specific distance from the target is shown below.
Fig. 5. Flow Chart of Maintaining Specific Distance
After the ultrasonic sensor, we interfaced the magnetometer to obtain the orientation of the robot in x-y-z coordinates. This module determines the orientation of the robot and gives its heading direction, which is used to determine the deviation of the robot from its original course [24]. On the basis of the information obtained from this module, the control unit determines how much change in direction is required to get back on track after avoiding an obstacle.
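The heading computation can be sketched from the magnetometer's horizontal field components (assuming the robot stays roughly level, so the z axis can be ignored); the course correction then reduces to the signed shortest turn back to the stored heading.

```python
import math

def heading_deg(mx, my):
    """Heading in degrees from the horizontal magnetic field components,
    normalized to the range 0-360."""
    return (math.degrees(math.atan2(my, mx)) + 360.0) % 360.0

def turn_needed(current, target):
    """Signed shortest turn (-180..180 degrees) to return to the target
    heading after an obstacle-avoidance maneuver."""
    return (target - current + 180.0) % 360.0 - 180.0
```

Wrapping the turn into the -180..180 range means the robot always rotates the shorter way around, e.g. from 350 degrees to 10 degrees it turns +20 rather than -340.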
After interfacing the above sensors, the next important part of the system design is to interface wheel encoders to calculate the distance travelled by the robot, eliminating further error in the robotic movement due to displacement. For this purpose we attached two slot sensors on top of the encoders, right beside the wheels. Each slot sensor has an IR transmitter and a photodiode mounted facing each other. The light emitted by the IR LED is alternately blocked by the slots of the encoder disc; this changes the logic level of the photodiode, which is detected by the controller [25].
Fig. 6. IR Slot Sensor
The distance travelled is obtained from the number of counts recorded in the memory of the controller and is used for the distance calculation.
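The distance formula itself is not reproduced in this copy of the paper; the standard relation for a slotted-disc encoder is sketched below, with an assumed slot count and wheel diameter (illustrative values, not the paper's).

```python
import math

SLOTS_PER_REV = 20        # slots on the encoder disc (assumed value)
WHEEL_DIAMETER_M = 0.065  # wheel diameter in meters (assumed value)

def counts_to_distance_m(counts):
    """Distance = (counts / slots per revolution) x wheel circumference."""
    revolutions = counts / SLOTS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_M
```

With these assumed values, one full revolution (20 counts) corresponds to one wheel circumference of travel.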
The final phase consists of fusing all the information obtained from the sensors and modules in the control unit. The control unit then makes an intelligent decision to change the robot's direction, return it to its track, and follow the tagged target on the basis of all the available information: the serially received coordinates from the processor, the distance information from the ultrasonic sensor, the heading direction from the magnetometer, and the distance calculated from the IR encoder sensors.
Fig. 7. Human Following Robotic System.
The system is designed in such a way that it intelligently
makes use of the information obtained from different sensors
and modules. Different sensors are incorporated in the system
to make it able to respond to diverse situations.
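The fusion step can be illustrated as a simple decision function: the tag's x coordinate steers the robot while the ultrasonic distance gates its motion. The frame width, dead band, and stop distance below are illustrative assumptions (the stop distance echoes the 12-inch value the tests settled on), not parameters given by the paper.

```python
FRAME_WIDTH = 640      # camera frame width in pixels (assumed)
DEAD_BAND = 60         # tolerated pixel offset around frame center (assumed)
STOP_DISTANCE_CM = 30  # roughly the 12-inch stopping distance from the tests

def decide(tag_x, distance_cm):
    """Return one of 'search', 'stop', 'left', 'right', 'forward'."""
    if tag_x is None:
        return "search"              # tag lost: rotate until it reappears
    if distance_cm is not None and distance_cm <= STOP_DISTANCE_CM:
        return "stop"                # close enough to the target: halt
    error = tag_x - FRAME_WIDTH / 2  # signed horizontal offset of the tag
    if error < -DEAD_BAND:
        return "left"
    if error > DEAD_BAND:
        return "right"
    return "forward"
```

The dead band prevents the robot from oscillating left and right when the tag sits near the center of the frame.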
VII. EXPERIMENTS AND RESULTS
Different experiments were conducted and the performance
of the human following robot was tested. Each experiment
that was performed took about 10 to 15 minutes. On the basis
of results obtained from these tests and experiments, we made
the necessary changes in the processing and control
algorithm.
The first test was performed on the ultrasonic sensor. It was noted that the sensor worked accurately within a range of 4 meters. We then tested whether the robot maintains a specific distance from the target. Initially we set the stopping distance of the robot to 8 inches, and observed that the robot collided with the target as the distance between them approached 8 inches. The reason was that the stopping distance was too small and the robot, because of its onboard load, could not stop quickly enough. So we increased the distance to 12 inches and verified the routine again.
The next test was performed on the magnetometer module, which gave us the heading direction of the robot with respect to a reference. We observed an offset error in the heading and found that it was due to the placement of the magnetometer module: the module was being disturbed by the magnetic fields of the electronic components. We therefore moved it to the bottom layer, at the center of the robotic structure, after which the heading reading showed no offset error.
The next experiment tested the detection of the tag. We observed that in certain lighting conditions the tag was not detected properly, so we adjusted the hue, saturation, and value thresholds of all four colors, since HSV color thresholds vary with the lighting conditions. After changing the threshold values, the processor detected the tag reliably.
Results are shown below.
Fig. 8. Tag Detection
We then checked the serial communication between the Arduino and the Raspberry Pi. We observed that when the tag was not in front of the camera the processor transmitted a null value for both coordinates, and valid values when the tag was in front of it. Hence, the processor was correctly transmitting the coordinates to the control unit. Results are shown below.
Fig. 9. Serial Communication between Arduino and Raspberry Pi
The final test was performed on the differential drive system to check whether the robot intelligently follows the person. The results were very satisfactory: the robot followed the person reliably wherever they went, and we only adjusted its speed at the end.
Hence the objective of implementing good human-robot interaction was achieved through fast processing of the visual input combined with proper handling of the information from the sensors.
Fig. 10. Human Follower Robot
VIII. APPLICATIONS
Looking closely at our surroundings, it is clear that there is a need for a robot that can assist humans and serve them. Such a robot can be used for many purposes; with a few changes, it can act as a human companion as well.
Some other applications of this robot are:
 assisting in carrying loads for people working in hospitals, libraries, airports, etc.;
 serving people at shopping centers or in public areas;
 assisting elderly people, special children, and babies;
 following a particular vehicle.
IX. FUTURE WORK
There are many interesting applications of this research in different fields, whether military or medical. Wireless communication functionality can be added to the robot to make it more versatile and to control it from a large distance. This capability could also be used for military purposes: by mounting a real-time video recorder alongside the camera, the surroundings can be monitored remotely. The algorithm and the structure can also be modified to fit other purposes, e.g. a vehicle follower.
Similarly, the robot can assist the public in shopping malls by acting as a luggage carrier, removing the need to carry or pull heavy weights; using this algorithm the robot automatically follows the person.
X. CONCLUSION
A successful implementation of a person-following robot is presented in this research. The robot has not only detection capability but also tracking and following ability. Tracking is performed on the tag, and the human is followed on the basis of that detection. Care was taken to make the following capability of the robot as efficient as possible. Tests were performed under different conditions to pinpoint mistakes in the algorithm and correct them. The different sensors integrated with the robot provided an additional advantage.
XI. REFERENCES
[1] K. Morioka, J.-H. Lee, and H. Hashimoto, “Human-following
mobile robot in a distributed intelligent sensor network,”
IEEE Trans. Ind. Electron., vol. 51, no. 1, pp. 229–237, Feb.
2004.
[2] Y. Matsumoto and A. Zelinsky, “Real-time face tracking
system for human-robot interaction,” in 1999 IEEE
International Conference on Systems, Man, and Cybernetics,
1999. IEEE SMC ’99 Conference Proceedings, 1999, vol. 2,
pp. 830–835 vol.2.
[3] T. Yoshimi, M. Nishiyama, T. Sonoura, H. Nakamoto, S.
Tokura, H. Sato, F. Ozaki, N. Matsuhira, and H. Mizoguchi,
“Development of a Person Following Robot with Vision
Based Target Detection,” in 2006 IEEE/RSJ International
Conference on Intelligent Robots and Systems, 2006, pp.
5286–5291.
[4] H. Takemura, N. Zentaro, and H. Mizoguchi, “Development
of vision based person following module for mobile robots
in/out door environment,” in 2009 IEEE International
Conference on Robotics and Biomimetics (ROBIO), 2009, pp.
1675–1680.
[5] N. Bellotto and H. Hu, “Multisensor integration for human-
robot interaction,” IEEE J. Intell. Cybern. Syst., vol. 1, no. 1,
p. 1, 2005.
[6] W. Burgard, A. B. Cremers, D. Fox, D. Hähnel, G.
Lakemeyer, D. Schulz, W. Steiner, and S. Thrun, “The
interactive museum tour-guide robot,” in AAAI/IAAI, 1998,
pp. 11–18.
[7] M. Scheutz, J. McRaven, and G. Cserey, “Fast, reliable,
adaptive, bimodal people tracking for indoor environments,”
in Intelligent Robots and Systems, 2004.(IROS 2004).
Proceedings. 2004 IEEE/RSJ International Conference on,
2004, vol. 2, pp. 1347–1352.
[8] N. Bellotto and H. Hu, “People tracking and identification
with a mobile robot,” in Mechatronics and Automation, 2007.
ICMA 2007. International Conference on, 2007, pp. 3565–
3570.
[9] S. Jia, L. Zhao, and X. Li, “Robustness improvement of
human detecting and tracking for mobile robot,” in
Mechatronics and Automation (ICMA), 2012 International
Conference on, 2012, pp. 1904–1909.
[10] M. Kristou, A. Ohya, and S. Yuta, “Target person
identification and following based on omnidirectional camera
and LRF data fusion,” in RO-MAN, 2011 IEEE, 2011, pp.
419–424.
[11] T. Wilhelm, H. J. Böhme, and H. M. Gross, “Sensor fusion
for vision and sonar based people tracking on a mobile
service robot,” in Proceedings of the International Workshop
on Dynamic Perception, 2002, pp. 315–320.
[12] D. Calisi, L. Iocchi, and R. Leone, “Person following through
appearance models and stereo vision using a mobile robot.,”
in VISAPP (Workshop on on Robot Vision), 2007, pp. 46–56.
[13] A. Ess, B. Leibe, and L. Van Gool, “Depth and appearance
for mobile scene analysis,” in Computer Vision, 2007. ICCV
2007. IEEE 11th International Conference on, 2007, pp. 1–8.
[14] A. Ess, B. Leibe, K. Schindler, and L. Van Gool, “A mobile
vision system for robust multi-person tracking,” in Computer
Vision and Pattern Recognition, 2008. CVPR 2008. IEEE
Conference on, 2008, pp. 1–8.
[15] Y. Salih and A. S. Malik, “Comparison of stochastic filtering
methods for 3D tracking,” Pattern Recognit., vol. 44, no. 10,
pp. 2711–2737, 2011.
[16] R. Muñoz-Salinas, E. Aguirre, and M. García-Silvente,
“People detection and tracking using stereo vision and color,”
Image Vis. Comput., vol. 25, no. 6, pp. 995–1007, 2007.
[17] J. Satake and J. Miura, “Robust stereo-based person detection
and tracking for a person following robot,” in ICRA
Workshop on People Detection and Tracking, 2009.
[18] J. H. Lee, T. Tsubouchi, K. Yamamoto, and S. Egawa,
“People Tracking Using a Robot in Motion with Laser Range
Finder,” in 2006 IEEE/RSJ International Conference on
Intelligent Robots and Systems, 2006, pp. 2936–2942.
[19] M. Lindstrom and J. O. Eklundh, “Detecting and tracking
moving objects from a mobile platform using a laser range
scanner,” in 2001 IEEE/RSJ International Conference on
Intelligent Robots and Systems, 2001. Proceedings, 2001,
vol. 3, pp. 1364–1369 vol.3.
[20] M. Kristou, A. Ohya, and S. Yuta, “Panoramic vision and lrf
sensor fusion based human identification and tracking for
autonomous luggage cart,” in Robot and Human Interactive
Communication, 2009. RO-MAN 2009. The 18th IEEE
International Symposium on, 2009, pp. 711–716.
[21] C. Schlegel, J. Illmann, H. Jaberg, M. Schuster, and R. Wörz,
“Vision Based Person Tracking with a Mobile Robot.,” in
BMVC, 1998, pp. 1–10.
[22] N.-Y. Ko, D.-J. Seo, and Y.-S. Moon, “A Method for Real
Time Target Following of a Mobile Robot Using Heading
and Distance Information,” J. Korean Inst. Intell. Syst., vol.
18, no. 5, pp. 624–631, Oct. 2008.
[23] K. S. Nair, A. B. Joseph, and J. I. Kuruvilla, “Design of a low
cost human following porter robot at airports.”
[24] V. Y. Skvortzov, H.-K. Lee, S. Bang, and Y. Lee,
“Application of Electronic Compass for Mobile Robot in an
Indoor Environment,” in 2007 IEEE International
Conference on Robotics and Automation, 2007, pp. 2963–
2970.
[25] “Slot Sensor,” http://www.nskelectronics.com/slot_sensor.html.
HCI for Real world ApplicationsHCI for Real world Applications
HCI for Real world ApplicationsIOSR Journals
 
A novel visual tracking scheme for unstructured indoor environments
A novel visual tracking scheme for unstructured indoor environmentsA novel visual tracking scheme for unstructured indoor environments
A novel visual tracking scheme for unstructured indoor environmentsIJECEIAES
 
A one decade survey of autonomous mobile robot systems
A one decade survey of autonomous mobile robot systems A one decade survey of autonomous mobile robot systems
A one decade survey of autonomous mobile robot systems IJECEIAES
 
IRJET - Automatic Gun Control using Motion Detection System
IRJET - Automatic Gun Control using Motion Detection SystemIRJET - Automatic Gun Control using Motion Detection System
IRJET - Automatic Gun Control using Motion Detection SystemIRJET Journal
 
Object Detection and Tracking AI Robot
Object Detection and Tracking AI RobotObject Detection and Tracking AI Robot
Object Detection and Tracking AI RobotIRJET Journal
 
Interactive full body motion capture using infrared sensor network
Interactive full body motion capture using infrared sensor networkInteractive full body motion capture using infrared sensor network
Interactive full body motion capture using infrared sensor networkijcga
 
Interactive Full-Body Motion Capture Using Infrared Sensor Network
Interactive Full-Body Motion Capture Using Infrared Sensor Network  Interactive Full-Body Motion Capture Using Infrared Sensor Network
Interactive Full-Body Motion Capture Using Infrared Sensor Network ijcga
 
All Terrain Offensive and Defensive Robot
All Terrain Offensive and Defensive RobotAll Terrain Offensive and Defensive Robot
All Terrain Offensive and Defensive Robotijtsrd
 
military surveillance robot
military surveillance robot  military surveillance robot
military surveillance robot KrishGupta94
 
Smartphone controlled spy_robot_with_video_transmi
Smartphone controlled spy_robot_with_video_transmiSmartphone controlled spy_robot_with_video_transmi
Smartphone controlled spy_robot_with_video_transmiMohammed Mudasser
 
A Robot Collision Avoidance Method Using Kinect and Global Vision
A Robot Collision Avoidance Method Using Kinect and Global VisionA Robot Collision Avoidance Method Using Kinect and Global Vision
A Robot Collision Avoidance Method Using Kinect and Global VisionTELKOMNIKA JOURNAL
 
Eye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIEye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIDr. Amarjeet Singh
 
Gesture Based Interface Using Motion and Image Comparison
Gesture Based Interface Using Motion and Image ComparisonGesture Based Interface Using Motion and Image Comparison
Gesture Based Interface Using Motion and Image Comparisonijait
 
SISR - Smart Indoor Surveillance Robot using IoT for day to day usage PPT.pptx
SISR - Smart Indoor Surveillance Robot using IoT for day to day usage PPT.pptxSISR - Smart Indoor Surveillance Robot using IoT for day to day usage PPT.pptx
SISR - Smart Indoor Surveillance Robot using IoT for day to day usage PPT.pptxSanjaySTippannavar1
 
Sanjaya: A Blind Assistance System
Sanjaya: A Blind Assistance SystemSanjaya: A Blind Assistance System
Sanjaya: A Blind Assistance SystemIRJET Journal
 
Robot operating system based autonomous navigation platform with human robot ...
Robot operating system based autonomous navigation platform with human robot ...Robot operating system based autonomous navigation platform with human robot ...
Robot operating system based autonomous navigation platform with human robot ...TELKOMNIKA JOURNAL
 

Similar to Human Following Robot Design (20)

Follow Me Robot Technology
Follow Me Robot TechnologyFollow Me Robot Technology
Follow Me Robot Technology
 
L01117074
L01117074L01117074
L01117074
 
HCI for Real world Applications
HCI for Real world ApplicationsHCI for Real world Applications
HCI for Real world Applications
 
A novel visual tracking scheme for unstructured indoor environments
A novel visual tracking scheme for unstructured indoor environmentsA novel visual tracking scheme for unstructured indoor environments
A novel visual tracking scheme for unstructured indoor environments
 
A one decade survey of autonomous mobile robot systems
A one decade survey of autonomous mobile robot systems A one decade survey of autonomous mobile robot systems
A one decade survey of autonomous mobile robot systems
 
IRJET - Automatic Gun Control using Motion Detection System
IRJET - Automatic Gun Control using Motion Detection SystemIRJET - Automatic Gun Control using Motion Detection System
IRJET - Automatic Gun Control using Motion Detection System
 
Object Detection and Tracking AI Robot
Object Detection and Tracking AI RobotObject Detection and Tracking AI Robot
Object Detection and Tracking AI Robot
 
Interactive full body motion capture using infrared sensor network
Interactive full body motion capture using infrared sensor networkInteractive full body motion capture using infrared sensor network
Interactive full body motion capture using infrared sensor network
 
Interactive Full-Body Motion Capture Using Infrared Sensor Network
Interactive Full-Body Motion Capture Using Infrared Sensor Network  Interactive Full-Body Motion Capture Using Infrared Sensor Network
Interactive Full-Body Motion Capture Using Infrared Sensor Network
 
All Terrain Offensive and Defensive Robot
All Terrain Offensive and Defensive RobotAll Terrain Offensive and Defensive Robot
All Terrain Offensive and Defensive Robot
 
A moving object tracked by a mobile robot
A moving object tracked by a mobile robotA moving object tracked by a mobile robot
A moving object tracked by a mobile robot
 
military surveillance robot
military surveillance robot  military surveillance robot
military surveillance robot
 
Smartphone controlled spy_robot_with_video_transmi
Smartphone controlled spy_robot_with_video_transmiSmartphone controlled spy_robot_with_video_transmi
Smartphone controlled spy_robot_with_video_transmi
 
A350111
A350111A350111
A350111
 
A Robot Collision Avoidance Method Using Kinect and Global Vision
A Robot Collision Avoidance Method Using Kinect and Global VisionA Robot Collision Avoidance Method Using Kinect and Global Vision
A Robot Collision Avoidance Method Using Kinect and Global Vision
 
Eye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIEye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AI
 
Gesture Based Interface Using Motion and Image Comparison
Gesture Based Interface Using Motion and Image ComparisonGesture Based Interface Using Motion and Image Comparison
Gesture Based Interface Using Motion and Image Comparison
 
SISR - Smart Indoor Surveillance Robot using IoT for day to day usage PPT.pptx
SISR - Smart Indoor Surveillance Robot using IoT for day to day usage PPT.pptxSISR - Smart Indoor Surveillance Robot using IoT for day to day usage PPT.pptx
SISR - Smart Indoor Surveillance Robot using IoT for day to day usage PPT.pptx
 
Sanjaya: A Blind Assistance System
Sanjaya: A Blind Assistance SystemSanjaya: A Blind Assistance System
Sanjaya: A Blind Assistance System
 
Robot operating system based autonomous navigation platform with human robot ...
Robot operating system based autonomous navigation platform with human robot ...Robot operating system based autonomous navigation platform with human robot ...
Robot operating system based autonomous navigation platform with human robot ...
 

More from QSC-Fabrication laboratory (18)

3D_Printing_Concrete_A_Review (1).pdf
3D_Printing_Concrete_A_Review (1).pdf3D_Printing_Concrete_A_Review (1).pdf
3D_Printing_Concrete_A_Review (1).pdf
 
A_Simple_Model_to_Compute_the_Characteristic_Param.pdf
A_Simple_Model_to_Compute_the_Characteristic_Param.pdfA_Simple_Model_to_Compute_the_Characteristic_Param.pdf
A_Simple_Model_to_Compute_the_Characteristic_Param.pdf
 
Pre install checklist
Pre install checklistPre install checklist
Pre install checklist
 
Final call-for-papers.pdf
Final call-for-papers.pdfFinal call-for-papers.pdf
Final call-for-papers.pdf
 
9 cnc (1)
9 cnc (1)9 cnc (1)
9 cnc (1)
 
Speedy300 flyer (1)
Speedy300 flyer (1)Speedy300 flyer (1)
Speedy300 flyer (1)
 
Capacitors
CapacitorsCapacitors
Capacitors
 
список элементов-для-таймера
список элементов-для-таймерасписок элементов-для-таймера
список элементов-для-таймера
 
Annex 0201
Annex 0201Annex 0201
Annex 0201
 
Time sheet
Time sheetTime sheet
Time sheet
 
67 sos3 series 3 winner announced english (3)
67 sos3 series 3 winner announced english (3)67 sos3 series 3 winner announced english (3)
67 sos3 series 3 winner announced english (3)
 
wireless power charging
wireless power chargingwireless power charging
wireless power charging
 
Eddy current labo
Eddy current laboEddy current labo
Eddy current labo
 
Batteries chargeur.fr
Batteries chargeur.frBatteries chargeur.fr
Batteries chargeur.fr
 
Flux leakagewireropes
Flux leakagewireropesFlux leakagewireropes
Flux leakagewireropes
 
Crane wire rope damage and inspection methods
Crane wire rope damage and inspection methodsCrane wire rope damage and inspection methods
Crane wire rope damage and inspection methods
 
Asnt houston
Asnt houstonAsnt houston
Asnt houston
 
Qatar foundation magzine
Qatar foundation magzineQatar foundation magzine
Qatar foundation magzine
 

Recently uploaded

High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escortsranjana rawat
 
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZTE
 
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineeringmalavadedarshan25
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVRajaP95
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Dr.Costas Sachpazis
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escortsranjana rawat
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxpranjaldaimarysona
 
Analog to Digital and Digital to Analog Converter
Analog to Digital and Digital to Analog ConverterAnalog to Digital and Digital to Analog Converter
Analog to Digital and Digital to Analog ConverterAbhinavSharma374939
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingrakeshbaidya232001
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Dr.Costas Sachpazis
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxJoão Esperancinha
 

Recently uploaded (20)

Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptxExploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
 
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
9953056974 Call Girls In South Ex, Escorts (Delhi) NCR.pdf
 
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
 
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
ZXCTN 5804 / ZTE PTN / ZTE POTN / ZTE 5804 PTN / ZTE POTN 5804 ( 100/200 GE Z...
 
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(PRIYA) Rajgurunagar Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineering
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCRCall Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
 
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
★ CALL US 9953330565 ( HOT Young Call Girls In Badarpur delhi NCR
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptx
 
Analog to Digital and Digital to Analog Converter
Analog to Digital and Digital to Analog ConverterAnalog to Digital and Digital to Analog Converter
Analog to Digital and Digital to Analog Converter
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writing
 
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
Sheet Pile Wall Design and Construction: A Practical Guide for Civil Engineer...
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
 

Student Research Paper Conference Vol-2, No-15, July 2015

Design and Development of Human Following Robot

1Muhammad Sarmad Hassan, 2Mafaz Wali Khan, 3Ali Fahim Khan
Department of Electrical Engineering, Institute of Space Technology, Pakistan
1sarmadhassan@hotmail.com, 2mafazwali01@yahoo.com, 3alifahimkhan@gmail.com

Abstract—For a robot that performs autonomously, communication between the person and the robot is the most important factor. Significant interest has been observed in the use of such technology, and this research makes a modest contribution to the development of such robots. A fully autonomous robot should not only complete the jobs expected of it but also establish a connection between itself and the person operating it. A great deal of research has been done on these kinds of robots, and a great deal of work remains. For a robot to communicate and interact with a person, it must be capable of following that particular person; it therefore needs the capacity to gather information from its surroundings while pursuing the required target. The primary goal of our work was to design and fabricate a robot that not only tracks the target but also moves towards it while tracking. To simplify the problem, a unique handmade tag was placed on the person the robot needs to follow. The main hindrance in this kind of work is that detection of the target is a sensitive task: the object has to be unique for the robot to recognize it and carry out the objective. The simple tag removes this uniqueness problem and makes the task fairly easy. A small camera records the video and the processor extracts the desired information from it. Protecting the robot from collision with the object is another problem that needs to be tackled, so a sensor is used for this purpose.
All the processing is carried out by the microprocessor, while the motors are driven by the controller.

Keywords—Human following, human tracking, visual imaging, human robot interaction, laser range scanners, image tag, envelope detection, service robots, depth image mapping, thermal conductivity.

I. INTRODUCTION

Robotic technology has advanced appreciably in the past couple of years. Such innovations were only a dream for some people not long ago, but in this fast-moving world there is now a need for a robot, such as a human following robot, that can interact and co-exist with people [1]. To perform this task accurately, the robot needs a mechanism that enables it to visualize the person and act accordingly [2] [3]. The robot must be intelligent enough to follow a person in crowded areas and vivid environments, both indoors and outdoors [4]. The image processing carried out to gather visual information about the surroundings is very important, and the following points should be noted while doing the processing:

- The lighting conditions should be stable and should not fluctuate.
- The colour ranges should be set properly for the environment in which the tracking is performed.
- The target should not be too far from the visual sensor, as distance matters a great deal.
- Colours that match the target should be avoided around the robot; otherwise the robot will get confused.

Typically, human following robots are equipped with several diverse combinations of sensors, i.e. light detection and ranging sensors, radio frequency identification (RFID) modules, laser range finders (LRF), infrared (IR) sensing modules, thermal imaging sensors, cameras, wireless transmitters/receivers, etc., for recognizing and locating the target. All the sensors and modules work in unison to detect and follow the target. The capability of a robot to track and follow a moving object can be used for several purposes:
- To help humans.
- To create ease for people.
- For defense purposes.

In this paper, we present a human following robot based on tag identification and detection using a camera. Intelligent tracking of the specified target is carried out using different sensors and modules [5], i.e. an ultrasonic sensor, a magnetometer, infrared sensors and a camera. An intelligent decision is made by the robot control unit based on the information obtained from these sensors and modules, finding and tracking the particular object while avoiding obstacles and without colliding with the target.

II. RELATED WORK

So far, a great deal of research has been done on robots that fall into the category of "assisting robots". People have used different logics and algorithms to implement their designs, and their primary focus has been on robots that follow a target. A laser sensor was used by Burgard in his tour-guide robot for human tracking [6]. An LRF was incorporated by D. Schulz to perform the 'following'; using this approach, they performed information linking for the detection [7]. Nicola and Husing used a technique for identifying different styles of movement with an LRF; this information was fused with information obtained from a camera [8]. Depth imaging was used by Songmin Jia to carry out the detection: the model of a person was determined from the depth images [9]. A particular style of clothing was used by Mehrez Kristou, who employed an omnidirectional camera and also incorporated an LRF in the design [10]. Research was conducted by Wilhelm focusing on the colour of a particular person's skin, combined with information from different sensors [11]. Other work has also been conducted in this regard: depth imaging was used by Calisi, and the target was pursued using a specially designed algorithm [12]. Ess and Leibe carried out similar work on object tracking and detection; the biggest advantage of their method was that their algorithm worked in complex environments as well [13] [14]. Stereo vision was also used by Y. Salih to perform the detection [15].
This method enabled him to pursue the required target in an effective manner. A combination of different sensors was used by R. Munoz to get information about the target to be tracked; in addition, he used stereo vision to obtain accurate information [16]. The sensor data combined with the information from the camera proved very helpful in carrying out the task [17]. Different algorithms continue to be developed by researchers for detection purposes: a laser was used in one study to identify the pattern of the moving legs [18] [19] [20], and a camera was used to detect a particular object or person [3] [21]. A very simple technique was also used in one study, in which distance sensors were mounted on both the robot and the person; the sensors emitted radio waves that were detected by the sensors on the person to be followed, and in this way the robot followed the required target [22].

III. SYSTEM COMPONENTS

Our system consists of a three-wheel robotic vehicle mounted with a separate microprocessor and control unit, along with different sensors and modules, i.e. an ultrasonic sensor, a magnetometer, infrared sensors and a camera. The camera height is vertically self-adjusting and is initially mounted on the robot at a height of 4 ft from the ground to enhance visual capability and effectiveness. The robotic vehicle is controlled by the user by means of an identification tag.

IV. METHODOLOGY

A systematic research methodology was adopted, keeping in mind the ultimate goal of a fully functional, autonomous human following robot. A decentralized top-down approach is used for this project. The project is divided into five modules, each independent of the others. The phases were carried out step by step, starting from basic sensor testing and proceeding to obstacle avoidance, object detection, object tracking and data transmission.
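The data-transmission phase relies on the serial link from the processing unit (a Raspberry Pi) to the control unit (an Arduino), over which the tag's centre coordinates are sent. A minimal sketch of the sending side is given below; the "x,y\n" message format, the port name and the baud rate are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch: the "x,y\n" message format, port name and baud rate
# are assumptions for illustration, not details given in the paper.

def format_centroid(x, y):
    """Encode the tag's centre coordinates as a one-line ASCII message."""
    return f"{x},{y}\n".encode("ascii")

def send_centroid(link, x, y):
    """Write the encoded coordinates to an already-open serial link."""
    link.write(format_centroid(x, y))

# On the Raspberry Pi side this might be used as (hypothetical wiring):
#   import serial                                  # pyserial
#   link = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
#   send_centroid(link, 320, 240)                  # centre of a 640x480 frame
```

Keeping the message a single newline-terminated line lets the Arduino side read it with a simple line-oriented parser.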
Due to the decentralized approach, all modules and sensors act independently. The data obtained by the different sensors and modules is analyzed collectively, and an intelligent decision is made on the basis of this information, instructing the robot to move in a particular direction. Two separate units are used: a microprocessor and a controller. The processing is carried out by the microprocessor, while the information obtained by the sensors is handled by a controller, i.e. an Arduino board. Serial communication between the microprocessor and the controller is established to exchange the visual-sensing information. This approach is the most suitable because a fault in any one module does not affect the entire system, providing the best possible results while maintaining accuracy. Human tracking, obstacle avoidance, maintaining a specific distance from the object, and establishing a communication link between the microprocessor and the controller are the main aspects of this project.

V. IMPLEMENTATION

The implementation of a human-interactive robot seems an easy task, but it comes with problems, for example the detection of a particular tag. Similarly, the tag needs to be unique so that it does not merge into the colours of a vivid environment, so we designed a novel algorithm to overcome this problem. The project has several facets, such as the mechanical structure, the circuit design, tag detection and the intelligent tracking system. The implementation is as follows.

A. Design of a Custom Tag

To make the target distinctive and unambiguous, a tag was specifically designed with four different colours in a box arrangement. The whole work is based on the fact that the target has to be specific to the robot: the robot should detect only that target and no other. These colours enable the robot to detect the person by tracking the tag, which is placed on the person to be pursued. Once the tag is put on the person, the robot detects the colours and recognizes the target through this detection.

Fig. 1. Custom tag

B. Design of the Mechanical Structure

The mechanical structure of the robot comprises a two-layer base with a two-wheel differential drive system and a free wheel. It is designed keeping in view that the camera has to be mounted at a certain height above the ground. The height of the camera is adjustable according to the height of the person so that better visual information can be obtained; initially the camera height is set to 4 ft. The software design of the mechanical structure is shown below.

Fig. 2. Software design of the mechanical structure
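The detection idea behind the custom tag, formalised in the image-processing algorithm of Section V — threshold each of the four colours, require all four patches to be present, then take the centroid of their combined region — can be sketched as follows. This is a NumPy illustration with made-up colour ranges; it is not the paper's actual OpenCV implementation.

```python
# Illustrative sketch of the tag-detection idea (the colour ranges, array
# layout and helper names are assumptions; the paper's OpenCV code is not
# reproduced here).
import numpy as np

def colour_mask(hsv, lo, hi):
    """Boolean mask of pixels whose channel values fall inside [lo, hi]."""
    return np.all((hsv >= lo) & (hsv <= hi), axis=-1)

def tag_centroid(hsv, ranges):
    """Return (cx, cy) of the combined four-colour region, or None.

    The tag counts as detected only when every one of the four colour
    patches is present in the frame, mirroring the paper's requirement
    that all four colours appear together.
    """
    masks = [colour_mask(hsv, lo, hi) for lo, hi in ranges]
    if any(not m.any() for m in masks):      # a patch is missing: no tag
        return None
    combined = np.logical_or.reduce(masks)   # union of the four patches
    ys, xs = np.nonzero(combined)
    return float(xs.mean()), float(ys.mean())
```

In the real system the frame would come from the camera (e.g. converted to HSV with OpenCV) and the resulting centre coordinates would be handed to the control unit.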
Design of Relay Based Motor Driver Module: A relay-based motor driver module is designed to control the differential drive of the robotic platform. The advantage of this relay-based module is that it can easily drive high-torque motors that require large fluxes, and hence it can easily handle heavy loads. D. Image Processing Algorithm: A novel algorithm that processes the real-time video is used to detect and follow the unique tag. We used a computer vision camera for recognizing the tag on the back of the person and the OpenCV Python platform to develop this algorithm. In this algorithm, a tag having four colors is used, and the algorithm is designed so that the tag is detected only when the four colors are all together and in that particular order. Once the tag is detected, the centroid of the common region of the four colors is found to get the center coordinates of the tag. These center coordinates are then serially transmitted to the control unit for further processing, where an intelligent decision is made by fusing them with the information obtained from the other sensors and modules. VI. SYSTEM DESIGN The system design consists of separate processing and control units. The processing unit makes use only of a camera and is linked with the control unit to serially transmit the visual information after the bulk of the processing. The control unit is serially linked with the processor and makes use of several sensors and modules, i.e. an ultrasonic sensor, a magnetometer and infrared sensors. These sensors and the camera work in unison with each other and help the robot in its operation and to navigate its
path by avoiding the obstacles and maintaining a specific distance from the object. The decision is made on the basis of the information obtained from all the above sensors. The generalized system design incorporating the different sensors and modules is shown below. Fig. 3. System Design Looking at the working of the above system, the first phase is the detection of the tag by means of a camera, with the substantial processing carried out in the processing unit; the processor we used is a Raspberry Pi single-board computer. After the detection of the tag, the next phase is to establish serial communication between the processor and the control unit; we used an Arduino as the control unit. In this phase the center-point coordinates of the tag are serially transmitted to the Arduino for further processing. The next phase is to interface the modules and necessary sensors with the control unit. For this purpose, we used an ultrasonic sensor, a magnetometer and IR sensors for the proper functioning of the robot. The ultrasonic sensor is used for obstacle avoidance and to maintain a specific distance from the object; it works accurately within a range of 4 meters. Ultrasonic sensors operate by measuring the time difference between the transmitted pulse and its echo [23]. Fig. 4. Ultrasonic Sensor Principle This ultrasonic sensor is placed at the top of the robot along with the camera module to maintain accuracy in measuring the distance between the robot and the target object. The flow chart for maintaining a specific distance from the target is shown below. Fig. 5. Flow Chart of Maintaining Specific Distance After the ultrasonic sensor we interfaced the magnetometer to get the orientation of the robot in x-y-z coordinates. This module determines the orientation of the robot and gives its heading direction, which is used to determine the tilt of the robot from its original position [24].
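The heading from the magnetometer can be turned into a steering correction by comparing it with the heading recorded before a detour. The following is a minimal sketch under our own naming and degree conventions (not taken from the paper); the wrap-around arithmetic is the standard way to keep the turn in the range -180 to 180 degrees.

```python
def heading_error(current_deg, target_deg):
    """Smallest signed turn, in degrees (-180..180], that brings the
    current magnetometer heading onto the stored target heading."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0
```

For example, moving from a heading of 350 degrees to 10 degrees is reported as a +20 degree turn rather than -340, which is what the control unit needs to steer back onto its track.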
On the basis of the information obtained from this module, the control unit determines how much change in direction is required to get back on track after avoiding the obstacle. After interfacing the above sensors, the next most important part of this system design is to interface the wheel encoders to calculate the distance travelled by the robot, eliminating any further error in the robotic movement due to displacement. For this purpose we attached two slot sensors over the encoder discs, right beside the wheels. The slot sensor has an IR transmitter and a photodiode mounted on it, facing each other. The light emitted by the IR LED is blocked by the alternating slots
of the encoder disc. This causes the logic level of the photodiode to change, which is detected by the controller [25]. Fig. 6. IR Slot Sensor The distance is obtained from the number of counts recorded in the memory of the controller: the count gives the fraction of a wheel revolution completed, which multiplied by the wheel circumference gives the distance travelled. The final phase consists of fusing all the information obtained from the sensors and modules in the control unit. The control unit then makes an intelligent decision to change the direction of the robot, get back on its track and follow the target carrying the tag, on the basis of the information obtained from all the sensors and modules, i.e. the serially received coordinates from the processor, the distance information from the ultrasonic sensor, the heading direction from the magnetometer, and the distance calculation from the IR sensors. Fig. 7. Human Following Robotic System. The system is designed in such a way that it intelligently makes use of the information obtained from the different sensors and modules; different sensors are incorporated to make it able to respond to diverse situations. VII. EXPERIMENTS AND RESULTS Different experiments were conducted to test the performance of the human following robot; each experiment took about 10 to 15 minutes. On the basis of the results obtained from these tests, we made the necessary changes in the processing and control algorithms. The first test was performed on the ultrasonic sensor, and it was noted that the sensor worked accurately within a range of 4 meters. We then performed a test to check whether the robot maintains a specific distance from the target object. Initially we set the stopping distance of the robot to 8 inches, and observed that the robot collided with the object as the distance between them approached 8 inches.
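The encoder count-to-distance relation described above can be sketched as follows; the function and parameter names are illustrative, assuming the controller records one count per encoder slot and knows the slot count per wheel revolution.

```python
import math

def encoder_distance_cm(counts, counts_per_rev, wheel_diameter_cm):
    """Distance travelled by one wheel: the recorded count as a fraction
    of a full revolution, times the wheel circumference (pi * diameter)."""
    return (counts / counts_per_rev) * math.pi * wheel_diameter_cm
```

With a 10 cm wheel and 20 slots per disc, 20 counts correspond to one full revolution, i.e. about 31.4 cm of travel.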
The problem was that the stopping distance was too small and the robot, because of the load on board, was not stopping quickly enough. So we increased the distance to 12 inches and verified the routine again. The next test was performed on the magnetometer module. This module gave us the heading direction of the robot with respect to a reference, but we observed an offset error in that heading. On investigation we found that this was due to the wrong placement of the magnetometer module: it was interfering with the magnetic field of the electronic components, which caused the offset error. So we changed its placement, setting it up on the bottom layer at the center of the robotic structure, after which we obtained the heading in degrees without any offset error. The next experiment was to test the detection of the tag. We observed that in certain lighting conditions the tag was not detected properly, so we adjusted the hue, saturation and value thresholds of all four colors, since color thresholding in HSV varies with the lighting conditions. After changing the threshold values we observed that the processor detected the tag properly. The results are shown below.
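The idea behind HSV thresholding can be illustrated with a small standard-library sketch. The paper's implementation runs OpenCV (the analogous call is cv2.inRange) over whole camera frames; the helper names, threshold defaults and per-pixel formulation below are our own simplifications for illustration.

```python
import colorsys

def in_hsv_range(rgb, hue_range, s_min=0.4, v_min=0.3):
    """True if an RGB pixel falls inside a hue band with enough
    saturation and brightness.  Separating hue from value is what makes
    the color test tolerant of lighting changes."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    h_lo, h_hi = hue_range
    return h_lo <= h <= h_hi and s >= s_min and v >= v_min

def centroid(pixels):
    """Center coordinates of the pixels detected for one tag color."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

A bright red pixel passes a narrow hue band near zero while a green pixel does not, and the centroid of the surviving pixels gives the center coordinates that are sent to the control unit.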
Fig. 8. Tag Detection We then checked the serial communication between the Arduino and the Raspberry Pi. We observed that when the tag was not in front of the camera the processor transmitted a null value for both coordinates, and actual values when the tag was in front of the camera; hence the processor was correctly transmitting the coordinates to the control unit. The results are shown below. Fig. 9. Serial Communication between Arduino and Raspberry Pi The final test was performed on the differential drive system to check whether the robot intelligently follows the person. The results produced were very satisfying: the robot followed the person wherever they went, and we only adjusted its speed at the end. Hence the objective of implementing good human-robot interaction was achieved by fast processing of the visual input combined with proper handling of the information from the sensors. Fig. 10. Human Follower Robot VIII. APPLICATIONS Looking deeply into our surroundings, we can see that there is indeed a need for a robot that can assist and serve humans. Such a robot can be used for many purposes, and with a few changes it can act as a human companion as well. Some other applications of this robot are:  Assisting in carrying loads for people working in hospitals, libraries, airports, etc.  Serving people at shopping centers or in public areas.  Assisting elderly people, special children and babies.  Following a particular vehicle. IX. FUTURE WORK There are many interesting applications of this research in different fields, whether military or medical. Wireless communication functionality can be added to the robot to make it more versatile and to control it from a long distance. This capability could also be used for military purposes.
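The serial exchange verified earlier (coordinate values when the tag is in view, null values for both coordinates otherwise) can be sketched as a minimal line protocol. The comma-separated text format and the "0,0" null sentinel are assumptions for illustration; the actual wire format between the Raspberry Pi and the Arduino is not specified in the paper.

```python
def encode_center(center):
    """Processor side: one line per frame, 'x,y' when the tag is
    visible, a null sentinel when it is not (assumed format)."""
    if center is None:
        return "0,0\n"
    return "{},{}\n".format(center[0], center[1])

def decode_center(line):
    """Control-unit side: parse a received line back into coordinates;
    the null sentinel maps to 'no tag in view'."""
    x, y = (int(part) for part in line.strip().split(","))
    return None if (x, y) == (0, 0) else (x, y)
```

A line-based text protocol like this is easy to inspect on a serial monitor, which is how the behavior in Fig. 9 was verified.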
By mounting a real-time video recorder on top of the camera, we can monitor the surroundings while sitting in our rooms. We can also make some modifications to the
algorithm and the structure as well, to fit the robot for other purposes, e.g. as a vehicle follower. Similarly, it can assist the public in shopping malls: there it can act as a luggage carrier, so there is no need to carry or pull heavy weights, as the robot using this algorithm will automatically follow the person. X. CONCLUSION A successful implementation of a person-following robot is illustrated in this research. The robot not only has detection capability but also tracking and following abilities. The tracking is performed on the tag, and the human is followed on the basis of that detection. It was also kept in mind that the 'following' capability of the robot should be as efficient as possible. Tests were performed under different conditions to pinpoint the mistakes in the algorithm and correct them. The different sensors integrated with the robot added a further advantage. XI. REFERENCES [1] K. Morioka, J.-H. Lee, and H. Hashimoto, "Human-following mobile robot in a distributed intelligent sensor network," IEEE Trans. Ind. Electron., vol. 51, no. 1, pp. 229–237, Feb. 2004. [2] Y. Matsumoto and A. Zelinsky, "Real-time face tracking system for human-robot interaction," in 1999 IEEE International Conference on Systems, Man, and Cybernetics (SMC '99), 1999, vol. 2, pp. 830–835. [3] T. Yoshimi, M. Nishiyama, T. Sonoura, H. Nakamoto, S. Tokura, H. Sato, F. Ozaki, N. Matsuhira, and H. Mizoguchi, "Development of a Person Following Robot with Vision Based Target Detection," in 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006, pp. 5286–5291. [4] H. Takemura, N. Zentaro, and H. Mizoguchi, "Development of vision based person following module for mobile robots in/out door environment," in 2009 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2009, pp. 1675–1680. [5] N. Bellotto and H. 
Hu, “Multisensor integration for human- robot interaction,” IEEE J. Intell. Cybern. Syst., vol. 1, no. 1, p. 1, 2005. [6] W. Burgard, A. B. Cremers, D. Fox, D. Hähnel, G. Lakemeyer, D. Schulz, W. Steiner, and S. Thrun, “The interactive museum tour-guide robot,” in AAAI/IAAI, 1998, pp. 11–18. [7] M. Scheutz, J. McRaven, and G. Cserey, “Fast, reliable, adaptive, bimodal people tracking for indoor environments,” in Intelligent Robots and Systems, 2004.(IROS 2004). Proceedings. 2004 IEEE/RSJ International Conference on, 2004, vol. 2, pp. 1347–1352. [8] N. Bellotto and H. Hu, “People tracking and identification with a mobile robot,” in Mechatronics and Automation, 2007. ICMA 2007. International Conference on, 2007, pp. 3565– 3570. [9] S. Jia, L. Zhao, and X. Li, “Robustness improvement of human detecting and tracking for mobile robot,” in Mechatronics and Automation (ICMA), 2012 International Conference on, 2012, pp. 1904–1909. [10] M. Kristou, A. Ohya, and S. Yuta, “Target person identification and following based on omnidirectional camera and LRF data fusion,” in RO-MAN, 2011 IEEE, 2011, pp. 419–424. [11] T. Wilhelm, H. J. Böhme, and H. M. Gross, “Sensor fusion for vision and sonar based people tracking on a mobile service robot,” in Proceedings of the International Workshop on Dynamic Perception, 2002, pp. 315–320. [12] D. Calisi, L. Iocchi, and R. Leone, “Person following through appearance models and stereo vision using a mobile robot.,” in VISAPP (Workshop on on Robot Vision), 2007, pp. 46–56. [13] A. Ess, B. Leibe, and L. Van Gool, “Depth and appearance for mobile scene analysis,” in Computer Vision, 2007. ICCV 2007. IEEE 11th International Conference on, 2007, pp. 1–8. [14] A. Ess, B. Leibe, K. Schindler, and L. Van Gool, “A mobile vision system for robust multi-person tracking,” in Computer Vision and Pattern Recognition, 2008. CVPR 2008. IEEE Conference on, 2008, pp. 1–8. [15] Y. Salih and A. S. 
Malik, “Comparison of stochastic filtering methods for 3D tracking,” Pattern Recognit., vol. 44, no. 10, pp. 2711–2737, 2011. [16] R. Muñoz-Salinas, E. Aguirre, and M. García-Silvente, “People detection and tracking using stereo vision and color,” Image Vis. Comput., vol. 25, no. 6, pp. 995–1007, 2007. [17] J. Satake and J. Miura, “Robust stereo-based person detection and tracking for a person following robot,” in ICRA Workshop on People Detection and Tracking, 2009. [18] J. H. Lee, T. Tsubouchi, K. Yamamoto, and S. Egawa, “People Tracking Using a Robot in Motion with Laser Range Finder,” in 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006, pp. 2936–2942. [19] M. Lindstrom and J. O. Eklundh, “Detecting and tracking moving objects from a mobile platform using a laser range scanner,” in 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2001. Proceedings, 2001, vol. 3, pp. 1364–1369 vol.3. [20] M. Kristou, A. Ohya, and S. Yuta, “Panoramic vision and lrf sensor fusion based human identification and tracking for autonomous luggage cart,” in Robot and Human Interactive Communication, 2009. RO-MAN 2009. The 18th IEEE
International Symposium on, 2009, pp. 711–716. [21] C. Schlegel, J. Illmann, H. Jaberg, M. Schuster, and R. Wörz, "Vision Based Person Tracking with a Mobile Robot," in BMVC, 1998, pp. 1–10. [22] N.-Y. Ko, D.-J. Seo, and Y.-S. Moon, "A Method for Real Time Target Following of a Mobile Robot Using Heading and Distance Information," J. Korean Inst. Intell. Syst., vol. 18, no. 5, pp. 624–631, Oct. 2008. [23] K. S. Nair, A. B. Joseph, and J. I. Kuruvilla, "Design of a low cost human following porter robot at airports." [24] V. Y. Skvortzov, H.-K. Lee, S. Bang, and Y. Lee, "Application of Electronic Compass for Mobile Robot in an Indoor Environment," in 2007 IEEE International Conference on Robotics and Automation, 2007, pp. 2963–2970. [25] "Slot Sensor," http://www.nskelectronics.com/slot_sensor.html.