Journal of Physics: Conference Series
PAPER • OPEN ACCESS
Social Service Robot using Gesture recognition
technique
To cite this article: D. Jessintha et al 2023 J. Phys.: Conf. Ser. 2466 012020
Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution
of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
Published under licence by IOP Publishing Ltd
4th National Conference on Communication Systems (NCOCS 2022)
Journal of Physics: Conference Series 2466(2023) 012020
IOP Publishing
doi:10.1088/1742-6596/2466/1/012020
Social Service Robot using Gesture recognition technique
D. Jessintha¹, P. Praveen Kumar², S. Jaisiva³, T. Ananth Kumar⁴*, Christo Ananth⁵
¹Department of Electronics and Communication Engineering, Easwari Engineering College, Chennai, India
²Department of Information Technology, Sri Manakula Vinayagar Engineering College, Puducherry, India
³Department of EEE, M. Kumarasamy College of Engineering, Karur, Tamil Nadu, India
⁴Department of Computer Science and Engineering, IFET College of Engineering, Tamil Nadu, India
⁵Department of Natural and Exact Sciences, Samarkand State University, Uzbekistan
*tananthkumar@ifet.ac.in
Abstract. A robot is a machine that can automatically carry out a task or a series of tasks based on its programming and environment. Robots are artificially built machines or devices that can perform activities with great accuracy and precision while minimizing time constraints. Service robots are technologically advanced machines deployed to carry out and maintain certain activities, and research findings show that serving robots are now being deployed worldwide. Social robotics is one such field that centers on interaction between humans and an artificially built machine: these man-made machines interact with humans and can also understand social terms and words. Modernization has brought changes in design and mechanisms owing to this lasting growth in technology and innovation. Food industries are therefore also adapting to new developments in automation to reduce human workload and increase the quality of service. Deploying a robot in the food industry to aid deaf and mute people who face social constraints has been a growing challenge for engineers over the last few decades. Moreover, a contactless, speedy service system that accomplishes its task with utmost precision and reduced complexity is a feat yet to be perfected. The proposed system achieves preservation of personal hygiene, better quality of service, and reduced labour costs.
1. Introduction
Social service robots, in contrast to their industrial counterparts, have a definite role to play. Frequent enhancements in technology and innovation have led to lasting development in robotic systems[1][2]. Intelligent robotic systems deploy technologies such as gesture and voice recognition to reduce human workload while maintaining the quality of the service rendered. The robot is summoned using a button embedded in an RF transmitter module. After reception of the signal, a color sensor attached to the robot uses the line-following technique to move along a pre-determined path to its destination[3-5]. A gesture recognition system programmed on a Raspberry Pi 4 captures the gesture shown by the user and serves the desired item, such as water, tea, coffee, or another beverage, without any physical contact. On receiving the relevant inputs from the user, the robot arm, designed using MG996r and SG90 servos, can grab the
cup and place it beneath the dispenser. The custom-built dispenser, using an L298N driver and a submersible water pump, transfers the requested beverage to the cup, and the robotic arm then serves it on a tray. The methodologies used are described in Section 3, and the hardware modules used for this design are discussed in Section 4. This method removes the effort such users would otherwise need to obtain their order without any hassle. Robots help reduce the physical stress on employees and maintain a cleaner ambiance without compromising food quality. A social service robot should aid people in diverse settings, such as a shop, restaurant, hospital, or even the home. The sole purpose of this robot is to increase robot-human interaction through sophisticated technological advancements. This paper presents the basic operation, which can be applied in various places for specific tasks[5-9].
2. Related Works
Due to technological advances, robots are becoming more common, and soon we may see intelligent robots helping people in need. Human-robot interaction research is becoming more important as it integrates hand-sign recognition components[10]. Hand gesture recognition may make human-robot communication more natural, helping humans and robots collaborate to increase application efficiency and avoid problems[11]. De Smedt et al. classified hand gestures using an SVM with skeletal and depth data[12]. Nunez et al. combined convolutional neural networks with long short-term memory (LSTM) networks for skeleton-based hand gesture recognition, using Kinect sensor data for image acquisition and segmentation[13].
According to Tang et al. [14], hand gestures can be tracked in real-time using a recursive connected
component algorithm and the 3D geodesic distance of pixelated hand skeletons. Praveen Kumar et al. studied nonverbal communication, using an R-CNN to improve avocado harvesting in a simulated workspace[15]; this improved efficiency. A robot located workers and determined whether they needed help by recognizing human activity, hand gestures, or flags, using CNN and LSTM (long short-term memory) networks, and the harvester learned gestures from the workers' hands. Hand gesture recognition results may vary with image color and texture: skin color differs between people and countries, and lighting affects both. Shape-based features can also recognize hand gestures, and we took that approach, exploiting the fact that thumb and finger lengths are normally about the same on both hands. The frame rate of hand-shape-based gesture recognition is comparable to most existing systems, and the number and accuracy of recognized gestures were among the best. Robots must understand what humans are
saying to collaborate effectively with them. Humans and robots must communicate using natural
gestures in HRC manufacturing[16]. The hand is differentiated from its surroundings using a skin
tone. Principal component analysis categorized all eight static gestures. Pisharady et al. use a restricted Coulomb energy (RCE) neural network to separate the hand from the image; a second RCE network, trained on the hand-to-arm distance and the number of spread fingers, recognizes static hand gestures with 95% accuracy over a lexicon of eight gestures[17]. A color camera
captures real-time images in full color. In Otsu segmentation, the Y-Cb-Cr color space distinguishes
the moving hand from the constant background. The k-curvature algorithm [18] determines an image's
high and low points. A gesture's peak-to-valley ratio determines its group. Classification reaches 95.2% accuracy, but the system recognizes only six gestures, and this small lexicon limits the achievable throughput. The method assumes the hand is the part of the scene most accessible to the camera. The hand's orientation can be
determined using a vector from the center to its farthest point. The robot's movement is controlled by
the hand's orientation, while its straight-line velocity is determined by the hand's distance within the image. El Makrini et al. found that hand shape can control a robot: instead of high-level depth and color features, all intensity values in a box around the hand are used[19], which replaces explicit feature extraction.
The average neighborhood margin maximization (ANMM) algorithm reduces the dimensionality of the feature space. Four hand gestures are classified using the nearest neighbor classifier. Wadhawan et al.
[20] classify six hand gestures using a geometric property. In color images, hand skin color is evaluated to identify the hand, and the distance between the hand's center and each outline point is calculated. A gesture's number of peaks and valleys determines its category. Zhang et al. created a system that
recognizes hand gestures online in real time[21]. After the person's body is segmented out, the chamfer distance aligns a hand template with the image's edges, and the hand is then fitted with a realistic skin-color model. Tracking the hand's center across multiple frames yields the final feature vector, which encodes hand location and motion; hidden Markov models learn the hand movements. Owing to color cameras' high frame rates, most current techniques use video sequences. Human-robot interaction (HRI) has become a focus of research in computer vision and robotics due to the broad range of applications in the field of human-computer interaction (HCI).
3. Working Prototype
The robot, which comprises several modules, is initially stationed at the origin of the area in which it is to be used. The following techniques demonstrate the basic idea of this project.
Figure 1 shows the block diagram of the proposed system.
3.1 Transmission and Reception
A transmitter-receiver pair is enclosed in the base of the robot and operates at a frequency of 27 MHz. Radio-frequency links transfer signals from the transmitter to the receiver by adjusting the current and voltage parameters that set the oscillation rate, and can carry signals from about 20 kHz to around 200 GHz. Here, when the robot is summoned from the user's side, an RF signal is transmitted from the transmitter situated at the table to the receiver housed within the base of the robot. The receiver module is connected to the microcontroller, which initiates the robot's movement; the base then moves to the table where the user is seated, guided by the TCS3200 color sensor module.
Figure 1. Block diagram of the Proposed system
3.2. Line follower with color sensor
After successful reception of the signal, the robot must reach the table from which it was summoned. DC motors powered by the motor driver make forward and backward movement possible, depending on the commands issued by the microcontroller. Therefore, the base of the robot has to plan its path, using a color sensor to check the path that the robot takes. The
color sensor module is calibrated to detect two colors, Red and Blue. The path to the table is laid using
these two colors. Applying the principles of the line follower technique, the robot has to sense the
right color and reach the table where the person is seated.
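The decision logic described above can be sketched as a pure function (our own illustration, not the prototype's actual code): the TCS3200 reports one pulse-count reading per color filter, and the robot keeps moving while it still sees its assigned line color. The 1.3 ratio threshold is a hypothetical calibration value.

```python
# Hypothetical sketch of the line-following decision: compare the TCS3200
# pulse counts under the red and blue filters and steer accordingly.

def classify_color(red_count, blue_count, threshold=1.3):
    """Classify a TCS3200 reading as 'red', 'blue', or 'floor'.

    Pulse counts rise with reflected intensity under the matching filter;
    the 1.3 ratio threshold is an illustrative calibration value.
    """
    if red_count > threshold * blue_count:
        return "red"
    if blue_count > threshold * red_count:
        return "blue"
    return "floor"  # neither line color dominates the reading

def steer(target_line, reading):
    """Motor command: keep moving while on the target line, else sweep
    sideways to reacquire it."""
    return "forward" if reading == target_line else "search"
```

For example, `classify_color(120, 40)` yields `"red"`, so a robot assigned the red line would continue forward.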
3.3. Gesture recognition
A camera mounted on the head of the robot captures a frame containing a real-time image of the hand. The threshold of the frame is adjusted manually from its default value according to the area's lighting conditions, and a blur is applied to suppress irrelevant regions. The frame is then converted to grayscale. Finally, the image is interpreted by the microcomputer, which maps the gesture shown in the frame to a value stored in a list.
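The grayscale conversion and thresholding steps above can be illustrated in miniature. The paper's pipeline presumably runs OpenCV on the Raspberry Pi; this pure-Python sketch of ours shows the same two operations on a tiny RGB frame, using the common luminance weights.

```python
# Illustrative sketch of the frame pre-processing: grayscale conversion
# followed by a manual binary threshold (hand pixels -> 1, background -> 0).

def to_grayscale(frame):
    """Convert an RGB frame (nested lists of (r, g, b) tuples) to grayscale
    using the common luminance weights 0.299/0.587/0.114."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in frame]

def binarize(gray, threshold=127):
    """Apply the manually adjusted threshold to get a binary hand mask."""
    return [[1 if px > threshold else 0 for px in row] for row in gray]

# A 2x2 toy frame: bright hand pixels on the left, dark background right.
frame = [[(255, 255, 255), (10, 10, 10)],
         [(200, 180, 160), (0, 0, 0)]]
mask = binarize(to_grayscale(frame))
```

Here `mask` comes out as `[[1, 0], [1, 0]]`; on the real system the binary mask would then feed the contour analysis that produces the finger count.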
3.4. I2C interfacing
After the correct count is obtained, the value is passed to the microcontroller. The microcontroller is powered by a 5 V rechargeable lithium-polymer battery, whereas the Raspberry Pi 4 uses 3.3 V logic on its GPIO bus. An essential difference between the Arduino and the Pi is the number of I/O ports: the Arduino provides several analog and digital pins, which can also handle interrupts and timing circuits. To exploit both boards fully, they are interfaced so that they can communicate with each other, using library functions available in Python and the Arduino IDE.
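The Pi-to-Arduino hand-off might look like the sketch below. On real hardware the Pi side would typically use the smbus library (`bus.write_byte(addr, value)`); here a mock bus stands in so the packing logic can be shown without hardware, and the slave address 0x08 is our illustrative choice, not taken from the paper.

```python
# Hedged sketch of sending the finger count over I2C; MockBus replaces
# smbus.SMBus(1) so the logic is testable without a physical bus.

ARDUINO_ADDR = 0x08  # hypothetical I2C slave address of the Arduino

class MockBus:
    """Stand-in for smbus.SMBus(1); records writes instead of driving pins."""
    def __init__(self):
        self.log = []

    def write_byte(self, addr, value):
        self.log.append((addr, value))

def send_finger_count(bus, count):
    """Clamp the recognized finger count to the 0-5 gesture range and
    send it as a single byte to the Arduino."""
    byte = max(0, min(count, 5))
    bus.write_byte(ARDUINO_ADDR, byte)
    return byte

bus = MockBus()
send_finger_count(bus, 3)
```

Swapping `MockBus` for a real `smbus.SMBus(1)` instance would leave `send_finger_count` unchanged, which is the usual way to keep such glue code testable.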
3.5. Dispenser
The dispenser is built using a 12 V DC pump, which offers more rotations per minute than a 9 V pump. Being a submersible water pump, it minimizes expense and is easy to replace if it fails. The pipe wound around the DC motor must be checked and analyzed before use; its dimensions must be chosen according to the liquid that flows through it and the pressure needed to push the fluid from the storage unit to the cup. The dispenser can serve up to two different drinks. The pipe used in the prototype is 5 mm in diameter and 30 cm in length. By adjusting the flow rate, the amount of liquid poured into the cup can be increased or decreased; this prototype fills 100 ml into the cup in about 3 seconds.
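The quoted figures (100 ml in about 3 seconds) imply a flow rate near 33 ml/s, so the pump-on time scales linearly with the requested volume. This helper is our own back-of-the-envelope illustration, not code from the paper.

```python
# Convert a requested volume into a pump-on duration, assuming the
# roughly 33 ml/s flow rate implied by "100 ml in about 3 seconds".

FLOW_RATE_ML_PER_S = 100 / 3  # ~33.3 ml/s, from the figures quoted above

def pump_seconds(volume_ml):
    """Seconds to keep the submersible pump energized for this volume."""
    if volume_ml <= 0:
        raise ValueError("volume must be positive")
    return volume_ml / FLOW_RATE_ML_PER_S
```

So a half cup of 50 ml would need the pump on for about 1.5 seconds under this assumption.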
3.6. Robotic arm
A robotic arm is placed to the right of the dispenser and positions the cup below the dispenser at a fixed coordinate. After the cup is filled, the arm serves it on top of a tray. The arm is controlled by the microcontroller and actuated by a metal-gear servo motor; built with two degrees of freedom, it can lift comparatively heavy objects, so the user can place heavier cups between its claws. A cup holder can be attached to this prototype if required; until then, the cup is placed in the robot's gripper to receive the order. This eliminates any doubt the user may have about the cleanliness of the cup.
4. Module Description
4.1. Module 1: Raspberry PI 4
The proposed prototype is shown in figure 2. The Raspberry Pi 4 is the updated version of the Pi series of microcomputers released a few years ago. It is a 64-bit, quad-core processor with wireless LAN, Ethernet, and Bluetooth connectivity. These cores operate at a
synchronized speed of 1.5 GHz, in one of the smallest form factors available for a computer. In this prototype, the Pi coordinates the working of each module. It transfers data to and from the microcontroller over a two-way link termed I2C (Inter-Integrated Circuit) communication, and it also processes the finger count captured by the camera, schedules tasks for each module connected to the microcontroller, and handles interrupts. However, owing to its small form factor, there are some notable drawbacks, such as time delays and thermal latency issues. Moreover, the Pi can also be used as a microcontroller by connecting an external module.
Figure 2. Proposed Prototype
4.2. Module 2: Arduino UNO
Arduino UNO is a microcontroller board with a total of 20 input/output pins, separated into digital and analog pins, built around the ATmega328 chip. The prototype uses only the digital section of the board and powers it from a 5 V rechargeable lithium-polymer battery. Because the Arduino offers more digital and analog pins than the microcomputer, it manipulates dependent modules such as the motor drivers and servo motors while keeping the
expenses to a minimum. Apart from hardware manipulation, the PWM pins present in the digital
section of this module also help in timing and interrupt handling.
4.3. Module 3: RF Transceiver
The 27 MHz remotely controlled two-channel push-button transmitter sends signals to the receiver over a radio link. The left button corresponds to the red line and the right button to the blue line. The buttons act as simple switches: when the right button is pushed, its signal is driven HIGH while the other remains LOW, and the left button works in the same way. When a control button is pushed, the transmitter sends the corresponding electrical pulses through the air. The transmitter has its own power source, usually a 9 to 12 V battery, without which it cannot send radio-frequency signals to the receiver. Radio frequency, a short-range signal with good noise immunity, is used to command the robot to set the destination towards which it later moves. Using a 27 MHz carrier also helps minimize latency issues and power consumption.
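The mutually exclusive HIGH/LOW encoding of the two buttons, and the line color each channel selects, can be modeled as below. This is our own illustration of the scheme described above, not the transmitter's firmware.

```python
# Model of the two-button transmitter: exactly one channel is HIGH per
# press, and the receiving side maps the channel states to a line color.

def encode(button):
    """Return (left_signal, right_signal) for a button press."""
    if button == "left":
        return (1, 0)   # left HIGH selects the red line
    if button == "right":
        return (0, 1)   # right HIGH selects the blue line
    return (0, 0)       # no button pressed

def selected_line(signals):
    """Map the received channel states to the line color to follow."""
    return {(1, 0): "red", (0, 1): "blue"}.get(signals, None)
```

The receiver-side microcontroller would feed `selected_line(...)` into the line-follower as the target color.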
4.4. Module 4: Motor driver
The L298N uses two H-bridge circuits to drive low-current-rated motors. This high-power module is connected to the microcontroller and can drive two DC motors, or one stepper motor, with both direction and speed control. The prototype uses this module to operate the DC motors and pumps connected to it while reducing the space needed for the setup.
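Direction control on the L298N comes down to the two input pins of each H-bridge. The table below is the standard one for this driver; the helper itself is our own sketch rather than the prototype's firmware.

```python
# Standard L298N input truth table for one H-bridge channel (IN1/IN2):
# opposite levels select a direction, equal levels stop the motor.

def l298n_inputs(command):
    """Return the (IN1, IN2) levels for one H-bridge channel."""
    table = {
        "forward": (1, 0),
        "reverse": (0, 1),
        "brake":   (1, 1),  # both HIGH: motor terminals shorted, fast stop
        "coast":   (0, 0),  # both LOW: motor free-runs
    }
    return table[command]
```

Speed on top of this is set separately by PWM on the channel's enable pin.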
4.5. Module 5: camera
The Logitech QuickCam Notebook Deluxe is an optical digital camera with 640×480 resolution. Its 0.3-megapixel CMOS sensor captures images and records real-time video by allowing light through the digital lens. Here, the camera module captures a real-time image of the hand within a frame of 0.5 cm in the X direction and 0.8 cm in the Y direction; this information is then sent to the microcomputer for further processing.
4.6. Module 6: Submersible DC Pump
A submersible pump is a form of pressure pump that can be fully submerged in water. Water is pumped out through the end pipe via a shaft attached to the DC motor, which is protected from the water by an insulated plastic coating. As noted above, the pipe used in the prototype is 5 mm in diameter and 30 cm in length; by adjusting the flow rate, the amount of liquid dispensed into the cup can be increased or decreased, and the prototype fills 100 ml in about 3 seconds.
4.7. Module 7: Servo Motor (Sg90)
Servo motors are high-torque rotary actuators employed in fields such as robotics and automation. Because of their current requirements, the power supply must be sufficient to drive all the servos connected to the microcontroller. In this prototype, the SG90 module controls the action of the robotic claw; its geared shaft can be electrically commanded in steps of one degree. The claw is set to a default value of 55 degrees, after which it returns to its initial position.
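An SG90-class servo is commonly commanded with a 50 Hz PWM signal whose pulse width runs nominally from 1 ms (0 degrees) to 2 ms (180 degrees). The converter below is a sketch under that nominal assumption, and real SG90 units often need pulse widths calibrated slightly outside this range.

```python
# Map a servo angle to a PWM duty cycle, assuming the nominal 50 Hz frame
# and a 1 ms .. 2 ms pulse width over the 0-180 degree travel.

PERIOD_MS = 20.0  # one 50 Hz PWM frame

def angle_to_duty(angle_deg):
    """Return the PWM duty cycle in percent for a servo angle (0-180)."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle out of range")
    pulse_ms = 1.0 + (angle_deg / 180.0)  # 1 ms at 0 deg, 2 ms at 180 deg
    return 100.0 * pulse_ms / PERIOD_MS
```

Under this assumption, the claw's 55-degree default corresponds to roughly a 1.31 ms pulse, i.e. about a 6.5% duty cycle.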
4.8. Module 8: Servo motor (mg 996r)
Similar to the SG90, the MG996r is a metal-gear servo motor whose geared shaft is built out of metal and electrically controlled. It can lift heavier, sturdier objects with ease. The prototype uses it in the robotic arm: since the arm is custom-built and bulky, this module preserves the servo's function without being damaged when a heavier object is lifted.
5. Results and Discussion
The proposed prototype addresses the challenge of robot-human interaction. A remote-controlled RF signal is first transmitted to the receiver embedded in the robot's base. After setting its destination coordinate, the robot moves toward the table where the signal originated using the line-follower technique with a color sensor limited to two colors. Control then shifts to the gesture recognition system, which detects the fingers shown within the frame captured by the camera. This framework consists of a representation process and a decision process: the representation process converts the raw numerical data into a form suited to the decision process, which relays the result to the next stage. The robot therefore has to be placed in surroundings with good lighting conditions.
Consequently, the robotic hand, built with two degrees of freedom, grabs the cup in its gripper and swivels to place it below the dispenser at an assigned coordinate. The custom-built dispenser pours the requested beverage, after which the claw grabs the cup again and serves it on a tray. The amount of beverage poured into the cup can be adjusted manually. This prototype is therefore an uncomplicated model that requires further modification and customization before deployment in real scenarios; nevertheless, it is a step that demanded considerable engineering and programming effort.
6. Conclusion
Gesture recognition is a topic in language technology that interprets human gestures via mathematical algorithms, and a field where researchers are actively working to break the barrier between human and robot interaction. Employing gesture recognition reduces the need for handheld devices and opens an avenue for newer specialized interactive devices. Our project thus helps bridge the gap between robots and humans using this technology. It serves as a gateway for those who are deaf or mute: they can take part in social interaction without needing a third person to aid them. Due to the recent pandemic, vendors must adopt innovative strategic ways of serving and attending to guests; moreover, even in hospitals and homes, older adults need assistance getting beverages to satisfy their thirst. By deploying this prototype, these issues can be overcome with ease. Socially and physically challenged people can likewise use such a machine to obtain what they need without another person in their care. A contactless form of interaction is achieved, limiting the spread of germs and viruses and ensuring a clean and hygienic way of serving orders; with contactless forms of service in the near future, user satisfaction and quality of service can increase considerably. Despite its limitations, this prototype can be further enhanced with technologies such as AI and machine learning to detect surrounding objects, and computer vision to aid the robot in dynamic room mapping.
References
[1] https://github.com/lzane/Fingers-Detection-using-OpenCV-and-Python, accessed on 1st Dec 2022.
[2] Masuda T, Misaki D. Development of Japanese Green Tea Serving Robot “T-Bartender”.
Proceeding of the IEEE International Conference on Mechatronics & Automation; July
2005; Niagara Falls, Canada. Fukuroi-shi, Toyosawa, Japan: Department of Mechanical
Engineering, Shizuoka Institute of Science and Technology; 2005. p. 1069- 1074.
[3] Takahashi Y, Hosokawa M, Mochizuki T. Tea Serving Robot. SICE; 29-31 July 1997.
Tokushima, Japan. p. 1111-1114.
[4] Bannach, D., Amft, O., Kunze, K.S., Heinz, E.A. Tröster, G., and Lukowicz, P. Waving real-
hand gestures recorded by wearable motion sensors to a virtual car and driver in a mixed-
reality parking game. In Proceedings of the Second IEEE Symposium on Computational
Intelligence and Games (Honolulu, Apr. 1--5, 2007), 32—39
[5] Chen, Y.T. and Tseng, K.T. Developing a multiple-angle hand-gesture-recognition system for
human-machine interactions. In Proceedings of the 33rd Annual Conference of the IEEE
Industrial Electronics Society (Taipei, Nov. 5--8, 2007), 489—492
[6] Niemelä, Marketta, Päivi Heikkilä, and Hanna Lammi. "A social service robot in a shopping
mall: expectations of the management, retailers and consumers." In Proceedings of the
Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction,
pp. 227-228. 2017.
[7] Chang, Woojung, and Kyoungmi Kate Kim. "Appropriate service robots in exchange and
communal relationships." Journal of Business Research 141 (2022): 462-474.
[8] Tojib, Dewi, Ting Hin Ho, Yelena Tsarenko, and Iryna Pentina. "Service robots or human staff?
The role of performance goal orientation in service robot adoption." Computers in Human
Behavior (2022): 107339.
[9] Takanokura, Masato, Ren Kurashima, Tsubasa Ohhira, Yoshihiro Kawahara, and Mitsuharu
Ogiya. "Implementation and user acceptance of social service robot for an elderly care
program in a daycare facility." Journal of Ambient Intelligence and Humanized Computing
(2021): 1-10.
[10] Prasanalakshmi, B. "Deep Regression hybridized Neural Network in human stress detection." In
2022 International Conference on Smart Technologies and Systems for Next Generation
Computing (ICSTSN), pp. 1-5. IEEE, 2022.
[11] Sabapathy, Sundaresan, Surendar Maruthu, Suresh Kumar Krishnadhas, Ananth Kumar
Tamilarasan, and Nishanth Raghavan. "Competent and Affordable Rehabilitation Robots for
Nervous System Disorders Powered with Dynamic CNN and HMM." Intelligent Systems for
Rehabilitation Engineering (2022): 57-93.
[12] De Smedt, Quentin, Hazem Wannous, and Jean-Philippe Vandeborre. "Skeleton-based dynamic
hand gesture recognition." In Proceedings of the IEEE Conference on Computer Vision and
Pattern Recognition Workshops, pp. 1-9. 2016.
[13] Nunez, Juan C., Raul Cabido, Juan J. Pantrigo, Antonio S. Montemayor, and Jose F. Velez.
"Convolutional neural networks and long short-term memory for skeleton-based human
activity and hand gesture recognition." Pattern Recognition 76 (2018): 80-94.
[14] Tang, Danhang, Hyung Jin Chang, Alykhan Tejani, and Tae-Kyun Kim. "Latent regression
forest: structured estimation of 3d hand poses." IEEE Transactions on Pattern Analysis and
Machine Intelligence 39, no. 7 (2016): 1374-1387.
[15] Kumar, P. Praveen, T. Ananth Kumar, R. Rajmohan, and M. Pavithra. "AI-Based Robotics in E-
Healthcare Applications." In Intelligent Interactive Multimedia Systems for E-Healthcare
Applications, pp. 249-269. Apple Academic Press, 2022.
[16] Yao, Yuan, and Yun Fu. "Contour model-based hand-gesture recognition using the Kinect
sensor." IEEE Transactions on Circuits and Systems for Video Technology 24, no. 11
(2014): 1935-1944.
[17] Pisharady, Pramod Kumar, and Martin Saerbeck. "Recent methods and databases in vision-
based hand gesture recognition: A review." Computer Vision and Image Understanding 141
(2015): 152-165.
[18] Barman, H., Gösta H. Granlund, and Hans Knutsson. "A new approach to curvature estimation
and description." In Third International Conference on Image Processing and its
Applications, 1989., pp. 54-58. IET, 1989.
[19] El Makrini, Ilias, Shirley A. Elprama, Jan Van den Bergh, Bram Vanderborght, Albert-Jan
Knevels, Charlotte IC Jewell, Frank Stals et al. "Working with walt: How a cobot was
developed and inserted on an auto assembly line." IEEE Robotics & Automation Magazine
25, no. 2 (2018): 51-58.
[20] Wadhawan, Ankita, and Parteek Kumar. "Sign language recognition systems: A decade
systematic literature review." Archives of Computational Methods in Engineering 28, no. 3
(2021): 785-813.
[21] Zhang, Jie, Xiao‐Qing Xu, Jun Liu, Lei Li, and Qiong‐Hua Wang. "Three‐dimensional
interaction and autostereoscopic display system using gesture recognition." Journal of the
Society for Information Display 21, no. 5 (2013): 203-208.
TOUCHLESS ECOSYSTEM USING HAND GESTURES
 
Review on Hand Gesture Recognition
Review on Hand Gesture RecognitionReview on Hand Gesture Recognition
Review on Hand Gesture Recognition
 
Human Computer Interaction Based HEMD Using Hand Gesture
Human Computer Interaction Based HEMD Using Hand GestureHuman Computer Interaction Based HEMD Using Hand Gesture
Human Computer Interaction Based HEMD Using Hand Gesture
 
Ay4103315317
Ay4103315317Ay4103315317
Ay4103315317
 
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNINGSLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
 
14 561
14 56114 561
14 561
 
An analysis of desktop control and information retrieval from the internet us...
An analysis of desktop control and information retrieval from the internet us...An analysis of desktop control and information retrieval from the internet us...
An analysis of desktop control and information retrieval from the internet us...
 
An analysis of desktop control and information retrieval from the internet us...
An analysis of desktop control and information retrieval from the internet us...An analysis of desktop control and information retrieval from the internet us...
An analysis of desktop control and information retrieval from the internet us...
 
Scheme for motion estimation based on adaptive fuzzy neural network
Scheme for motion estimation based on adaptive fuzzy neural networkScheme for motion estimation based on adaptive fuzzy neural network
Scheme for motion estimation based on adaptive fuzzy neural network
 
A Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand GesturesA Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand Gestures
 
Gesture Recognition System
Gesture Recognition SystemGesture Recognition System
Gesture Recognition System
 
Hand Gesture Recognition System for Human-Computer Interaction with Web-Cam
Hand Gesture Recognition System for Human-Computer Interaction with Web-CamHand Gesture Recognition System for Human-Computer Interaction with Web-Cam
Hand Gesture Recognition System for Human-Computer Interaction with Web-Cam
 
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture RecognitionIRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture Recognition
 
Indian Sign Language Recognition using Vision Transformer based Convolutional...
Indian Sign Language Recognition using Vision Transformer based Convolutional...Indian Sign Language Recognition using Vision Transformer based Convolutional...
Indian Sign Language Recognition using Vision Transformer based Convolutional...
 
Natural Hand Gestures Recognition System for Intelligent HCI: A Survey
Natural Hand Gestures Recognition System for Intelligent HCI: A SurveyNatural Hand Gestures Recognition System for Intelligent HCI: A Survey
Natural Hand Gestures Recognition System for Intelligent HCI: A Survey
 
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
 
How Cyber-Physical Systems Are Reshaping the Robotics Landscape
How Cyber-Physical Systems Are Reshaping the Robotics LandscapeHow Cyber-Physical Systems Are Reshaping the Robotics Landscape
How Cyber-Physical Systems Are Reshaping the Robotics Landscape
 
Paper id 21201494
Paper id 21201494Paper id 21201494
Paper id 21201494
 
Controlling Computer using Hand Gestures
Controlling Computer using Hand GesturesControlling Computer using Hand Gestures
Controlling Computer using Hand Gestures
 

More from Christo Ananth

Call for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in Scopus
Call for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in ScopusCall for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in Scopus
Call for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in ScopusChristo Ananth
 
Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...
Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...
Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...Christo Ananth
 
Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...
Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...
Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...Christo Ananth
 
Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...
Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...
Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...Christo Ananth
 
Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...
Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...
Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...Christo Ananth
 
Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...
Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...
Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...Christo Ananth
 
Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...
Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...
Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...Christo Ananth
 
Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...
Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...
Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...Christo Ananth
 
Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...
Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...
Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...Christo Ananth
 
Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...
Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...
Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...Christo Ananth
 
Call for Papers - International Journal of Intelligent Systems and Applicatio...
Call for Papers - International Journal of Intelligent Systems and Applicatio...Call for Papers - International Journal of Intelligent Systems and Applicatio...
Call for Papers - International Journal of Intelligent Systems and Applicatio...Christo Ananth
 
Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...
Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...
Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...Christo Ananth
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Christo Ananth
 
Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...
Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...
Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...Christo Ananth
 
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Christo Ananth
 
Virtual Science Is A New Scientific Paradigm
Virtual Science Is A New Scientific ParadigmVirtual Science Is A New Scientific Paradigm
Virtual Science Is A New Scientific ParadigmChristo Ananth
 
Wind Energy Harvesting: Technological Advances and Environmental Impacts
Wind Energy Harvesting: Technological Advances and Environmental ImpactsWind Energy Harvesting: Technological Advances and Environmental Impacts
Wind Energy Harvesting: Technological Advances and Environmental ImpactsChristo Ananth
 
Hydrogen Economy: Opportunities and Challenges for a Sustainable Future
Hydrogen Economy: Opportunities and Challenges for a Sustainable FutureHydrogen Economy: Opportunities and Challenges for a Sustainable Future
Hydrogen Economy: Opportunities and Challenges for a Sustainable FutureChristo Ananth
 
The Economics of Transitioning to Renewable Energy Sources
The Economics of Transitioning to Renewable Energy SourcesThe Economics of Transitioning to Renewable Energy Sources
The Economics of Transitioning to Renewable Energy SourcesChristo Ananth
 
Lifecycle Assessment of Solar PV Systems: From Manufacturing to Recycling
Lifecycle Assessment of Solar PV Systems: From Manufacturing to RecyclingLifecycle Assessment of Solar PV Systems: From Manufacturing to Recycling
Lifecycle Assessment of Solar PV Systems: From Manufacturing to RecyclingChristo Ananth
 

More from Christo Ananth (20)

Call for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in Scopus
Call for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in ScopusCall for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in Scopus
Call for Papers - Utilitas Mathematica, E-ISSN: 0315-3681, indexed in Scopus
 
Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...
Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...
Call for Chapters- Edited Book: Quantum Networks and Their Applications in AI...
 
Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...
Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...
Call for Chapters- Edited Book: Real World Challenges in Quantum Electronics ...
 
Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...
Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...
Call for Chapters- Edited Book: Real-World Applications of Quantum Computers ...
 
Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...
Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...
Call for Papers- Thematic Issue: Food, Drug and Energy Production, PERIÓDICO ...
 
Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...
Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...
Call for Papers - PROCEEDINGS ON ENGINEERING SCIENCES, P-ISSN-2620-2832, E-IS...
 
Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...
Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...
Call for Papers - Onkologia i Radioterapia, P-ISSN-1896-8961, E-ISSN 2449-916...
 
Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...
Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...
Call for Papers - Journal of Indian School of Political Economy, E-ISSN 0971-...
 
Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...
Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...
Call for Papers - Journal of Ecohumanism (JoE), ISSN (Print): 2752-6798, ISSN...
 
Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...
Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...
Call for Papers- Journal of Wireless Mobile Networks, Ubiquitous Computing, a...
 
Call for Papers - International Journal of Intelligent Systems and Applicatio...
Call for Papers - International Journal of Intelligent Systems and Applicatio...Call for Papers - International Journal of Intelligent Systems and Applicatio...
Call for Papers - International Journal of Intelligent Systems and Applicatio...
 
Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...
Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...
Call for Papers- Special Issue: Recent Trends, Innovations and Sustainable So...
 
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
Call for Papers - Educational Administration: Theory and Practice, E-ISSN: 21...
 
Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...
Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...
Call for Papers - Bharathiya Shiksha Shodh Patrika, E-ISSN 0970-7603, indexed...
 
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
 
Virtual Science Is A New Scientific Paradigm
Virtual Science Is A New Scientific ParadigmVirtual Science Is A New Scientific Paradigm
Virtual Science Is A New Scientific Paradigm
 
Wind Energy Harvesting: Technological Advances and Environmental Impacts
Wind Energy Harvesting: Technological Advances and Environmental ImpactsWind Energy Harvesting: Technological Advances and Environmental Impacts
Wind Energy Harvesting: Technological Advances and Environmental Impacts
 
Hydrogen Economy: Opportunities and Challenges for a Sustainable Future
Hydrogen Economy: Opportunities and Challenges for a Sustainable FutureHydrogen Economy: Opportunities and Challenges for a Sustainable Future
Hydrogen Economy: Opportunities and Challenges for a Sustainable Future
 
The Economics of Transitioning to Renewable Energy Sources
The Economics of Transitioning to Renewable Energy SourcesThe Economics of Transitioning to Renewable Energy Sources
The Economics of Transitioning to Renewable Energy Sources
 
Lifecycle Assessment of Solar PV Systems: From Manufacturing to Recycling
Lifecycle Assessment of Solar PV Systems: From Manufacturing to RecyclingLifecycle Assessment of Solar PV Systems: From Manufacturing to Recycling
Lifecycle Assessment of Solar PV Systems: From Manufacturing to Recycling
 

Recently uploaded

Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxupamatechverse
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSSIVASHANKAR N
 
Introduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxIntroduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxupamatechverse
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escortsranjana rawat
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
UNIT-III FMM. DIMENSIONAL ANALYSIS
UNIT-III FMM.        DIMENSIONAL ANALYSISUNIT-III FMM.        DIMENSIONAL ANALYSIS
UNIT-III FMM. DIMENSIONAL ANALYSISrknatarajan
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxupamatechverse
 
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service NashikCall Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...ranjana rawat
 
SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )Tsuyoshi Horigome
 
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Call Girls in Nagpur High Profile
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxpurnimasatapathy1234
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...ranjana rawat
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 

Recently uploaded (20)

Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptx
 
Water Industry Process Automation & Control Monthly - April 2024
Water Industry Process Automation & Control Monthly - April 2024Water Industry Process Automation & Control Monthly - April 2024
Water Industry Process Automation & Control Monthly - April 2024
 
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLSMANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
MANUFACTURING PROCESS-II UNIT-5 NC MACHINE TOOLS
 
Roadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and RoutesRoadmap to Membership of RICS - Pathways and Routes
Roadmap to Membership of RICS - Pathways and Routes
 
Introduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxIntroduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptx
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
UNIT-III FMM. DIMENSIONAL ANALYSIS
UNIT-III FMM.        DIMENSIONAL ANALYSISUNIT-III FMM.        DIMENSIONAL ANALYSIS
UNIT-III FMM. DIMENSIONAL ANALYSIS
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptx
 
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service NashikCall Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
 
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
The Most Attractive Pune Call Girls Budhwar Peth 8250192130 Will You Miss Thi...
 
SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )
 
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...Top Rated  Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
Top Rated Pune Call Girls Budhwar Peth ⟟ 6297143586 ⟟ Call Me For Genuine Se...
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptx
 
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINEDJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
DJARUM4D - SLOT GACOR ONLINE | SLOT DEMO ONLINE
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
The Most Attractive Pune Call Girls Manchar 8250192130 Will You Miss This Cha...
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
 

Social Service Robot using Gesture recognition technique

Abstract. A robot is a machine that can automatically perform a task, or a series of tasks, based on its programming and environment. Robots are artificially built machines or devices that can carry out activities with utmost accuracy and precision while minimizing time constraints. Service robots are technologically advanced machines deployed to carry out and maintain specific activities, and research findings show that serving robots are now being deployed worldwide. Social robotics is one such field, centred on interaction between humans and artificially built machines: these machines interact with humans and can also understand social terms and words. Modernization has brought changes in design and mechanisms owing to this continuing growth in technology and innovation.
Therefore, the food industry is also dynamically adapting to automation in order to reduce human workload and increase the quality of service. Deploying a robot in the food industry to aid deaf and mute people, who face social constraints, is a challenge engineers have faced for decades. Moreover, a contactless, speedy service system that accomplishes its task with utmost precision and reduced complexity is a feat yet to be perfected. The proposed system preserves personal hygiene, improves the quality of service, and reduces labour costs.

1. Introduction
Social service robots, in contrast to their industrial counterparts, have a definite role to play. Frequent enhancements in technology and innovation have led to continuous development in robotic systems [1][2]. Intelligent robotic systems deploy technologies such as gesture and voice recognition to reduce human workload while maintaining the quality of the service rendered. The robot is summoned using a button embedded in an RF transmitter module. After reception of the signal, a color sensor attached to the robot uses line following to move along a predetermined path to its destination [3-5]. A gesture recognition system programmed on a Raspberry Pi 4 captures the gesture shown by the user and serves the desired item, such as water, tea, coffee, or another beverage, without any physical contact. On receiving the relevant input from the user, the robotic arm, designed using MG996R and SG90 servos, can grab the
cup and place it beneath the dispenser. The custom-built dispenser, using an L298N driver and a submersible water pump, transfers the requested beverage to the cup. The robotic arm then serves the requested particulars on a tray. Related work is reviewed in Section 2, the working prototype is described in Section 3, and the hardware modules are discussed in Section 4. The effort required for such people to get their particulars without any hassle can be eliminated through this method. Robots help reduce the physical stress level of an employee and maintain a cleaner ambiance without compromising food quality. A social service robot should aid people in diverse settings, such as a shop, restaurant, hospital, or even a home. The sole purpose of this robot is to increase robot-human interaction through sophisticated technological advancements. This paper interprets the basic operation, which can be applied to various places for specific tasks[5-9].

2. Related Works
Due to technological advances, robots are becoming more common. In the near future, we may see intelligent robots helping people in need. Human-robot interaction research is becoming more critical as it integrates hand sign recognition components[10]. Hand gesture recognition may make human-robot communication more natural, helping humans and robots collaborate to increase application efficiency and avoid problems[11]. De Smedt et al. classified hand gestures using the SVM algorithm and skeletal and depth data[12]. Nunez et al. classified hand gestures using an HMM and an SVM; image acquisition and segmentation used Kinect sensor data and a skeleton-based method[13]. According to Tang et al. [14], hand gestures can be tracked in real time using a recursive connected-component algorithm and the 3D geodesic distance of pixelated hand skeletons. Praveen kumar et al.
studied nonverbal communication and an R-CNN to improve avocado harvesting in a simulated workspace[15], improving efficiency. A robot located workers and determined whether they needed help by recognizing human activity, hand gestures, or flags; CNN and LSTM (long short-term memory) networks were used, and the harvester learned from hand gestures. Hand gesture recognition results may vary based on image color and texture: skin color differs between people and regions, and lighting affects both color and texture. Shape-based features can also be used to recognize hand gestures, and this work takes that approach, since normal thumb and finger lengths are about the same on both hands. The frame rate of hand shape-based gesture recognition is comparable to most existing systems, and the number and accuracy of recognized gestures were among the best. Robots must understand what humans are communicating to collaborate effectively with them, and humans and robots must communicate using natural gestures in HRC manufacturing[16]. The hand is differentiated from its surroundings using skin tone, and principal component analysis categorized all eight static gestures. Pisharady et al. use a restricted coulomb energy (RCE) neural network to separate a hand from an image; a second RCE neural network is trained to recognize static hand gestures from the hand-to-arm distance and the number of spread fingers, achieving 95% accuracy with an eight-gesture lexicon[17]. A color camera captures real-time images in full color; in Otsu segmentation, the Y-Cb-Cr color space distinguishes the moving hand from the constant background. The k-curvature algorithm [18] determines an image's high and low points, and a gesture's peak-to-valley ratio determines its group. The judgments can be 95.2% accurate. This system recognizes six gestures, and it assumes that the hands are the most accessible part of the camera's view.
The hand's orientation can be determined using a vector from the center of the hand to its farthest point. The robot's movement is controlled by the hand's orientation, while its straight-line velocity is determined by the hand's distance in the image. El Makrini et al. found that hand shape can control a robot: instead of using high-level depth and color features, all intensity values in a box around the hand are used[19], replacing explicit feature extraction. The average neighborhood margin maximization (ANMM) algorithm reduces the dimensionality of the feature space, and four hand gestures are classified using the nearest-neighbor classifier. Wadhawan et al. [20] classify six hand gestures using a geometric property: in color images, hand skin color is evaluated to identify the hand, and the distance between the hand's center and each outline point is calculated. A gesture's number of peaks and valleys determines its category. Zhang et al. created a system that
recognizes hand gestures online in real time[21]. The chamfer distance aligns a hand template with an image's edges after removing the person's body; this step follows body segmentation. The hand is then fitted with a realistic skin-color model. Tracing the hand's center across multiple frames yields the final feature: a vector giving hand location and motion. Hidden Markov models learn the hand movements. Owing to color cameras' high frame rates, most current techniques use video sequences. Human-robot interaction (HRI) has become a focus of research in computer vision and robotics due to the broad range of applications in the field of human-computer interaction (HCI).

3. Working Prototype
The robot, which comprises different modules, is primarily stationed at the origin of the desired area of use. The following techniques are adapted to demonstrate the basic idea of this project. Figure 1 shows the block diagram of the proposed system.

3.1. Transmission and Reception
A transmitter-receiver pair is enclosed in the base of the robot and operated at a frequency of 27 MHz. Radio frequency transmission uses radio waves to transfer signals from the transmitter to the receiver by adjusting current and voltage parameters that alter the oscillation rate; signals from 20 kHz to around 200 GHz can be sent. When the robot is summoned at the user's side, an RF signal is transmitted from the transmitter situated at the table to the receiver housed within the base of the robot. This receiver module is connected to the microcontroller that initiates the robot's movement. Consequently, the base of the robot moves to the respective table where the user is seated, guided by the TCS3200 module.

Figure 1. Block diagram of the proposed system

3.2. Line follower with color sensor
After successful reception of the signal, the robot must reach the table from which it was summoned. DC motors are powered by the motor driver, which makes forward and backward movement possible depending on the command given to it by the microcontroller. The base of the robot therefore has to plan its path, and it utilizes a color sensor to check the path that the robot takes. The
color sensor module is calibrated to detect two colors, red and blue, and the path to the table is laid using these two colors. Applying the principles of the line-follower technique, the robot senses the right color and reaches the table where the person is seated.

3.3. Gesture recognition
The camera mounted at the head of the robot captures a frame containing a real-time image of the hand. The threshold of the frame is adjusted manually from its default value according to the area's lighting conditions, and a blur is applied to suppress unnecessary areas. The frame is then converted to grayscale. Finally, the microcomputer interprets the gesture shown in the frame as a value that is stored in a list.

3.4. I2C interfacing
After the correct count is obtained, the value is passed to the microcontroller. The microcontroller is powered by a 5 V rechargeable lithium-polymer battery, whereas the Raspberry Pi 4 uses 3.3 V on its GPIO bus. The essential difference between the Arduino and the Pi is the number of I/O ports they contain; the Arduino in particular provides several analog and digital ports, which can also be used to handle interrupts and timing circuits. Therefore, to utilize their maximum potential, the two boards are interfaced so that they can communicate with each other, using pre-built and user-defined functions in Python and the Arduino IDE.

3.5. Dispenser
The dispenser is built using a 12 V DC pump rather than a 9 V pump with fewer rotations per minute. Being a submersible water pump minimizes expenses and makes replacement easier if the pump does not function properly.
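The gesture-to-order handoff described in Sections 3.3 and 3.4 can be sketched in Python. Here the detected finger count selects a beverage code that would be written to the Arduino over I2C; the menu, slave address, and register layout are illustrative assumptions rather than values from the prototype, and the bus write is injected so the sketch runs without hardware:

```python
# Illustrative sketch: map a detected finger count to a beverage command
# byte for the I2C transfer from the Raspberry Pi to the Arduino.
# The menu, address, and byte values are assumptions, not from the paper.

I2C_ADDR = 0x08  # hypothetical Arduino slave address

MENU = {
    1: ("water", 0x01),
    2: ("tea", 0x02),
    3: ("coffee", 0x03),
}

def count_to_command(finger_count: int) -> tuple[str, int]:
    """Translate a finger count into (beverage, command byte).

    Raises ValueError for counts outside the calibrated menu, so a
    misread frame never triggers the dispenser.
    """
    try:
        return MENU[finger_count]
    except KeyError:
        raise ValueError(f"unrecognized gesture count: {finger_count}")

def send_order(bus_write, finger_count: int) -> int:
    """Pack the order and hand it to a bus-write callable.

    `bus_write` stands in for an I2C write such as
    smbus2.SMBus.write_byte_data on real hardware; injecting it keeps
    the logic testable without a bus.
    """
    _, code = count_to_command(finger_count)
    bus_write(I2C_ADDR, 0x00, code)  # register 0x00: "order" (assumed)
    return code
```

On the robot, `bus_write` would be bound to an smbus2 handle on the Pi's I2C bus; the Arduino sketch on the other side would read the byte in its slave receive handler.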
The pipe wound around the DC motor must be checked and analyzed before use. The dimensions of the pipe depend on the liquid that flows through it and the pressure required to push the fluid from the storage unit to the cup. The dispenser can serve up to two different drinks. The pipe used in the prototype is 5 mm in diameter and 30 cm in length. By adjusting the flow rate, the amount of liquid poured into the cup can be increased or decreased; this prototype can fill 100 ml into the cup in about 3 seconds.

3.6. Robotic arm
A robotic arm is placed to the right of the dispenser and places the cup below the dispenser at a fixed coordinate. After the cup is filled, the robotic arm serves the cup on top of a tray. The arm is manipulated by the microcontroller and maneuvered using a metal-gear servo motor. Constructed with two degrees of freedom, it can lift heavy objects, so the user can place heavier cups between its claws. The prototype can be fitted with a cup holder if the user requires it; for now, the cup is placed in the robot's hands to receive the particulars. This eliminates any doubt the user may have about the cleanliness of the cup used.

4. Module Description

4.1. Module 1: Raspberry Pi 4
The proposed prototype is shown in Figure 2. The Raspberry Pi 4 is the updated version of the Pi series of microcomputers. It is a 64-bit processor with four cores, and it offers wireless, Ethernet, and Bluetooth connectivity. These cores operate at a
synchronized speed of 1.5 GHz, making it a computer with the smallest form factor possible. In this prototype, it coordinates the working of each module. The Pi transfers data between itself and the microcontroller, establishing two-way I2C (Inter-Integrated Circuit) communication. It also processes the finger count captured by the camera, schedules tasks for each module connected to the microcontroller, and handles interrupts. However, due to its small form factor, there are some notable limitations, such as time delays and thermal issues. The Pi can also be used as a microcontroller by connecting an external module.

Figure 2. Proposed prototype

4.2. Module 2: Arduino UNO
The Arduino UNO is a microcontroller board with a total of 20 input/output pins, separated into digital and analog pins, built around the ATmega328 chip. The prototype only utilizes the digital section of the board and is powered by a 5 V rechargeable lithium-polymer battery. The Arduino provides more digital and analog pins than the microcomputer, so it is used to drive dependent modules such as the motor drivers and servo motors, keeping the
expenses to a minimum. Apart from hardware manipulation, the PWM pins in the digital section of this board also help with timing and interrupt handling.

4.3. Module 3: RF Transceiver
The 27 MHz remotely controlled two-channel button-type transmitter sends signals to the receiver using radio waves. The left button signifies the red color line and the right button the blue color line. The buttons work on the principle of a switch: when the right button is pushed, its signal is driven HIGH while the other signal stays LOW, and the left button works in the same way. Accordingly, when a control button is pushed, the transmitter sends the corresponding electrical pulses through the air. The transmitter has its own power source, usually a 9 to 12 V battery; without the battery, it cannot send radio frequency signals to the receiver. Radio frequency, a short-distance signal with good noise immunity and resistance, is used to command the robot to set the destination toward which it later moves. Using a 27 MHz carrier also helps minimize latency and power consumption.

4.4. Module 4: Motor driver
The L298N uses two H-bridge circuits to control low-current-rated motors. This high-power module is connected to the microcontroller and can drive loads such as DC motors, pumps, and stepper motors with direction and speed control. The prototype uses this module to operate the DC motors and pump while reducing the space needed for the setup.

4.5. Module 5: Camera
The Logitech QuickCam Notebook Deluxe is an optical digital camera with 640x480 resolution.
The sensor is a 0.3-megapixel CMOS type able to capture images and record real-time video by allowing light to pass through the digital lens. Here, the camera module captures a real-time image of the hand within a frame of 0.5 cm in the X direction and 0.8 cm in the Y direction; this information is then sent to the microcomputer for further processing.

4.6. Module 6: Submersible DC Pump
A submersible pump is a form of pressure pump that can be fully submerged in water. Using a shaft attached to the DC motor, the water is pumped out through the end pipe; the DC motor is protected from water by an insulated plastic coating. The pipe used in the prototype is 5 mm in diameter and 30 cm in length. By adjusting the rate of flow, the amount of liquid dispensed into the cup can be increased or decreased, and the prototype can fill 100 ml into the cup in about 3 seconds.

4.7. Module 7: Servo Motor (SG90)
Servo motors are high-torque, shaft-driven devices employed in fields such as robotics and automation. Due to their high current requirements, the power supply must be sufficient to power all the servos connected to the microcontroller. In this prototype, the SG90 controls the action of the robotic claw. The geared shaft in the servo motor can be electrically controlled and rotates 1 degree at a time. The robotic claw is set to a default value of 55 degrees, after which it returns to its initial position.
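Degree-level positioning such as the claw's 55-degree default comes from the servo's PWM interface. A minimal sketch of the angle-to-duty-cycle conversion, assuming the commonly quoted SG90 timing (50 Hz frame, roughly 0.5 ms to 2.5 ms pulse across 0 to 180 degrees; real units vary, so these constants are assumptions):

```python
# Sketch of SG90 angle-to-PWM conversion, assuming the commonly quoted
# timing: a 50 Hz frame (20 ms period) and a 0.5 ms..2.5 ms pulse
# spanning 0..180 degrees. Individual servos differ, so the constants
# below are nominal assumptions, not measured values.

FRAME_MS = 20.0      # 50 Hz PWM frame
MIN_PULSE_MS = 0.5   # pulse width at 0 degrees (assumed)
MAX_PULSE_MS = 2.5   # pulse width at 180 degrees (assumed)

def pulse_width_ms(angle_deg: float) -> float:
    """Linear map from servo angle to pulse width in milliseconds."""
    if not 0.0 <= angle_deg <= 180.0:
        raise ValueError("SG90 range is 0-180 degrees")
    span = MAX_PULSE_MS - MIN_PULSE_MS
    return MIN_PULSE_MS + span * angle_deg / 180.0

def duty_cycle_percent(angle_deg: float) -> float:
    """Duty cycle, e.g. for RPi.GPIO's PWM ChangeDutyCycle()."""
    return pulse_width_ms(angle_deg) / FRAME_MS * 100.0
```

At the claw's 55-degree default this gives a pulse of about 1.11 ms, i.e. a duty cycle of roughly 5.6%, which could be fed to a 50 Hz PWM channel on the microcontroller.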
4.8. Module 8: Servo Motor (MG996R)
Similar to the SG90, the MG996R is a metal-gear servo motor whose geared shaft is built out of metal and electrically controlled. This module can lift heavier, sturdier objects with ease. The prototype uses it in the robotic arm: since the arm is custom-built and bulky, this module preserves the servo's function without damage when a heavier object is lifted.

5. Results and Discussion
The proposed prototype helps to tackle the issue of robot-human interaction. A remote-controlled RF signal is first transmitted to the receiver embedded in the robot's base. After setting its destination coordinate, the robot moves toward the table from which the signal was transmitted, using the line-follower technique with a color sensor limited to two colors. Control then shifts to the gesture recognition system, which detects the fingers shown within the frame captured by the camera. This framework consists of a representation process and a decision process: the representation process converts the raw numerical data into a form adapted to the decision process, which then relays this information to the next stage. The robot therefore has to be placed in surroundings with good lighting conditions. The robotic hand, built with 2 degrees of freedom, then activates, grabbing the cup within its gripper and swiveling to place it below the dispenser at an assigned coordinate. The custom-built dispenser pours the required particulars, after which the claw grabs the cup again and serves it on a tray. The amount of beverage poured into the cup can be adjusted manually.
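The dispensing step above can be made concrete with a small timing calculation. Assuming the flow rate implied by the prototype's quoted figure (about 100 ml in 3 seconds), the pump-on time for a requested volume follows directly; the per-beverage serving sizes below are hypothetical:

```python
# Dispense-time sketch based on the prototype's quoted figure of
# roughly 100 ml in 3 s (a flow rate of about 33.3 ml/s). The serving
# sizes are illustrative assumptions, not values from the paper.

FLOW_ML_PER_S = 100.0 / 3.0  # derived from "100 ml in about 3 seconds"

def pump_on_time_s(volume_ml: float) -> float:
    """Seconds to keep the pump energized for a requested volume."""
    if volume_ml <= 0:
        raise ValueError("volume must be positive")
    return volume_ml / FLOW_ML_PER_S

# Hypothetical serving sizes per beverage:
SERVING_ML = {"water": 150.0, "tea": 100.0, "coffee": 80.0}

def serving_time_s(beverage: str) -> float:
    """Pump-on time for one serving of the given beverage."""
    return pump_on_time_s(SERVING_ML[beverage])
```

On the prototype this on-time would become a timed digital-output pulse from the Arduino to the L298N channel driving the pump.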
Therefore, this prototype is an uncomplicated model that requires further modification and customization before deployment in real-time scenarios. Nevertheless, it is a step that involves intensive engineering and programming to achieve this feat.

6. Conclusion
Gesture recognition is a topic of language technology concerned with interpreting human gestures via mathematical algorithms. It is a field in which researchers are actively working to break the barrier between human and robot interaction. The need for handheld devices can be reduced by employing gesture recognition, which opens an avenue for newer specialized interactive devices. Our project thus helps bridge the gap between robots and humans using this technology. It serves as a gateway for those who are deaf and mute: they can now engage in social interaction without needing a third person to aid them. Due to the recent pandemic, vendors must adopt innovative strategies for serving and attending to guests. Moreover, in hospitals and homes, older adults need assistance in getting beverages to satisfy their thirst. By employing this prototype, these issues can be overcome with ease. Socially and physically challenged people can also use such a machine to serve their particulars without needing another human being in their care. A contactless form of interaction is achieved, limiting the spread of germs and viruses and ensuring a clean, hygienic way of serving the ordered particulars. With contactless forms of service, users' satisfaction and quality of service can increase substantially. Despite its limitations, this prototype can be further enhanced with technologies such as AI and machine learning to detect surrounding objects, and computer vision to aid the robot in dynamic room mapping.

References
[1] https://github.com/lzane/Fingers-Detection-using-OpenCV-and-Python, accessed on 10th Nov 2022.
[2] Masuda T, Misaki D.
Development of Japanese Green Tea Serving Robot "T-Bartender". Proceedings of the IEEE International Conference on Mechatronics & Automation; July 2005; Niagara Falls, Canada. Fukuroi-shi, Toyosawa, Japan: Department of Mechanical
Engineering, Shizuoka Institute of Science and Technology; 2005. p. 1069-1074.
[3] Takahashi Y, Hosokawa M, Mochizuki T. Tea Serving Robot. SICE; 29-31 July 1997; Tokushima, Japan. p. 1111-1114.
[4] Bannach, D., Amft, O., Kunze, K.S., Heinz, E.A., Tröster, G., and Lukowicz, P. Waving real-hand gestures recorded by wearable motion sensors to a virtual car and driver in a mixed-reality parking game. In Proceedings of the Second IEEE Symposium on Computational Intelligence and Games (Honolulu, Apr. 1-5, 2007), 32-39.
[5] Chen, Y.T. and Tseng, K.T. Developing a multiple-angle hand-gesture-recognition system for human-machine interactions. In Proceedings of the 33rd Annual Conference of the IEEE Industrial Electronics Society (Taipei, Nov. 5-8, 2007), 489-492.
[6] Niemelä, Marketta, Päivi Heikkilä, and Hanna Lammi. "A social service robot in a shopping mall: expectations of the management, retailers and consumers." In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, pp. 227-228. 2017.
[7] Chang, Woojung, and Kyoungmi Kate Kim. "Appropriate service robots in exchange and communal relationships." Journal of Business Research 141 (2022): 462-474.
[8] Tojib, Dewi, Ting Hin Ho, Yelena Tsarenko, and Iryna Pentina. "Service robots or human staff? The role of performance goal orientation in service robot adoption." Computers in Human Behavior (2022): 107339.
[9] Takanokura, Masato, Ren Kurashima, Tsubasa Ohhira, Yoshihiro Kawahara, and Mitsuharu Ogiya. "Implementation and user acceptance of social service robot for an elderly care program in a daycare facility." Journal of Ambient Intelligence and Humanized Computing (2021): 1-10.
[10] Prasanalakshmi, B. "Deep Regression hybridized Neural Network in human stress detection."
In 2022 International Conference on Smart Technologies and Systems for Next Generation Computing (ICSTSN), pp. 1-5. IEEE, 2022.
[11] Sabapathy, Sundaresan, Surendar Maruthu, Suresh Kumar Krishnadhas, Ananth Kumar Tamilarasan, and Nishanth Raghavan. "Competent and Affordable Rehabilitation Robots for Nervous System Disorders Powered with Dynamic CNN and HMM." Intelligent Systems for Rehabilitation Engineering (2022): 57-93.
[12] De Smedt, Quentin, Hazem Wannous, and Jean-Philippe Vandeborre. "Skeleton-based dynamic hand gesture recognition." In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 1-9. 2016.
[13] Nunez, Juan C., Raul Cabido, Juan J. Pantrigo, Antonio S. Montemayor, and Jose F. Velez. "Convolutional neural networks and long short-term memory for skeleton-based human activity and hand gesture recognition." Pattern Recognition 76 (2018): 80-94.
[14] Tang, Danhang, Hyung Jin Chang, Alykhan Tejani, and Tae-Kyun Kim. "Latent regression forest: structured estimation of 3d hand poses." IEEE Transactions on Pattern Analysis and Machine Intelligence 39, no. 7 (2016): 1374-1387.
[15] Kumar, P. Praveen, T. Ananth Kumar, R. Rajmohan, and M. Pavithra. "AI-Based Robotics in E-Healthcare Applications." In Intelligent Interactive Multimedia Systems for E-Healthcare Applications, pp. 249-269. Apple Academic Press, 2022.
[16] Yao, Yuan, and Yun Fu. "Contour model-based hand-gesture recognition using the Kinect sensor." IEEE Transactions on Circuits and Systems for Video Technology 24, no. 11 (2014): 1935-1944.
[17] Pisharady, Pramod Kumar, and Martin Saerbeck. "Recent methods and databases in vision-based hand gesture recognition: A review." Computer Vision and Image Understanding 141 (2015): 152-165.
[18] Barman, H., Gösta H. Granlund, and Hans Knutsson. "A new approach to curvature estimation and description." In Third International Conference on Image Processing and its Applications, 1989, pp. 54-58. IET, 1989.
[19] El Makrini, Ilias, Shirley A. Elprama, Jan Van den Bergh, Bram Vanderborght, Albert-Jan Knevels, Charlotte IC Jewell, Frank Stals et al. "Working with walt: How a cobot was developed and inserted on an auto assembly line." IEEE Robotics & Automation Magazine 25, no. 2 (2018): 51-58.
[20] Wadhawan, Ankita, and Parteek Kumar. "Sign language recognition systems: A decade systematic literature review." Archives of Computational Methods in Engineering 28, no. 3 (2021): 785-813.
[21] Zhang, Jie, Xiao-Qing Xu, Jun Liu, Lei Li, and Qiong-Hua Wang. "Three-dimensional interaction and autostereoscopic display system using gesture recognition." Journal of the Society for Information Display 21, no. 5 (2013): 203-208.