This document describes a gesture-controlled robot project. The goal is to create a robot that can be controlled through hand gestures rather than a remote control, to allow for contactless delivery in situations where social distancing is required, such as hospitals. It discusses the literature around existing remote-controlled robots and human-machine interfaces. The objective is to make an affordable and user-friendly robot that can be used for various purposes. The methodology describes using sensors to detect hand gestures and send signals to a microcontroller and transmitter, which in turn control a receiver and motors on the robot. Components include an Arduino, an RF module, an accelerometer, and a motor driver. Circuit diagrams and a demo image are provided. The conclusion states that the goal was accomplished by controlling robot movements through hand gestures.
3. INTRODUCTION
This project aims to build a hand gesture-controlled robot that can be used in hospitals, shops, hotels, homes, etc., where contactless delivery is necessary so that social distancing is strictly followed amid the COVID-19 situation.
4. LITERATURE SURVEY
Conventional remote-controlled robots tend to be extremely bulky, and the remote additionally restricts the robot's range of operation from the operator. For this reason, there have been several advancements in the realm of human-machine interaction.
These aim to expand the use of robots in areas where dependency on manpower can be reduced, such as in the field of medicine, in operating theatres to move surgery carts, in the automobile assembly industry, in restaurants for serving food to tables, and especially in conditions that are not suitable, easily accessible, or secure for humans.
5. OBJECTIVE
The objective of this model is to create a convenient, easy-to-operate human-machine interface for controlling a robot, be it a vehicle, robotic arm, toy, or wheelchair. Our goal is to make this gadget user-friendly and cost-effective, so that it is affordable to the general public, accessible to a large market, and usable for a number of purposes.
6. METHODOLOGY
1. The hand gesture movement is sensed and passed to the microcontroller.
2. The MCU receives the data, forms instructions for the robot, and sends the instructions to the encoder IC.
3. The encoded data is transmitted through the transmitter.
4. At the receiver end, the receiver receives the encoded data.
5. The receiver sends the encoded data to the decoder.
6. The decoder decodes the data and sends it to the motor driver.
7. The motor driver drives the motors in all movements, following the instructions and gestures.
8. Finally, the robot moves with the gestures.
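Steps 1 and 2 above can be sketched in code as a classification of raw accelerometer tilt readings into the four gesture commands. This is a minimal sketch, assuming the sensor (e.g. an ADXL335-class accelerometer) outputs two tilt values centred on zero; the axis assignments and the `TILT_THRESHOLD` deadband value are illustrative assumptions, not the project's actual calibration.

```c
/* Classify two tilt readings (centred on 0) into one of the four
 * gesture commands used by this project. The threshold and axis
 * orientation are hypothetical illustration values. */
typedef enum { STOP, FORWARD, BACKWARD, RIGHT, LEFT } Command;

#define TILT_THRESHOLD 100  /* assumed raw-reading deadband */

/* x: left/right tilt, y: forward/back tilt. */
Command classify_gesture(int x, int y) {
    if (y >  TILT_THRESHOLD) return FORWARD;   /* flexion    */
    if (y < -TILT_THRESHOLD) return BACKWARD;  /* extension  */
    if (x >  TILT_THRESHOLD) return RIGHT;     /* tilt right */
    if (x < -TILT_THRESHOLD) return LEFT;      /* tilt left  */
    return STOP;                               /* hand level */
}
```

On the actual hardware, the returned command would be presented as parallel data lines to the encoder IC for transmission (steps 2–3), rather than used directly.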
11. CONCLUSION
We accomplished our goal without obstacles, i.e., controlling a robot with gestures rather than a remote-controlled device. Our robot responds correctly whenever we move our hand. The four pre-defined hand motions that make the robot move in the desired directions are: flexion for forward motion, extension for backward motion, tilt right for a right turn, and tilt left for a left turn.
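The four direction commands have to be turned into motor-driver inputs at the receiver end. The sketch below is an assumption-laden illustration of a two-motor differential drive behind an L293D-style H-bridge, where each motor has two input lines and a turn is made by spinning the wheels in opposite directions; the struct fields and the pin-level truth table are hypothetical, not the project's documented wiring.

```c
/* Map a decoded direction command to H-bridge input levels for a
 * two-motor differential drive. Pin naming and polarity are
 * illustrative assumptions for an L293D-style driver. */
typedef enum { STOP, FORWARD, BACKWARD, RIGHT, LEFT } Command;

typedef struct {
    int left_in1, left_in2;    /* left motor H-bridge inputs  */
    int right_in1, right_in2;  /* right motor H-bridge inputs */
} MotorPins;

MotorPins drive(Command cmd) {
    switch (cmd) {
        case FORWARD:  return (MotorPins){1, 0, 1, 0};  /* both wheels forward  */
        case BACKWARD: return (MotorPins){0, 1, 0, 1};  /* both wheels backward */
        case RIGHT:    return (MotorPins){1, 0, 0, 1};  /* left fwd, right back */
        case LEFT:     return (MotorPins){0, 1, 1, 0};  /* left back, right fwd */
        default:       return (MotorPins){0, 0, 0, 0};  /* stop (coast)         */
    }
}
```

In the actual circuit this mapping is realised by the decoder IC's output lines feeding the motor driver directly, so no on-robot microcontroller is required for it.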