The document summarizes a project report for an autonomous robot named NA-YATR submitted to The Robotics Club. The report describes the components used including Arduino, motors, sensors, and software. It explains the working of the robot using A* pathfinding and ultrasonic sensors for obstacle avoidance. Experimental results showed the robot could efficiently detect obstacles and change paths to reach its destination. Future enhancements are discussed to improve functionality.
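The A* pathfinding mentioned above can be sketched as a grid search. This is an illustrative example only, not code from the NA-YATR report; the grid encoding (1 = obstacle), 4-connected movement, and the function name are all assumptions:

```cpp
#include <vector>
#include <queue>
#include <tuple>
#include <cstdlib>
#include <functional>

// Illustrative A* over a small occupancy grid (1 = obstacle).
// Returns the number of steps in the shortest 4-connected path from
// (sr,sc) to (gr,gc), or -1 if the goal is unreachable.
int aStarSteps(const std::vector<std::vector<int>>& grid,
               int sr, int sc, int gr, int gc) {
    int R = grid.size(), C = grid[0].size();
    // Manhattan distance: an admissible heuristic for 4-connected grids
    auto h = [&](int r, int c) { return std::abs(r - gr) + std::abs(c - gc); };
    std::vector<std::vector<int>> gCost(R, std::vector<int>(C, -1));
    using Node = std::tuple<int, int, int, int>; // f, g, row, col
    // min-heap ordered by f = g + h
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    open.push({h(sr, sc), 0, sr, sc});
    gCost[sr][sc] = 0;
    int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
    while (!open.empty()) {
        auto [f, g, r, c] = open.top();
        open.pop();
        if (r == gr && c == gc) return g; // first goal pop is optimal
        for (int k = 0; k < 4; ++k) {
            int nr = r + dr[k], nc = c + dc[k];
            if (nr < 0 || nr >= R || nc < 0 || nc >= C || grid[nr][nc] == 1)
                continue;
            if (gCost[nr][nc] == -1 || g + 1 < gCost[nr][nc]) {
                gCost[nr][nc] = g + 1;
                open.push({g + 1 + h(nr, nc), g + 1, nr, nc});
            }
        }
    }
    return -1; // no path around the obstacles
}
```

In a robot like the one described, the grid would come from a map of the environment, and detected obstacles would mark cells occupied before replanning.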
The document describes a humanoid bipedal robot project that was designed using principles of zero momentum point and passive dynamics. The robot uses various sensors such as object detection, motion, and touch sensors. An ATMEL 89S52 microcontroller is used to interface the motor drivers and sensors. The challenges of using a basic microcontroller board for interfacing multiple components are discussed. The project aims to be a starting point for further research in humanoid design.
IRJET- Design and Fabrication of Automated Wheel ChairIRJET Journal
This document describes the design and fabrication of an automated wheelchair. The wheelchair is designed to help disabled individuals gain mobility. It uses various sensors and a microcontroller to detect obstacles and navigate automatically. The wheelchair can be controlled via a joystick or gestures. It has an ultrasonic sensor to detect obstacles and a GPS module to track its location. When an object is detected, the wheelchair turns right, moves forward a short distance, and stops to avoid collisions. The LCD display provides information like location coordinates and movement directions to the user. The goal is to develop a smart wheelchair that can be safely operated with minimal user input.
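The fixed avoidance maneuver described above (turn right, move forward briefly, stop) can be sketched as a command sequence selected from the ultrasonic reading. This is a minimal sketch; the 30 cm threshold and the command names are assumptions, not values from the paper:

```cpp
#include <string>
#include <vector>

const int kObstacleCm = 30; // assumed detection threshold

// Returns the maneuver the controller would issue for a given
// ultrasonic distance reading, per the behavior described above.
std::vector<std::string> avoidanceCommands(int distanceCm) {
    if (distanceCm > kObstacleCm)
        return {"FORWARD"}; // path clear: keep going
    // object detected: turn right, creep forward, then stop
    return {"TURN_RIGHT", "FORWARD_SHORT", "STOP"};
}
```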
The document presents a second presentation on a wireless gesture controlled robot developed by students at Dr. Ambedkar Institute of Technology for Handicapped. The presentation covers an introduction to gesture controlled robots, different types of gesture recognition including glove-based and vision-based, the working principle using an accelerometer, block and circuit diagrams of the proposed system, applications, and a conclusion stating the system was successfully developed at low cost and power to control a robot through natural gestures.
This document provides details about a project to create a smart wheelchair that can be controlled through hand gestures. A robotic arm is attached to the wheelchair to allow objects to be manipulated. The wheelchair and arm will be controlled by an Arduino using image processing of hand gestures from a camera. Inverse kinematics algorithms will be used to position the robotic arm. The project aims to help people with disabilities move freely and interact with objects.
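The inverse kinematics step mentioned above has a well-known closed form for a planar 2-link arm; the sketch below shows that computation. It is illustrative only: the link lengths, the elbow-down solution choice, and the struct name are assumptions, not details from the project:

```cpp
#include <cmath>

struct JointAngles {
    double theta1, theta2; // shoulder and elbow angles, radians
    bool reachable;        // false if the target is out of the workspace
};

// Closed-form IK for a planar 2-link arm with link lengths l1, l2:
// law of cosines gives the elbow angle, then the shoulder angle
// follows from the target direction minus the elbow's offset.
JointAngles ik2Link(double x, double y, double l1, double l2) {
    double d2 = x * x + y * y;
    double c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2); // cos(theta2)
    if (c2 < -1.0 || c2 > 1.0) return {0.0, 0.0, false};    // out of reach
    double t2 = std::acos(c2); // elbow-down solution (assumed convention)
    double t1 = std::atan2(y, x)
              - std::atan2(l2 * std::sin(t2), l1 + l2 * c2);
    return {t1, t2, true};
}
```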
Shirsha Yaathra - Head Movement controlled Wheelchair - Research PaperChamila Wijayarathna
The document describes a project to develop a wheelchair mobility control system that is controlled by head movements for an army officer who has lost motor control below his neck. A tilt sensor is used to track head movements and send signals to an Arduino board which controls the wheelchair motors. Voice commands are also used for control. Ultrasonic sensors were added for obstacle detection and avoidance. The prototype was successfully tested and provides mobility for disabled individuals unable to use standard wheelchairs.
This document describes an android-based automated smart wheelchair that can be controlled via smartphone. Key points:
1) It uses a smartphone's built-in accelerometer sensors and Bluetooth technology to transmit control signals to a microcontroller connected to DC motors that power the wheelchair's wheels.
2) The microcontroller receives the Bluetooth signals and controls the wheelchair motion.
3) It allows for easier mobility for disabled users by automating the wheelchair and controlling it remotely with a smartphone.
This document discusses pick and place robots. It describes how a pick and place robot uses a stud mechanism with threads on both ends to grip and move objects. One end of the stud is connected to a DC motor and the other end is attached to a longitudinal beam and gripper. As the motor rotates, it causes the stud to rotate and loosen or tighten the thread, moving the beam and opening/closing the gripper. The document then provides background information on end-of-arm tooling, robot control systems, robot market trends, robot working processes, robot types and features, and robot applications.
This document describes a smart wheelchair that is operated through voice and Bluetooth commands. It also includes temperature and heartbeat sensors so that a doctor can continuously monitor the user.
IRJET- A Survey of Trolley/Wheelchair based Smart System for Exclusive Medica...IRJET Journal
This document summarizes a research paper on developing a smart trolley system for exclusive medical applications. The proposed system is designed to provide better service for patients, children, and elderly people by making them partially independent. The smart trolley consists of a motorized trolley controlled through an Android app or voice commands. Sensors and an Arduino controller are used to navigate the trolley and detect obstacles. The trolley can deliver medication, food, and other supplies to those in need on a predefined schedule and route, reducing their dependence on caregivers. The document reviews the state of research on smart trolleys and wheelchairs and discusses various technologies, such as machine learning, path following, localization, and navigational assistance, that could enable such systems.
This document presents a final year project on modelling and fabricating a 5 degree of freedom robotic arm. The objectives were to model the arm using CAD software, select materials and components, perform torque calculations, and analyze the workspace and kinematics. A group of students designed and built a prototype arm controlled by an Arduino Mega board without any external controls. Future recommendations include adding measurement devices and controlling the arm with MATLAB instead of Arduino. The project aimed to teach robotics concepts and could be useful for educational purposes.
IRJET- Design and Construction of Electric Drive -A Smart System for Disabled...IRJET Journal
This document summarizes the design and construction of an electric wheelchair with multiple control systems and therapy facilities. The wheelchair can be controlled through a joystick, voice commands using an Android app, or obstacle detection sensors. It also includes several therapy features like arm exercises, vibration pads, and heating/cooling elements. The goal is to provide various mobility and therapy options in a single affordable wheelchair to help disabled users live more independently.
This document describes the design and manufacturing of an autonomous cart capable of following a user. The cart uses a Microsoft Kinect sensor to recognize and track users through voice and gesture recognition. An Arduino microcontroller controls the cart's motors to follow the user while avoiding obstacles. Two prototypes were created, with the second using stronger aluminum wheels, an acrylic base, and improved Kinect mounting. The cart aims to autonomously follow a single identified user based on their commands, maintaining distance and navigating obstacles. The document evaluates the cart's stress resistance and discusses challenges in accurate human tracking and autonomous navigation.
This document discusses machine vision and its application in robotic arms. It begins with an introduction and overview of concepts related to machine vision and robotic arms. It then discusses the working of machine vision through image processing steps like grayscale conversion, edge detection, dilation and finding bounding boxes. It describes algorithms used for object recognition and controlling the robotic arm. Some advantages and applications of machine vision in robotic arms are presented, along with potential enhancements and references.
Implementation of pid control to reduce wobbling in a line following roboteSAT Journals
This document summarizes the implementation of a PID control system on a line following robot to reduce wobbling and improve tracking of the line. It describes the components of the robot including sensors, microcontroller, motors and power source. It discusses line following without PID which resulted in large deviations and inability to follow at high speeds. The document then provides details on how PID control was implemented, including definitions of target position, measured position, error, and the proportional, integral and derivative components. It explains how these factors were coded into the microcontroller to calculate motor speeds. The results showed much smoother line following with minimal wobbling even at high speeds compared to without PID control.
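The PID scheme described above (error from target and measured line position, with proportional, integral, and derivative terms feeding the motor speeds) can be sketched as follows. The gains, time step, and the differential-drive mapping are illustrative assumptions, not values from the paper:

```cpp
#include <cmath>

// Minimal PID controller matching the scheme described above.
struct PID {
    double kp, ki, kd;
    double integral = 0.0, prevError = 0.0;
    double update(double target, double measured, double dt) {
        double error = target - measured;
        integral += error * dt;                       // I term accumulates bias
        double derivative = (error - prevError) / dt; // D term damps wobble
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

struct MotorSpeeds { double left, right; };

// Differential drive: the correction shifts speed between the two
// wheels, steering the robot back toward the line.
MotorSpeeds applyCorrection(double base, double correction) {
    return {base + correction, base - correction};
}
```

With only the proportional term the robot oscillates around the line; the derivative term is what suppresses the wobbling the paper measures.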
MODELING (mechanical) AND ANALYSIS OF ROBO-ARM FOR PICK AND PLACE OPERATION I...ijsrd.com
A robo-arm is an assembly of joints, each able to sweep through 180 degrees, that moves an object in the required direction; such arms are commonly used in mechanical industry where pick and place operations are carried out. It consists of a pair of hinges located close together, oriented at up to 90° to each other and connected by a pin joint. This project is drawn from the ceramic industry, where the robo-arm performs pick and place activity very quickly. The author designs the mechanical structure of the robo-arm. The arm can work in places where humans cannot work continuously, for example the furnace division, where the working temperature is well above ambient. The robo-arm has its own end effector, with which it can pick objects easily and safely. Compared with a crane-operated loading system or a manual belt conveyor, the robo-arm saves time and cost because it can place each component at a specific location in the part storage area.
The system consists of a mobile app, an Arduino, a Bluetooth receiver module, an L293D motor driver IC, and related components. The robot's movement is controlled by voice commands captured through the phone's microphone.
Pick N Place robots are used to pick up objects and place them in desired locations. They consist of a rover body with joints, an end effector for gripping objects, actuators such as motors to move the robot, and sensors with a controller. In basic operation the wheels move the base to the object's location, the articulated body bends to reach it, and the end effector picks up and places the object. The robot can be controlled wirelessly via a keypad that sends commands to drive the motors in different directions. Pick N Place robots are used in manufacturing, in defense applications such as bomb disposal, and in medical operations for their accuracy and flexibility.
This document describes an automated gesture-based wireless wheelchair control system using an accelerometer. The system uses an accelerometer sensor to detect hand gestures which are converted to electrical signals and transmitted wirelessly. The receiver then converts the signals and uses them to control a wheelchair's movement and direction. The system was developed to help paralyzed people move independently using hand gestures to tilt the wheelchair forward, backward, left, right, or stop. It allows for movement over 200 yards and detects obstacles using an ultrasonic sensor.
This document describes an obstacle avoiding car project created by Utkarsh Bingewar, Shubham Thakur, and Rupesh Rote, with guidance from their assistant professor Mrs. Varsha Nanaware. The car uses an ultrasonic sensor and Arduino board to detect obstacles and navigate around them. When an obstacle is detected, the Arduino controls the motors to turn the car left or right to avoid the obstacle. The obstacle avoiding car has applications in areas like surveillance, hazardous environments, and unmanned vehicle navigation.
The document describes the design of the Rapido wheelchair. The wheelchair is controlled by head movements detected by sensors in a Bluetooth headset or by joystick. It uses ultrasonic sensors to detect obstacles and an Arduino microcontroller. A mobile app allows the user and attendant to communicate and share the user's location and sensor data. The app has features like emergency calling, voice commands converted to text, and locating nearby hospitals. The wheelchair is intended to help disabled individuals move independently without needing constant assistance.
Pic & Place - Thesis poster-template@AIUBNusrat Mary
The document describes the design and implementation of a pick and place robot controlled by a microcontroller with servo motors. The robot uses an ATmega16 microcontroller and servo motors to pick up and place objects within a certain range and angle. The robot was able to successfully pick up and place objects as programmed. Future work could include improving the gripper design to handle different sized objects and adding feedback control for more precise placement.
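Servo positioning of the kind this project relies on is driven by pulse width, conventionally 1000–2000 µs across 0–180 degrees. The linear mapping below is the standard hobby-servo convention; the exact pulse range of any particular servo is an assumption:

```cpp
// Maps a servo angle (degrees) to the command pulse width in
// microseconds, clamping to the 0-180 degree range. The 1000-2000 us
// span is the nominal hobby-servo convention, not a measured value.
int angleToPulseUs(double angleDeg) {
    if (angleDeg < 0.0) angleDeg = 0.0;
    if (angleDeg > 180.0) angleDeg = 180.0;
    return static_cast<int>(1000.0 + (angleDeg / 180.0) * 1000.0 + 0.5);
}
```

On an AVR microcontroller like the ATmega16, this pulse would typically be generated from a hardware timer at a 50 Hz repetition rate.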
This document summarizes key aspects of robotics and a line following robot project. It discusses that robotics involves designing and building intelligent mechanical agents to perform tasks autonomously or with guidance. It then describes a line following robot that uses infrared sensors to detect and follow a black line on a white surface without human control. The robot is able to correct itself to stay on the track and uses different motor speeds to enable turns. Microcontrollers like the ATmega8L are used as the processing system to generate outputs from sensor inputs.
This document describes the design and manufacturing of a driverless cart capable of following a user. The cart uses a Microsoft Kinect sensor and Arduino microcontroller to track and follow a human target while avoiding obstacles. Two prototypes were created, with the second prototype addressing issues like reduced vibrations and increased load capacity. A stress analysis showed the cart base can withstand over 31870 N/m^2 of stress. The goal is to create an autonomous companion robot that can accurately track and follow a user through voice and gesture commands.
The document discusses robotics competitions that are important for engineering students. It describes several major robotics competitions including Robotics Olympiad, Robocon, Robotryst, Full Throttle, and Robowars. These competitions involve designing and building robots to complete tasks like racing, fighting, and solving engineering challenges. They provide opportunities for students to apply their engineering skills and gain experience that can help their careers in robotics.
IRJET- Development of Robotic Arm using Arduino and Matlab R2016aIRJET Journal
This document describes the development of a robotic arm using Arduino and MATLAB R2016a. The robotic arm is controlled through a graphical user interface in MATLAB. Commands to move the arm left, right, up, down and to grip or release objects are sent from the GUI to an Arduino Uno board connected to the computer. A camera mounted on the arm allows the user to see the position of objects and guide movements. The arm uses three motors for accurate movement and an object detection system to facilitate pick and place tasks.
IRJET - Wheelchair Control using Eye-MotionIRJET Journal
This document describes a system for controlling a wheelchair using eye motion detection. A camera is used to track the user's eye movement and send control signals to the wheelchair based on whether the user is looking left, right, or forward. An ultrasonic sensor is also included for obstacle detection to stop the wheelchair if an obstruction is sensed.
Design and Fabrication of Obstacle Avoiding Robotic VehicleIRJET Journal
The document describes the design and fabrication of an obstacle avoiding robotic vehicle. Some key points:
- The robotic vehicle uses an Arduino microcontroller and ultrasonic sensors to detect obstacles in its path. It is able to maneuver autonomously in unknown environments without collisions.
- When an obstacle is detected, the microcontroller redirects the robot by controlling the motors to move in an alternate direction and avoid the obstacle.
- The low-cost components like the Arduino, ultrasonic sensors, motor driver and DC motors make the robot easily replicable. The robot is able to fulfill goals like autonomous obstacle detection and avoidance in real-time without external control.
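The ultrasonic sensing step above follows the standard HC-SR04 pattern: the echo pulse covers the round trip, so distance is time multiplied by the speed of sound and halved. This is a generic sketch of that conversion, not code from the paper; the threshold condition is an assumption:

```cpp
// Converts an HC-SR04 echo pulse (microseconds) to distance in cm.
// Sound travels ~0.0343 cm/us; the echo covers the round trip, so halve it.
// (Equivalently, the common rule of thumb: cm = us / 58.)
double echoToCm(double echoUs) {
    return echoUs * 0.0343 / 2.0;
}

// Assumed trigger condition: obstacle flagged when closer than threshold.
bool obstacleAhead(double echoUs, double thresholdCm) {
    return echoToCm(echoUs) < thresholdCm;
}
```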
ACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINOSnehasis Mondal
WORKING ARDUINO CODE:
/*
 * Gesture Recognition Robot
 * Coders – Raj, Rajib, Saity, Snehasis
 * This program lets you control your robot with gestures made by your hand.
 */
int GNDPin = A4;  // Set analog pin 4 as GND
int VccPin = A5;  // Set analog pin 5 as VCC
int xPin = A3;    // X axis input
int yPin = A2;    // Y axis input
int zPin = A1;    // Z axis input (not used)
int Q1 = 10, Q2 = 11, Q3 = 12, Q4 = 13; // Outputs to pins 10, 11, 12, 13 of the decoder IC
long x; // Variable for storing X coordinates
long y; // Variable for storing Y coordinates
long z; // Variable for storing Z coordinates

void setup() {
  Serial.begin(9600);
  pinMode(Q1, OUTPUT);
  pinMode(Q2, OUTPUT);
  pinMode(Q3, OUTPUT);
  pinMode(Q4, OUTPUT);
  pinMode(GNDPin, OUTPUT);
  pinMode(VccPin, OUTPUT);
  digitalWrite(GNDPin, LOW);  // Set A4 pin LOW
  digitalWrite(VccPin, HIGH); // Set A5 pin HIGH
}

void loop() {
  x = analogRead(xPin); // Read X coordinate
  y = analogRead(yPin); // Read Y coordinate
  z = analogRead(zPin); // Read Z coordinate (not used)
  if (x < 340)          // Change these values to adjust sensitivity
    forward();
  else if (x > 400)
    backward();
  else if (y > 400)
    right();
  else if (y < 340)
    left();
  else
    stop_();
}

void stop_() {
  Serial.println("");
  Serial.println("STOP");
  digitalWrite(Q1, LOW);
  digitalWrite(Q2, LOW);
  digitalWrite(Q3, LOW);
  digitalWrite(Q4, LOW);
}

void forward() {
  Serial.println("");
  Serial.println("Forward");
  digitalWrite(Q1, HIGH);
  digitalWrite(Q2, LOW);
  digitalWrite(Q3, HIGH);
  digitalWrite(Q4, LOW);
}

void backward() {
  Serial.println("");
  Serial.println("Backward");
  digitalWrite(Q1, LOW);
  digitalWrite(Q2, HIGH);
  digitalWrite(Q3, LOW);
  digitalWrite(Q4, HIGH);
}

void left() {
  Serial.println("");
  Serial.println("Left");
  digitalWrite(Q1, LOW);
  digitalWrite(Q2, HIGH);
  digitalWrite(Q3, HIGH);
  digitalWrite(Q4, LOW);
}

void right() {
  Serial.println("");
  Serial.println("Right");
  digitalWrite(Q1, HIGH);
  digitalWrite(Q2, LOW);
  digitalWrite(Q3, LOW);
  digitalWrite(Q4, HIGH);
}
IRJET- A Survey of Trolley/Wheelchair based Smart System for Exclusive Medica...IRJET Journal
This document summarizes a research paper on developing a smart trolley system for exclusive medical applications. The proposed system is designed to provide better service for patients, children, and elderly people by making them partially independent. The smart trolley consists of a motorized trolley controlled through an Android app or voice commands. Sensors and an Arduino controller are used to navigate the trolley and detect obstacles. The trolley can deliver medication, food, and other supplies to those in need on a predefined schedule and route, reducing their dependence on caregivers. The document reviews the state of research on smart trolleys and wheelchairs and discusses various technologies like machine learning, path following, localization, and navigational assistance that could enable
This document presents a final year project on modelling and fabricating a 5 degree of freedom robotic arm. The objectives were to model the arm using CAD software, select materials and components, perform torque calculations, and analyze the workspace and kinematics. A group of students designed and built a prototype arm controlled by an Arduino mega board without any external controls. Future recommendations include adding measurement devices and controlling the arm with Matlab instead of Arduino. The project aimed to teach robotics concepts and could be useful for educational purposes.
IRJET- Design and Construction of Electric Drive -A Smart System for Disabled...IRJET Journal
This document summarizes the design and construction of an electric wheelchair with multiple control systems and therapy facilities. The wheelchair can be controlled through a joystick, voice commands using an Android app, or obstacle detection sensors. It also includes several therapy features like arm exercises, vibration pads, and heating/cooling elements. The goal is to provide various mobility and therapy options in a single affordable wheelchair to help disabled users live more independently.
This document describes the design and manufacturing of an autonomous cart capable of following a user. The cart uses a Microsoft Kinect sensor to recognize and track users through voice and gesture recognition. An Arduino microcontroller controls the cart's motors to follow the user while avoiding obstacles. Two prototypes were created, with the second using stronger aluminum wheels, an acrylic base, and improved Kinect mounting. The cart aims to autonomously follow a single identified user based on their commands, maintaining distance and navigating obstacles. The document evaluates the cart's stress resistance and discusses challenges in accurate human tracking and autonomous navigation.
This document discusses machine vision and its application in robotic arms. It begins with an introduction and overview of concepts related to machine vision and robotic arms. It then discusses the working of machine vision through image processing steps like grayscale conversion, edge detection, dilation and finding bounding boxes. It describes algorithms used for object recognition and controlling the robotic arm. Some advantages and applications of machine vision in robotic arms are presented, along with potential enhancements and references.
Implementation of pid control to reduce wobbling in a line following roboteSAT Journals
This document summarizes the implementation of a PID control system on a line following robot to reduce wobbling and improve tracking of the line. It describes the components of the robot including sensors, microcontroller, motors and power source. It discusses line following without PID which resulted in large deviations and inability to follow at high speeds. The document then provides details on how PID control was implemented, including definitions of target position, measured position, error, and the proportional, integral and derivative components. It explains how these factors were coded into the microcontroller to calculate motor speeds. The results showed much smoother line following with minimal wobbling even at high speeds compared to without PID control.
MODELING (mechanical) AND ANALYSIS OF ROBO-ARM FOR PICK AND PLACE OPERATION I...ijsrd.com
Robo- arm is assembly of number of joints which can work in 180 degree direction that allows the object to 'move' in its require direction, and is commonly used in mechanical industry where pick and place operation are carried out .It consists of a pair of hinges located close together, oriented at maximum 90° to each other, connected by a pin joint .Now, this project is based from ceramic industry in which the robo-arm perform its operation for pick and place activity very quickly. Here, I design the mechanical structure of robo-arm. Robo-arm can work at which places where, human can't work continuously in ceramic industry. For example at Furnace division .Robo-arm has its own end effectors. with the help of it, rob-arm can pick the object easily and safely. Basic design concept is taken from ceramic industry at the furnace division where, the working temperature is more than ambient temperature .With the help robo -arm we can save the time and cost, as compare to crane operated loading system and manual belt conveyor system, because robo-arm can place the component at particular place of the part storage area.
It consisting Mobile app, Arduino, Bluetooth Receiver module, L293D ic etc. The movement of robot is controlled by the voice which catch by the microphone inside the mobile.
Pick N Place robots are used to pick up objects and place them in desired locations. They consist of a rover body with joints, an end effector for gripping objects, actuators like motors to move the robot, and sensors and a controller. The basic operation involves the wheels moving the base to the object's location, the rigid body bending to reach it, and the end effector picking up and placing the object. The robot can be controlled wirelessly via a keypad that sends commands to move motors in different directions. Pick N Place robots are used in manufacturing, defense applications like bomb diffusion, and medical operations for their accuracy and flexibility.
This document describes an automated gesture-based wireless wheelchair control system using an accelerometer. The system uses an accelerometer sensor to detect hand gestures which are converted to electrical signals and transmitted wirelessly. The receiver then converts the signals and uses them to control a wheelchair's movement and direction. The system was developed to help paralyzed people move independently using hand gestures to tilt the wheelchair forward, backward, left, right, or stop. It allows for movement over 200 yards and detects obstacles using an ultrasonic sensor.
This document describes an obstacle avoiding car project created by Utkarsh Bingewar, Shubham Thakur, and Rupesh Rote, with guidance from their assistant professor Mrs. Varsha Nanaware. The car uses an ultrasonic sensor and Arduino board to detect obstacles and navigate around them. When an obstacle is detected, the Arduino controls the motors to turn the car left or right to avoid the obstacle. The obstacle avoiding car has applications in areas like surveillance, hazardous environments, and unmanned vehicle navigation.
The document describes the design of the Rapido wheelchair. The wheelchair is controlled by head movements detected by sensors in a Bluetooth headset or by joystick. It uses ultrasonic sensors to detect obstacles and an Arduino microcontroller. A mobile app allows the user and attendant to communicate and share the user's location and sensor data. The app has features like emergency calling, voice commands converted to text, and locating nearby hospitals. The wheelchair is intended to help disabled individuals move independently without needing constant assistance.
Pic & Place - Thesis poster-template@AIUBNusrat Mary
The document describes the design and implementation of a pick and place robot controlled by a microcontroller with servo motors. The robot uses an ATmega16 microcontroller and servo motors to pick up and place objects within a certain range and angle. The robot was able to successfully pick up and place objects as programmed. Future work could include improving the gripper design to handle different sized objects and adding feedback control for more precise placement.
This document summarizes key aspects of robotics and a line following robot project. It discusses that robotics involves designing and building intelligent mechanical agents to perform tasks autonomously or with guidance. It then describes a line following robot that uses infrared sensors to detect and follow a black line on a white surface without human control. The robot is able to correct itself to stay on the track and uses different motor speeds to enable turns. Microcontrollers like the ATmega8L are used as the processing system to generate outputs from sensor inputs.
This document describes the design and manufacturing of a driverless cart capable of following a user. The cart uses a Microsoft Kinect sensor and Arduino microcontroller to track and follow a human target while avoiding obstacles. Two prototypes were created, with the second prototype addressing issues like reduced vibrations and increased load capacity. A stress analysis showed the cart base can withstand over 31870 N/m^2 of stress. The goal is to create an autonomous companion robot that can accurately track and follow a user through voice and gesture commands.
The document discusses robotics competitions that are important for engineering students. It describes several major robotics competitions including Robotics Olympiad, Robocon, Robotryst, Full Throttle, and Robowars. These competitions involve designing and building robots to complete tasks like racing, fighting, and solving engineering challenges. They provide opportunities for students to apply their engineering skills and gain experience that can help their careers in robotics.
IRJET- Development of Robotic Arm using Arduino and Matlab R2016a – IRJET Journal
This document describes the development of a robotic arm using Arduino and MATLAB R2016a. The robotic arm is controlled through a graphical user interface in MATLAB. Commands to move the arm left, right, up, down and to grip or release objects are sent from the GUI to an Arduino Uno board connected to the computer. A camera mounted on the arm allows the user to see the position of objects and guide movements. The arm uses three motors for accurate movement and an object detection system to facilitate pick and place tasks.
IRJET - Wheelchair Control using Eye-Motion – IRJET Journal
This document describes a system for controlling a wheelchair using eye motion detection. A camera is used to track the user's eye movement and send control signals to the wheelchair based on whether the user is looking left, right, or forward. An ultrasonic sensor is also included for obstacle detection to stop the wheelchair if an obstruction is sensed.
Design and Fabrication of Obstacle Avoiding Robotic Vehicle – IRJET Journal
The document describes the design and fabrication of an obstacle avoiding robotic vehicle. Some key points:
- The robotic vehicle uses an Arduino microcontroller and ultrasonic sensors to detect obstacles in its path. It is able to maneuver autonomously in unknown environments without collisions.
- When an obstacle is detected, the microcontroller redirects the robot by controlling the motors to move in an alternate direction and avoid the obstacle.
- The low-cost components like the Arduino, ultrasonic sensors, motor driver and DC motors make the robot easily replicable. The robot is able to fulfill goals like autonomous obstacle detection and avoidance in real-time without external control.
ACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINO – Snehasis Mondal
WORKING ARDUINO CODE:
/*
 * Gesture Recognition Robot
 * Coders – Raj, Rajib, Saity, Snehasis
 * This program lets you control your robot with gestures made by your hand.
 */
int GNDPin = A4;    // Set analog pin 4 as GND
int VccPin = A5;    // Set analog pin 5 as VCC
int xPin = A3;      // X-axis input
int yPin = A2;      // Y-axis input
int zPin = A1;      // Z-axis input (not used)
int Q1 = 10, Q2 = 11, Q3 = 12, Q4 = 13;  // Outputs to pins 10, 11, 12, 13 of the decoder IC
long x;             // Variable for storing X readings
long y;             // Variable for storing Y readings
long z;             // Variable for storing Z readings

void setup() {
  Serial.begin(9600);
  pinMode(Q1, OUTPUT);
  pinMode(Q2, OUTPUT);
  pinMode(Q3, OUTPUT);
  pinMode(Q4, OUTPUT);
  pinMode(GNDPin, OUTPUT);
  pinMode(VccPin, OUTPUT);
  digitalWrite(GNDPin, LOW);   // Set pin A4 LOW
  digitalWrite(VccPin, HIGH);  // Set pin A5 HIGH
}

void loop() {
  x = analogRead(xPin);  // Read X coordinate
  y = analogRead(yPin);  // Read Y coordinate
  z = analogRead(zPin);  // Read Z coordinate (not used)
  if (x < 340)           // Change the thresholds to adjust sensitivity
    forward();
  else if (x > 400)
    backward();
  else if (y > 400)
    right();
  else if (y < 340)
    left();
  else
    stop_();
}

void stop_() {
  Serial.println("");
  Serial.println("STOP");
  digitalWrite(Q1, LOW);
  digitalWrite(Q2, LOW);
  digitalWrite(Q3, LOW);
  digitalWrite(Q4, LOW);
}

void forward() {
  Serial.println("");
  Serial.println("Forward");
  digitalWrite(Q1, HIGH);
  digitalWrite(Q2, LOW);
  digitalWrite(Q3, HIGH);
  digitalWrite(Q4, LOW);
}

void backward() {
  Serial.println("");
  Serial.println("Backward");
  digitalWrite(Q1, LOW);
  digitalWrite(Q2, HIGH);
  digitalWrite(Q3, LOW);
  digitalWrite(Q4, HIGH);
}

void left() {
  Serial.println("");
  Serial.println("Left");
  digitalWrite(Q1, LOW);
  digitalWrite(Q2, HIGH);
  digitalWrite(Q3, HIGH);
  digitalWrite(Q4, LOW);
}

void right() {
  Serial.println("");
  Serial.println("Right");
  digitalWrite(Q1, HIGH);
  digitalWrite(Q2, LOW);
  digitalWrite(Q3, LOW);
  digitalWrite(Q4, HIGH);
}
This document summarizes a research paper on a shortest path follower robot. It describes the design of a line following robot that can detect the shortest path using IR sensors to follow a black line on a white surface. The robot uses an Arduino microcontroller connected to IR sensors and motors to determine the optimal path between a starting point and destination. It aims to solve the single source shortest path problem by identifying obstacles and navigating efficiently. The system architecture includes IR sensors to detect the line, motors to move the robot, and an Arduino board to process sensor readings and control the motors to follow the shortest route.
DESIGN & DEVELOPMENT OF UNMANNED GROUND VEHICLE – IRJET Journal
This document describes the design and development of an unmanned ground vehicle (UGV). The UGV was designed to perform tasks in dangerous situations as a replacement for humans. It uses sensors like a camera and metal detection sensor to monitor its surroundings. An Arduino microcontroller controls DC motors and servo motors via an L293D motor driver to move the vehicle and camera. The UGV is designed to detect metals using the sensor and send location coordinates via GSM if anything is found. The document outlines the objectives, methodology, components used, and working principle of the UGV. It was developed over phases that included research, design using CATIA software, component selection, manufacturing, and coding to create a working prototype.
IRJET- Automatic Detection of Crack Fault in Railway Track – IRJET Journal
The document describes a proposed system to automatically detect cracks in railway tracks using sensors to improve safety. The system uses IR sensors mounted on a robot to detect cracks, a GPS module to determine the location of any cracks found, and Bluetooth to send the crack location data to a mobile phone or control room. When a crack is detected, the IR sensors send a signal to a NodeMCU microcontroller which triggers an alarm and sends the crack coordinates via Bluetooth. The proposed low-cost system aims to detect cracks faster and more reliably than manual inspection methods to help prevent railway accidents.
Maneuvering Robotic Vehicle based on Motion Sensor Feedback – IshanMalpotra
This is the report of my final-year project, which won the "Best Project Award" at the national level in the "13th ISTE Tamilnadu & Puducherry Section Annual Convention for Engineering Students - 2014", held on 23rd - 24th January 2014.
Development Of Industrial Automatic Multi Colour Sorting and Counting Machine... – theijes
Sorting products in industry is a difficult task, and continuous manual sorting creates issues. It is very desirable to create a machine that identifies objects and relocates them if they meet certain criteria. This paper presents a solution for sorting coloured objects with the help of a robotic arm. Objects placed on the conveyor belt are sorted based on colour sensing and relocated to a specific location. When an object moves from one location to another on the conveyor belt, the sensors give input to the microcontroller, which then commands the robotic arm to do the task. A TCS3200 colour sensor is used to detect the colour of the object, DC motors move the conveyor belt, gripper and lifter, an Arduino Nano microcontroller issues the commands, an L293D motor driver drives the motors, and an LCD display makes the system user friendly.
Design and Development of Device Used for Detection of Cracks on Railway Tracks – IRJET Journal
This document describes the design and development of a device to detect cracks in railway tracks. Sensors such as IR sensors, ultrasonic sensors, PIR sensors, and GPS and GSM modules are used: the IR sensor detects cracks in the tracks, the ultrasonic sensor detects obstacles, and the PIR sensor determines whether obstacles are moving or stationary. When a crack is detected, the GPS module identifies the location, which is sent via the GSM module as an SMS to the nearest station master. The device is built on a modular aluminium chassis and runs on batteries to autonomously inspect railway tracks for cracks and obstacles. Testing showed it can run for 3 hours and cover 32 km while achieving the target speed of 3 m/s. This automatic crack detection is intended as a faster, more reliable alternative to manual inspection.
IRJET - A Real-Time Pothole Detection Approach for a Safety Transportation Sy... – IRJET Journal
This document proposes a real-time pothole detection system to improve transportation safety. The system uses an accelerometer sensor to detect potholes by measuring deviations in road surface acceleration. An onboard GPS module provides the location of detected potholes. This location data, along with light and noise readings from an LDR and microphone, is uploaded to the cloud and plotted on Google Maps. The goal is to reduce accidents caused by unexpected potholes by making drivers aware of their locations.
This project involves designing and developing a wireless, microcontroller-based humanoid arm with a five-fingered hand. Solidworks was used to design the mechanical components of the hand to resemble a human hand. Analysis was performed to ensure design specifications such as weight and cost were met. The hand will be 3D printed based on CAM codes generated in NX-CAM. An Arduino microcontroller board will enable wireless control of servo motors in the hand using a sensor-equipped control glove. The goal is to create a low-cost, biologically inspired robotic hand for applications such as bomb disposal, prosthetics and surgery.
The document describes the design of an autonomous navigation robot that can avoid obstacles. An ATmega328P microcontroller is used to process signals from ultrasonic sensors and direct the robot's movement. When an obstacle is detected, the microcontroller determines the distance and redirects the robot by turning or reversing direction to avoid collisions. The robot's movement is controlled by the microcontroller sending signals to motors through a motor driver. The goal is for the robot to intelligently navigate unknown environments without needing remote control by detecting obstacles with sensors and maneuvering autonomously.
This document describes a graduate project submitted by Zainab Falaih Hasan Ulla Ahmed Ouda for the degree of Bachelor of Automated Manufacturing Engineering. The project involves designing and building a prototype of a black line tracking robot. The robot uses sensors and a microcontroller to follow a black line on a white surface and maneuver turns. It is intended to function autonomously within an automated factory environment. The document provides background on the project, acknowledges those involved in advising and supporting the work, and outlines the various chapters that will comprise the project report, including the robot design, hardware components, implementation details, results, and proposals for future work.
IRJET-Smart Farm Application: A Modern Farming Technique using Android Applic... – IRJET Journal
This document describes the design of an automated solar-powered lawn mower called the Automated Mower Robo. It uses an Arduino board and ultrasonic sensors to cut grass automatically while avoiding obstacles. The mower is powered by a 12V battery that charges from an attached solar panel. It was tested on different grass types and was able to cut the grass shorter than the expected height, showing 90% efficiency. The mower provides an environmentally friendly alternative to gas-powered mowers by using solar energy for an automated grass cutting system.
Bluetooth Controlled Garbage Collection Robot Arm – IRJET Journal
We've designed a semi-autonomous robotic arm that can collect scrap materials. The robotic arm has 5 degrees of freedom and is controlled via an Arduino board and Bluetooth module. It uses servo motors and stepper motors to manipulate the arm and a claw to pick up scrap. The arm was 3D printed using PLA plastic. An Android app was created to send commands to the Arduino via Bluetooth to control the arm's movement and allow remote operation. The goal was to develop an affordable robotic system to assist with scrap collection in a safe and efficient manner.
This document describes a bus tracking system that uses GPS and GSM modules to track the location of buses in real-time and provide that information to users. The system hardware installed on buses uses a GPS module to detect location and a GSM module to send the location data to the cloud. An Android mobile app then allows users to view buses' current locations on a map. The system aims to address issues with public transportation systems like not knowing arrival times or available seats. It provides real-time bus tracking to improve the user experience.
Design and implementation of an SMS based robotic system for hole-detection ... – eSAT Journals
Abstract: This paper presents the design and implementation of an SMS-based robotic system for hole detection in surface pipes using GPS/GPRS/GSM technology. In industry today, surface pipelines are used to transport fluids; some of these pipelines have joints which are welded together. Where the welding is not perfectly done, fluid may leak, leading to product loss and downtime. To address this issue of pipeline leakage, an SMS-based robotic system is designed to inspect the pipelines for holes: if light is sensed inside a pipeline through the sensing devices on the robot, a Short Message Service (SMS) message indicating the location where the light was sensed is sent to the operator's phone with the help of the SIM900 GPS/GPRS/GSM modem installed on the robot hardware. The robot is built around an Arduino Uno microcontroller with several sensing devices installed on it, and the system is programmed in embedded C. After implementation, the robot system was tested; it could detect holes in surface pipelines and send a short message to the personnel's mobile phone. Keywords: Arduino Uno Board, GPS/GPRS/GSM technology, Robotics, SIM900 Modem, Sensors
Autonomous Campus Tour Guide Robot by using Ultrasonic Range Sensors and QR c... – ShwetonKedia
This project relies on QR (Quick Response) codes to provide location references for a mobile robot. The mobile robot is equipped with a smartphone programmed to detect and read the information on QR codes that are deliberately placed in the robot's working environment. Using real-time QR code recognition, the robot can autonomously run through the guide route. The lab information on each QR code is played to visitors using Text-to-Speech on the Android device. Ultrasonic range sensors, which can detect objects and measure distances with high accuracy, are used to implement the wall-following and obstacle-avoidance behaviours. The sonar data collected by the ultrasonic range sensors is processed by a microcontroller that autonomously controls the tour guide robot. An algorithm based on Proportional-Integral-Derivative (PID) control is applied to the tour guide robot for more precise motion control. Bluetooth is used to send signals from the smartphone to the Arduino to operate the tour guide robot remotely.
NA-YATR Report
1. Project Report on
NA - YATR
Submission to the THE ROBOTICS CLUB as a part of INDUCTION'19
TEAM 9
THE ROBOTICS CLUB-SNIST
SREENIDHI INSTITUTE OF SCIENCE AND TECHNOLOGY
(AUTONOMOUS)
(Affiliated to JNT University, Hyderabad)
Yamnampet, Ghatkesar, Hyderabad – 501 301.
2019
2. CERTIFICATE
This is the project work titled ‘NA-YATR’ by ‘K. Anirudh’, ‘ESK Karthik’, ‘Harihara
Viswakarma’, ‘Raga Mohana’, ‘Shaik Shoiab’, ‘Akash Reddy’, ‘Sindhu’ and ‘Varun’, under the
mentorship of ‘Datta Lohith’ and ‘Abhishek’, for recruitment into THE ROBOTICS
CLUB-SNIST, and is a record of the project work carried out by them during the year
2018-2019 as part of INDUCTION under the guidance and supervision of
T.V. HARI KRISHNA, Technical Head
MUHAMMAD TABISH, Technical Head
Ms. V. Manasa
The President of THE ROBOTICS CLUB
Dr. A. Purushotham
Technical Advisor
Mechanical Department
3. NA-YATR Page - 3 -
DECLARATION
The project work reported in the present thesis, titled “NA-YATR”, is a record of work
done by Team “9” in THE ROBOTICS CLUB as a part of INDUCTION-19.
No part of the thesis is copied from books, journals or the Internet, and wherever a
portion is taken, it has been duly referenced in the text. The report is based on
project work done entirely by TEAM “9” and is not copied from any other source.
4. NA-YATR Page - 4 -
ACKNOWLEDGMENT
This project report is the outcome of the efforts of many people who have driven our passion
to explore the implementation of NA-YATR. We have received great guidance,
encouragement and support from them, and have learned a lot because of their willingness to
share their knowledge and experience.
Primarily, we would like to express our gratitude to our mentors, ‘Datta Lohith’ and
‘Abhishek’. Their guidance has been of immense help in surmounting various hurdles along
the path of our goal.
We thank our Technical Heads T.V. Hari Krishna and M. Tabish for being with us till the
end of the project completion.
We thank the Executive Body and the Technical Advisory Board of The Robotics Club
for helping us with crucial parts of the project. We are deeply indebted to Ms. V. Manasa,
the President, Mr. R. Sai Sandeep, the Vice-President, and Mr. Aditya Gupta, the General
Secretary of THE ROBOTICS CLUB, and also to every other person who spared
their valuable time without any hesitation whenever we needed it.
We also thank our Technical advisor Dr. A. Purushotham, Professor, Mechanical
Department, who encouraged us during this project by rendering his help when needed.
5. NA-YATR Page - 5 -
Index
Contents 6
List of Figures 7
Abstract 8
Source Codes 22-32
Experimental Results 33
Conclusion 34
Appendix 35
Record of expenses 36
6. NA-YATR Page - 6 -
Contents
Chapter 1 Introduction 9-10
1.1 Problem Statement 9
1.2 Introduction of the project 9
1.3 Literature survey 9
1.4 Organization of project 10
Chapter 2 Architecture 11-17
2.1 Components used 11
2.1.1 Hardware 11
2.1.2 Software 11
2.2 Components description 11-17
2.2.1 Hardware 11-15
2.2.2 Software 16-17
Chapter 3 Implementation and working 18-20
3.1 Block diagram and Circuit Diagram 18
3.2 Working 19
3.3 Algorithm 19-20
Chapter 4 Experimental Results and Conclusions 21
4.1 Results 21
4.2 Future enhancements 21
4.3 Conclusions . 21
7. NA-YATR Page - 7 -
List of Figures
Fig. No. Description Page number
1 SIDE SHAFT MOTORS 12
2 ARDUINO MEGA Microcontroller 12
3 Neo-6m GPS 13
4 HMC5883L DIGITAL COMPASS 13
5 HC-SR04 ULTRASONIC SENSOR 14
6 L298n MOTOR DRIVERS 14
7 SERVO MOTOR 15
8 WHEELS 15
9 Xbee s2c 16
10 GUI using MATLAB 16
11 ARDUINO IDE 17
12 Block Diagram 18
13 Circuit Diagram 18
8. NA-YATR Page - 8 -
Abstract
Moving from one place to another is very important to us, but it is not necessary
for a human to take us there. The work of transporting us, or
some objects, can be done without the intervention of humans; we have the
technology and resources to make that possible. The main objective of this robot is
to learn the coordinates of its target location and to go there while avoiding
any obstacles that come in between. We make use of a GPS module, a compass
and an ultrasonic sensor to make this aspect of the robot possible. A robot like this
will find many uses in the world, from small robots in home or office spaces to self-driving
cars. This project makes use of the A* algorithm to find the shortest possible
path; in this procedure, buildings are detected by their roof colour and the
search area is divided into a grid using the approximate cell decomposition
method.
The selected path is sent to the autonomous robot using Xbee S2C as a series of set
points for real implementation outdoors.
9. NA-YATR Page - 9 -
CHAPTER 1
INTRODUCTION
1.1 PROBLEM STATEMENT: To make an autonomous, self-operating bot. The user
marks a location (longitude and latitude), which is sent to the bot via serial
communication; the bot then enters a loop, moving around and avoiding any obstacle in its
path, until it reaches the final point.
1.2 INTRODUCTION TO PROJECT: The present technology used in this bot is
ultrasonic sensing, in which high-frequency sound waves are emitted and the reflected wave
is measured to calculate distance, together with GPS, which detects
the position of the vehicle from satellite signals. A servo motor is used
additionally to rotate a single ultrasonic sensor when obstacles are detected, reducing the
number of sensors needed and thus the cost of the bot. The robot can be improved in
multiple ways in the future; for example, it could be made voice-automated, reaching a
location just by hearing its name.
1.3 LITERATURE SURVEY: The source of the idea is the autonomous, or driverless,
car. Autonomous driving is expected to be the next big innovation and to have a massive
impact on society: it should reduce accidents (human error is responsible for almost 90% of
accidents), increase travel-time reliability and reduce congestion. People with mobility
restrictions, such as those who cannot drive, elderly people without a licence, or the
physically challenged, could use this type of driverless car. Compared to today's technology
this project is just a small representation of what can be done; Google is already testing
self-driving cars, and the final goal is to reach that level of finesse.
10. NA-YATR Page - 10 -
1.4 ORGANISATION OF THE PROJECT:
1. Chapter 1 describes the aim and gives an introduction to the project.
2. Chapter 2 gives a detailed description of the components used in the project.
3. Chapter 3 describes the implementation of the project with the block diagram and circuit
description.
4. Chapter 4 presents the results and discusses in detail what has been done.
11. NA-YATR Page - 11 -
CHAPTER 2
ARCHITECTURE
2.1 COMPONENTS USED
2.1.1 Hardware
• SIDE SHAFT MOTORS
• ARDUINO MEGA Microcontroller
• Neo-6m GPS
• HMC5883L DIGITAL COMPASS
• HC-SR04 ULTRASONIC SENSOR
• L298n MOTOR DRIVERS
• SERVO MOTOR
• WHEELS
• Xbee S2C
2.1.2 Software
• GUI using MATLAB
• ARDUINO IDE
2.2 COMPONENTS DESCRIPTION
2.2.1 Hardware
DC MOTORS:
• A DC motor is any of a class of rotary electrical machines that converts direct current
electrical energy into mechanical energy. The most common types rely on the forces
produced by magnetic fields.
Common types of motors:
• Centre shaft Motors
• Side shaft Motors.
• Side shaft motors are generally used for higher torque, and centre shaft motors for
higher speed.
12. NA-YATR Page - 12 -
• SIDE SHAFT MOTORS:
Side shaft motors are generally used for high torque. Because the shaft is offset from the
centre of the motor body, the design delivers the required torque and good efficiency.
Fig 1
• ARDUINO MEGA Microcontroller:
The Arduino Mega, made by the company Arduino, is a complete printed circuit
board consisting of a microcontroller, analog pins, digital pins, power-supply and ground
pins, pulse-width-modulation (PWM) pins, and so on. Code is uploaded to the Arduino,
which in turn handles the interaction between the hardware and software of the bot.
Fig 2
13. NA-YATR Page - 13 -
• Neo-6m GPS:
GPS, the Global Positioning System, is used to detect the location of an object
through satellite navigation. The receiver periodically obtains signals from at least
three nearby satellites; together these pinpoint the exact location of the moving
object with respect to time and speed. In this way the bot's position is updated
from time to time using the GPS module.
Fig 3
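From two GPS fixes, the firmware later computes the remaining distance to the target with the haversine formula (see gpshead() in the source code). The same math in a standalone C++ function, with arbitrary example coordinates:

```cpp
#include <cmath>

// Great-circle distance in metres between two lat/lon points given in
// degrees, via the haversine formula (the same math as gpshead()).
double haversineMeters(double lat1, double lon1, double lat2, double lon2) {
    const double kPi = 3.14159265358979323846;
    const double R = 6371000.0;     // mean Earth radius in metres
    const double toRad = kPi / 180.0;
    lat1 *= toRad; lon1 *= toRad; lat2 *= toRad; lon2 *= toRad;
    double dlat = lat2 - lat1, dlon = lon2 - lon1;
    double a = sin(dlat / 2) * sin(dlat / 2)
             + cos(lat1) * cos(lat2) * sin(dlon / 2) * sin(dlon / 2);
    return R * 2.0 * atan2(sqrt(a), sqrt(1.0 - a));
}
```

One degree of latitude comes out to roughly 111 km, a quick sanity check for the formula.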
• HMC5883L DIGITAL COMPASS:
The HMC5883L is a 3-axis digital compass (magnetometer) used for two general purposes:
to measure the magnetization of a magnetic material such as a ferromagnet, or to measure
the strength and, in some cases, the direction of the magnetic field at a point in space. A
compass of this type is essential for determining the heading and keeping the bot on course.
Fig 4
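The raw X/Y field readings from such a magnetometer are converted to a heading with atan2, the same conversion the firmware's loop() performs. A minimal standalone sketch of that calculation:

```cpp
#include <cmath>

// Convert raw X/Y magnetometer readings into a compass heading in degrees
// (0..360), mirroring the atan2 conversion in the robot's loop().
double headingDegrees(double mx, double my) {
    const double kPi = 3.14159265358979323846;
    double h = atan2(my, mx);       // radians in (-pi, pi]
    if (h < 0) h += 2.0 * kPi;      // wrap negatives into [0, 2*pi)
    return h * 180.0 / kPi;         // radians -> degrees
}
```

For example, a field purely along +Y yields 90 degrees, and purely along -X yields 180 degrees.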
14. NA-YATR Page - 14 -
• HC-SR04 ULTRASONIC SENSOR:
The HC-SR04 Ultrasonic (US) sensor is a 4-pin module whose pins are Vcc, Trigger,
Echo and Ground. It is a very popular sensor used in many applications where
measuring distance or sensing objects is required. The module has two eye-like
projections at the front which form the ultrasonic transmitter and receiver.
The transmitter emits an ultrasonic wave; this wave travels through the air and, when
it strikes any material, is reflected back towards the sensor, where the reflected
wave is observed by the ultrasonic receiver.
Fig 5a and 5b
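The distance follows from the echo's round-trip time: sound covers roughly 0.034 cm per microsecond, and the pulse travels out and back, so the firmware computes dist = 0.034 × duration / 2. As a standalone function:

```cpp
// The HC-SR04 reports distance via the round-trip time of its echo pulse.
// Sound travels about 0.034 cm per microsecond, and the pulse covers the
// distance twice (out and back), hence the division by 2.
long echoToCm(long durationMicros) {
    return (long)(0.034 * durationMicros / 2.0);
}
```

A 3000 µs echo therefore corresponds to about 51 cm, comfortably past the 50 cm "danger" threshold the firmware uses.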
• L298N MOTOR DRIVERS:
The L298N is a dual H-bridge motor controller module (2 A per channel) commonly used
with Arduino. It allows you to control the speed and direction of two DC motors, or one
bipolar stepper motor, with ease. The module can be used with motors that have a supply
voltage between 5 and 35 V DC. There is also an onboard 5 V regulator, so if your supply
voltage is up to 12 V you can also source 5 V from the board.
Fig 6
15. NA-YATR Page - 15 -
• SERVO MOTOR:
A servomotor is a rotary or linear actuator that allows precise control of angular
or linear position, velocity and acceleration. It is a closed-loop mechanism that uses
position feedback to control its motion and final position; the input to its controller is a
signal representing the commanded position of the output shaft.
Fig 7
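That command signal is a pulse width: on a typical hobby servo, roughly 1000 µs requests 0 degrees and 2000 µs requests 180 degrees (the exact endpoints vary by unit; Arduino's Servo library defaults to a wider 544 to 2400 µs range). A linear mapping with those assumed endpoints might look like:

```cpp
// Map a servo angle (0..180 degrees) to a command pulse width in
// microseconds. The 1000..2000 us endpoints are typical but assumed here;
// real servos differ, and Arduino's Servo library defaults to 544..2400 us.
int angleToMicros(int angleDeg, int minUs = 1000, int maxUs = 2000) {
    if (angleDeg < 0) angleDeg = 0;      // clamp out-of-range requests
    if (angleDeg > 180) angleDeg = 180;
    return minUs + (long)(maxUs - minUs) * angleDeg / 180;
}
```

In the firmware this mapping is hidden behind uservo.write(angle), which the sketch calls with 0, 90 and 180 to aim the ultrasonic sensor.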
• WHEELS:
Wheels provide the robot's locomotion. They are the mechanical parts, attached to the
chassis, that are essential for transporting the bot across the field.
Fig 8
• Xbee S2C:
This Xbee module is an RF module used for wireless communication. Its transmission
frequency is between 2.4 GHz and 2.5 GHz, and its range is about 200 ft indoors and up to 4000 ft
16. NA-YATR Page - 16 -
outdoors. Its supply voltage is 2.1 V to 3.6 V. It is useful for home automation and
medium-range wireless communication.
Fig 9
2.2.2 Software
GUI using MATLAB:
GUIs (graphical user interfaces) provide point-and-click control of software applications,
eliminating the need to learn a language or type commands in order to run the application.
MATLAB apps are self-contained MATLAB programs with GUI front ends that automate a task
or calculation. The GUI typically contains controls such as menus, toolbars, buttons, and sliders.
Many MATLAB products, such as Curve Fitting Toolbox, Signal Processing Toolbox, and
Control System Toolbox include apps with custom user interfaces. You can also create your own
custom apps, including their corresponding UIs, for others to use.
Fig 10
17. NA-YATR Page - 17 -
• ARDUINO IDE:
The Arduino integrated development environment (IDE) is a cross-platform application
for Windows, macOS and Linux, written in Java. It is used to
write and upload programs to an Arduino board. The Arduino IDE supports the
languages C and C++ using special rules of code structuring.
Fig 11
19. NA-YATR Page - 19 -
3.2 WORKING
A-star method:
The A-star (A*) method is a computer algorithm used for pathfinding. It divides the
plane into multiple cells and finds the shortest path through them while avoiding obstacles.
Because it uses heuristics to guide the search, A* is more efficient than many other methods
while retaining accuracy.
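As an illustration of the idea (not the exact planner used in the MATLAB GUI), here is a minimal A* over a 2-D occupancy grid with a Manhattan heuristic and 4-way movement:

```cpp
#include <cstdlib>
#include <functional>
#include <queue>
#include <tuple>
#include <vector>

// Minimal A* over a 2-D occupancy grid (1 = obstacle) with a Manhattan
// heuristic and 4-way movement. Returns the shortest path length in
// steps, or -1 if the goal is unreachable.
int aStarSteps(const std::vector<std::vector<int>>& grid,
               int startRow, int startCol, int goalRow, int goalCol) {
    int rows = (int)grid.size(), cols = (int)grid[0].size();
    auto h = [&](int r, int c) {                 // admissible heuristic
        return std::abs(r - goalRow) + std::abs(c - goalCol);
    };
    std::vector<std::vector<int>> g(rows, std::vector<int>(cols, -1));
    using Node = std::tuple<int, int, int>;      // f = g + h, row, col
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    g[startRow][startCol] = 0;
    open.emplace(h(startRow, startCol), startRow, startCol);
    const int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
    while (!open.empty()) {
        auto [f, r, c] = open.top();
        open.pop();
        if (r == goalRow && c == goalCol) return g[r][c];
        if (f - h(r, c) > g[r][c]) continue;     // stale queue entry
        for (int k = 0; k < 4; ++k) {
            int nr = r + dr[k], nc = c + dc[k];
            if (nr < 0 || nr >= rows || nc < 0 || nc >= cols) continue;
            if (grid[nr][nc] == 1) continue;     // blocked cell
            int ng = g[r][c] + 1;
            if (g[nr][nc] == -1 || ng < g[nr][nc]) {
                g[nr][nc] = ng;
                open.emplace(ng + h(nr, nc), nr, nc);
            }
        }
    }
    return -1;                                   // goal unreachable
}
```

The heuristic never overestimates the true cost on this grid, which is what guarantees A* returns a shortest path.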
Anti-collision system:
An anti-collision system is used to avoid obstacles and prevent damage to the bot. For this
we build an ultrasonic radar by mounting an ultrasonic sensor on top of a servo motor.
The software uploaded to the microcontroller works as follows: when the ultrasonic sensor
detects an obstacle closer than a set distance, the bot stops. It then looks for an alternative
direction in which the nearest obstacle is furthest away and moves that way. This
process continues until the bot reaches its final destination.
3.3 ALGORITHM
Step 1: The user marks a destination on the map.
Step 2: The computer software extracts only the route and the obstacles in between,
excluding all other information on the map.
Step 3: Using the A-star method, the computer finds the shortest route for the bot to reach
the destination.
Step 4: This information is transferred from the computer to the microcontroller using Xbee
transmission devices.
20. NA-YATR Page - 20 -
Step 5: The microcontroller analyzes the information and, using the GPS and compass, the
bot makes its move.
Step 6: If any obstacle appears along the way, the ultrasonic sensor detects it and the bot
changes direction.
Step 7: The A-star path following, GPS tracking, anti-collision checks, etc. continue in a
loop until the destination is reached.
Step 8: When the destination is reached, the bot stops.
Step 9: End.
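Step 5's move toward the target relies on the initial great-circle bearing from the current fix to the destination, the same atan2 formula the firmware's gpshead() evaluates. In standalone form:

```cpp
#include <cmath>

// Initial great-circle bearing from the current fix (lat1, lon1) to the
// target (lat2, lon2), all in degrees, returned as degrees 0..360
// clockwise from north. This mirrors the formula in gpshead().
double bearingDegrees(double lat1, double lon1, double lat2, double lon2) {
    const double kPi = 3.14159265358979323846;
    const double toRad = kPi / 180.0;
    lat1 *= toRad; lon1 *= toRad; lat2 *= toRad; lon2 *= toRad;
    double dlon = lon2 - lon1;
    double b = atan2(sin(dlon) * cos(lat2),
                     cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dlon));
    b *= 180.0 / kPi;
    if (b < 0) b += 360.0;          // wrap negative bearings into 0..360
    return b;
}
```

The bot then rotates until the compass heading matches this bearing (the firmware accepts a ±8 degree tolerance) before driving forward.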
21. NA-YATR Page - 21 -
CHAPTER 4
EXPERIMENTAL RESULTS
4.1 RESULTS
The robot performs motion in these directions: forward, backward, sideways, and
rotation about its own axis. It is efficient and precise at detecting obstacles and
changing its path accordingly. The robot moves along the path defined for it while avoiding
any obstacles without ambiguity.
4.2 FUTURE ENHANCEMENTS
We plan to improve this robot by adding the following features:
1. Image-processing technology.
2. A light sensor, so it can be used as a fire-detecting and fire-fighting robot in the
future.
3. An IR sensor.
4. Lidar, for 3D mapping and developing an indoor route map.
5. A camera.
4.3 CONCLUSION
The overall conclusion of the project is that we have successfully constructed a robot that
can work on any given terrain with satisfactory motion. The robot communicates with all of
its components with utmost efficiency. With small variations it can be used in various fields,
from military to agriculture, and its applications scale from domestic to industrial. The
features of this robot show how human labour can be reduced, and how human safety need
not be compromised, when it is used in mines and military operations.
23. NA-YATR Page - 23 -
// Library includes required by this sketch (not shown in the original listing)
#include <Wire.h>
#include <Servo.h>
#include <SoftwareSerial.h>
#include <TinyGPS.h>
#include <HMC5883L.h>
float heading;
TinyGPS gps;
SoftwareSerial ss(1, 2);
float x2lon = radians(00000000), x2lat = radians(000000000); //enter final location here as
lon and lat
HMC5883L comp;
int16_t mx, my, mz;
float head, distance = 0.0;
#define trigPin 13
#define echoPin 12
//Right motor
int enA = 11;
int in1 = 8;
int in2 = 7;
//Left Motor
int enB = 10;
int in3 = 6;
int in4 = 5;
const int danger = 50; //danger distance in cm
int leftDistance, rightDistance;
Servo uservo;
long dur;
long dist;
long cm1;    //latest forward ping, in cm (used in loop() but undeclared in the original)
int f1 = 4;  //countdown before re-aligning the heading via GPS (undeclared in the original)
int i = 0;   //parity used to alternate left/right avoidance turns (undeclared in the original)
void setup() {
// put your setup code here, to run once:
uservo.attach(4);
uservo.write(90);
pinMode(trigPin, OUTPUT);
pinMode(echoPin, INPUT);
pinMode(enA, OUTPUT);
24. NA-YATR Page - 24 -
pinMode(enB, OUTPUT);
pinMode(in1, OUTPUT);
pinMode(in2, OUTPUT);
pinMode(in3, OUTPUT);
pinMode(in4, OUTPUT);
Wire.begin();
comp.initialize();
ss.begin(9600);
Serial.begin(9600);
Serial.println("Complete");
Serial.println("Ready!");
}
void loop() {
// put your main code here, to run repeatedly:
if (f1 != 0)
Forward();
cm1 = ping();
delay(50);
cm1 = ping();
Serial.print("cm");
Serial.print(cm1);
if (cm1>danger){
comp.getHeading(&mx, &my, &mz);
heading = atan2(my, mx);
if (heading < 0)
heading += 2 * M_PI;
heading = heading * 180 / M_PI;
if ((i % 2) == 0)
{ head = heading + 100;
if (head > 360)
head = head - 360;
Left();
25. NA-YATR Page - 25 -
}
else
{ head = heading - 100;
if (head < 0)
head = head + 360;
Right();
}
while ((heading > head + 8) || (heading < head - 8)) // this loop turns the bot till its facing
(head)degrees east of north
{
turn();
delay(5);
comp.getHeading(&mx, &my, &mz);
heading = atan2(my, mx);
if (heading < 0)
heading += 2 * M_PI;
heading = heading * 180 / M_PI;
}
Stop();
delay(100);
}
else {
Forward();
delay(500);
Stop();
do {
gpshead();
} while (distance == 0.0);
if (distance < 15)
26. NA-YATR Page - 26 -
while (1)
Stop();
if (f1 == 0){
comp.getHeading(&mx, &my, &mz);
heading = atan2(my, mx);
if (heading < 0)
heading += 2 * M_PI;
heading = heading * 180 / M_PI;
while ((heading > head + 8) || (heading < head - 8))
{
turn();
delay(5);
comp.getHeading(&mx, &my, &mz);
heading = atan2(my, mx);
if (heading < 0)
heading += 2 * M_PI;
heading = heading * 180 / M_PI;
}
f1 = 4;
}
f1--;
}
}
void turn(){
float tur = heading - head;
if (tur < 0.0)
{ if (tur > -180.0)
Right();
else
Left();
}
else
27. NA-YATR Page - 27 -
{ if (tur < 180.0)
Left();
else Right();
}
}
void gpshead()
{
bool newData = false;
for (unsigned long start = millis(); millis() - start < 1000;)
{
while (ss.available())
{
char c = ss.read();
Serial.write(c);
if (gps.encode(c))
newData = true;
}
}
if (newData)
{
float flat1, flon1;
unsigned long age;
gps.f_get_position(&flat1, &flon1, &age);
flon1 = radians(flon1); //also must be done in radians
flat1 = radians(flat1); //also must be done in radians
head = atan2(sin(x2lon - flon1) * cos(x2lat), cos(flat1) * sin(x2lat) - sin(flat1) * cos(x2lat)
* cos(x2lon - flon1));
head = head * 180 / 3.1415926535; // convert from radians to degrees
float dist_calc = 0;
float diflat = 0;
float diflon = 0;
diflat = x2lat - flat1; //notice it must be done in radians
diflon = (x2lon) - (flon1); //subtract and convert longitudes to radians
distance = (sin(diflat / 2.0) * sin(diflat / 2.0));
28. NA-YATR Page - 28 -
dist_calc = cos(flat1);
dist_calc *= cos(x2lat);
dist_calc *= sin(diflon / 2.0);
dist_calc *= sin(diflon / 2.0);
distance += dist_calc;
distance = (2 * atan2(sqrt(distance), sqrt(1.0 - distance)));
distance *= 6371000.0; //Converting to meters
if (head < 0) {
head += 360; //if the heading is negative then add 360 to make it positive
}
}
else
{
head = 0.0;
distance = 0.0;
}
}
void movement()
{
int distanceFwd = ping();
if (distanceFwd>danger) //if path is clear
{
Forward(); //move forward
}
else //if path is blocked
{
Stop();
uservo.write(0);
delay(500);
rightDistance = ping(); //scan to the right
delay(500);
uservo.write(180);
delay(700);
leftDistance = ping(); //scan to the left
29. NA-YATR Page - 29 -
delay(500);
uservo.write(90); //return to center
delay(100);
compareDistance();
}
}
void compareDistance()
{
if (leftDistance>rightDistance) //if left is less obstructed
{
Left(); //turn left
delay(500);
}
else if (rightDistance>leftDistance) //if right is less obstructed
{
Right(); //turn right
delay(500);
}
else //if they are equally obstructed
{
Backward(); //turn 180 degrees
delay(1000);
}
}
long ping()
{
//Trigger the sensor to send out a ping: a clean LOW, then a 10 us HIGH pulse
digitalWrite(trigPin, LOW);
delayMicroseconds(2);
digitalWrite(trigPin, HIGH);
delayMicroseconds(10);
digitalWrite(trigPin, LOW);
dur = pulseIn(echoPin, HIGH);
dist = 0.034*dur/2; //speed of sound is ~0.034 cm/us; halve for the round trip
Serial.println(dist); //was printing the GPS 'distance' variable by mistake
30. NA-YATR Page - 30 -
return dist;
}
void Forward() //This function tells the robot to go forward
{
//for (int i=0; i <= 50; i++){
Serial.println("");
Serial.println("Moving forward");
// turn on left motor
digitalWrite(in1, LOW);
digitalWrite(in2, HIGH);
// set speed out of possible range 0~255
analogWrite(enA, 155);
// turn on right motor
digitalWrite(in3, HIGH);
digitalWrite(in4, LOW);
// set speed out of possible range 0~255
analogWrite(enB, 155);
delay(100);
//}
//Stop();
}
void Backward() //This function tells the robot to move backward
{
//for (int i=0; i <= 50; i++){
Serial.println("");
Serial.println("Moving backward");
// turn on left motor
digitalWrite(in1, HIGH);
digitalWrite(in2, LOW);
// set speed out of possible range 0~255
analogWrite(enA, 155);
// turn on right motor
31. NA-YATR Page - 31 -
digitalWrite(in3, LOW);
digitalWrite(in4, HIGH);
// set speed out of possible range 0~255
analogWrite(enB, 155);
delay(100);
//}
//Stop();
}
void Left() //This function tells the robot to turn left
{
//for (int i=0; i <= 50; i++){
Serial.println("");
Serial.println("Moving left");
digitalWrite(in1, HIGH);
digitalWrite(in2, LOW);
// set speed out of possible range 0~255
analogWrite(enA, 140);
digitalWrite(in3, HIGH);
digitalWrite(in4, LOW);
//set speed out of possible range 0~255
analogWrite(enB, 155);
delay(100);
// }
// Stop();
}
void Right() //This function tells the robot to turn right
{
// for (int i=0; i <= 50; i++){
Serial.println("");
Serial.println("Moving right");
digitalWrite(in1, LOW);
digitalWrite(in2, HIGH);
32. NA-YATR Page - 32 -
// set speed out of possible range 0~255
analogWrite(enA, 135);
digitalWrite(in3, LOW); //Was High
digitalWrite(in4, HIGH);
analogWrite(enB, 155);
delay(100);
// }
// Stop();
}
void Stop() //This function tells the robot to stop moving
{
Serial.println("");
Serial.println("Stopping");
// now turn off motors
digitalWrite(in1, LOW);
digitalWrite(in2, LOW);
//analogWrite(enA, 0);
digitalWrite(in3, LOW);
digitalWrite(in4, LOW);
//analogWrite(enB, 0);
}
33. NA-YATR Page - 33 -
EXPERIMENTAL RESULTS
The robot is able to avoid obstacles; it checks its surroundings before deciding whether to
turn left or right.
The A* algorithm works as follows:
34. NA-YATR Page - 34 -
CONCLUSION
It has been shown that the robot is able to form the shortest possible path to the target and
avoid any obstacles that come in between, using the ultrasonic sensor mounted on a servo
motor acting as a radar.
The top speed the robot achieved is 2 metres per minute.
35. NA-YATR Page - 35 -
APPENDIX
Global Positioning System (GPS) – gives the location of the bot
A* (A-star) path-finding algorithm – gives the path of the bot
L298n – controls the motors
Xbee – for wireless communication
Arduino Mega – microcontroller that processes the data
Compass – gives the heading of the bot