FINAL REPORT OF
Autonomous Campus Tour Guide Robot by using Ultrasonic Range
Sensors and QR code Recognition in Indoor Environment
A Project Report submitted in partial fulfilment
of the requirements for the award of the degree of
BACHELOR OF TECHNOLOGY
In
Instrumentation and Control Engineering
Submitted by
Shweton Kedia
140921011
Under the guidance of
Ms. Sneha Nayak
Assistant Professor
Department of Instrumentation & Control Engineering
Manipal Institute of Technology
MAY 2019
DEPARTMENT OF INSTRUMENTATION AND CONTROL ENGINEERING
MANIPAL INSTITUTE OF TECHNOLOGY
(A Constituent Institution of Manipal Academy of Higher Education)
MANIPAL – 576 104 (KARNATAKA), INDIA
Manipal
9th May 2019
CERTIFICATE
This is to certify that the project titled Autonomous Campus Tour Guide Robot by using
Ultrasonic Range Sensors and QR code Recognition in Indoor Environment is a record
of the bonafide work done by Shweton Kedia (140921011), submitted in partial fulfilment
of the requirements for the award of the Degree of Bachelor of Technology (BTech) in
INSTRUMENTATION AND CONTROL ENGINEERING of Manipal Institute of
Technology, Manipal, Karnataka, (A Constituent Institution of Manipal Academy of
Higher Education), during the academic year 2018-19.
Ms. Sneha Nayak
(Assistant Professor)
Project Guide
Prof. Dr. Dayananda Nayak
HOD, ICE.
M.I.T, MANIPAL
ACKNOWLEDGMENTS
I would like to thank my guide, Ms. Sneha Nayak, for giving me the opportunity to gain exposure to this research field. It has been a wonderful experience working under her guidance; her constant motivation, support, and confidence carried me through to the successful completion of the project.
I would also like to thank my project coordinator, Prof. Dr. Sandra D'Souza, for her constant help and interest.
I express my sincere gratitude to Prof. Dr. Dayananda Nayak, Head of the Department of Instrumentation and Control Engineering, who supported and helped me whenever I needed assistance with academic matters.
Finally, I would like to thank my family and friends for their utmost faith in me and for supporting me throughout.
Shweton Kedia (140921011)
ABSTRACT
This project addresses the challenge of mobile robot navigation in indoor environments. There is a critical need for cost-effective, reliable, and reasonably accurate solutions to meet the demands of indoor robotic applications, and researchers are currently investigating a variety of approaches to this problem.
This project relies on QR (Quick Response) codes to provide location references to a mobile robot. The robot carries a smartphone that is programmed to detect and read the information on QR codes strategically placed in the robot's working environment. Using real-time QR code recognition, the robot can autonomously complete a run along the guided tour route. The laboratory information stored in each QR code is played to visitors using text-to-speech on the Android device. Ultrasonic range sensors, which can detect objects and measure distances with high accuracy, are used to implement the wall-following and obstacle-avoidance behaviours. The sonar data collected by the ultrasonic range sensors is processed by a microcontroller that autonomously controls the tour guide robot. An algorithm based on Proportional-Integral-Derivative (PID) control is applied to the tour guide robot to achieve more accurate motion control. Bluetooth is used to send signals from the smartphone to the Arduino so that the tour guide robot can be operated remotely.
The experimental setup of the tour guide robot, along with the successful implementation of the navigation method, is presented.
The software used to build the mobile application for this project is MIT App Inventor.
LIST OF TABLES
Table No. Table Title Page No.
2.1 Literature Survey 11
A Project Schedule 29
LIST OF FIGURES
Figure No Figure Title Page No
3.2.1 Atmega328 based Arduino Uno R3 microcontroller 13
3.2.2 Arduino motor shield based on L298 13
3.2.3 Ultrasonic Sensor 14
3.2.4 Bluetooth module 14
3.2.5 Chassis 14
3.2.6 DC motor of 100rpm 15
3.2.7 LIPO battery 15
3.4.1 Internal connection 17
3.4.2 Motor Connection with wheels 18
3.4.3 Top view of robot 19
3.5.1 Interface of application 20
3.5.2 Background working of application. 21
3.5.3 Bluetooth Block 21
3.5.4 QR code scanner block 22
3.5.5 Delay block 23
3.6.1 Block Diagram of Tour guide 24
4.2.1 QR code (Introduction) 25
4.2.2 QR code (LAB 1) 25
4.2.3 QR code (LAB 2) 26
4.2.4 QR code (LAB 3) 26
Contents
Page No
Acknowledgement 3
Abstract 4
List of Tables 5
List of figures 6
Chapter 1 INTRODUCTION 8
1.1 Introduction 8
1.2 Motivation 9
1.3 Objective 9
1.4 Organization of Report 10
Chapter 2 LITERATURE REVIEW 11
Chapter 3 METHODOLOGY 12
3.1 Introduction 12
3.2 Hardware components 13
3.3 Flowchart 16
3.4 Hardware Connections 17
3.5 Application Used 20
3.6 Block Diagram 23
Chapter 4 RESULT ANALYSIS 23
4.1 Results obtained 23
4.2 QR codes 23
Chapter 5 CONCLUSION 25
5.1 Work Conclusion 25
5.2 Future Scope of Work 26
PROJECT SCHEDULE 27
REFERENCES 28
ANNEXURES 29
PROJECT DETAILS 39
CHAPTER 1
INTRODUCTION
1.1 INTRODUCTION
Today, robots have become an integral part of people's daily lives, from cleaning robots and medical robots to indoor applications such as robotic vacuum cleaners and security and surveillance systems.
Lately, there have been various attempts to extend robotics technology to service applications in public spaces. In particular, many researchers are interested in guide robots because they are closely related to two central problems in current robotics research: interaction with humans and navigation in dynamic environments.
The project presented here centres on the development of an indoor autonomous mobile robot that can be used as a tour guide for campus tours. QR code tags have been used because they are cheaper than RFID tags.
1.2 MOTIVATION
As robots take on more and more roles in people's daily lives, their influence on society continues to grow. There is a critical need for cost-effective, reliable, and fairly accurate solutions to meet the demands of indoor robotic applications. Robotic vacuum cleaners and security and surveillance systems are examples of successful indoor robot applications.
Another important real-world use of indoor service robots is the deployment of autonomous mobile robots as tour guides in museums or exhibitions.
The work presented in this project is centred on the development of an indoor autonomous mobile robot that can be used as a tour guide for campus tours, for example during University Open Houses, using QR codes, ultrasonic sensors, and a smartphone.
More importantly, this type of robot could be genuinely helpful in showing people around a campus, and especially helpful for people who are partially or fully visually impaired.
1.3 OBJECTIVE
The project addresses the challenge of mobile robot navigation in indoor environments. The robot presented here autonomously guides people through the campus, primarily with the help of ultrasonic range sensors, QR code recognition, and an Android device. This can greatly assist blind people when navigating new places.
The following objectives are to be achieved upon completion of the project:
Left wall following.
Obstacle detection and avoidance.
QR code detection and scanning.
Text to speech conversion.
1.4 ORGANIZATION OF REPORT
The first chapter of the report gives the gist of the whole project; it covers the motivation behind the selection of the project and lists the objectives to be achieved upon its completion.
The second chapter, the literature survey, summarises four major projects related to tour guide robots.
The third chapter, titled "Methodology", describes the working of the tour guide robot. It covers all the hardware components used, the pin connections made, the concept flowchart, and the software used to create the mobile application.
The fourth chapter presents the results obtained.
Chapter five gives the conclusion and the future scope of the project.
CHAPTER 2
LITERATURE SURVEY
Serial No | Paper Title | Author(s) | Year of Publication | Important Points
1 | "The Autonomous Tour-Guide Robot Jinny" | Gunhee Kim, Woojin Chung, Kyung-Rock Kim, Munsang Kim | 2004 | 1) Adaptive navigation in a dynamic and unmodified environment. 2) Interaction that attracts and engages people's interest.
2 | "CATE: Central Automated Tour Experience" | J. Beckwith, R. Lefief, S. Sherbrook, M. Williams, and K. Yelamarthi | 2012 | 1) Mechanical stability. 2) Embedded system. 3) Localization and obstacle avoidance.
3 | "Indoor Navigation and Product Recognition for Blind People Assisted Shopping" | Diego Lopez-de-Ipina, Tania Lorido, and Unai Lopez | 2011 | 1) Eyes-free product selection and browsing. 2) Free navigation within the store. 3) Utilization of existing devices.
4 | "Development of Guide Robot by Using QR Code Recognition" | Tansuriyavong Suriyon, Higa Keisuke, and Boonmee Choompol | 2011 | 1) Presents results for human tracking with the help of QR codes and how they make life easier.
Table 2.1
CHAPTER 3
METHODOLOGY
3.1 INTRODUCTION
All the components of the project were acquired and the robot was built from the chassis, wheels, and motors.
The two motors on the left side are wired together in parallel; the same is done with the motors on the right side.
PWM pins are used to control the speed and direction of the robot.
The robot has ultrasonic range sensors on the front and on the sides to check for a clear path, to detect and avoid obstacles, and to perform wall following. These sensors transmit ultrasonic waves and receive their echoes, and the distance is calculated from the speed of sound and the time taken by the waves to return.
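The distance calculation described above (half the echo time multiplied by the speed of sound) matches the annexure code. As a standalone sketch in plain C++, with the 40 cm clamp used in the annexure (the helper's name is hypothetical):

```cpp
#include <cassert>

// Convert an HC-SR04 echo-pulse duration (microseconds) to distance (cm).
// Sound travels roughly 0.034 cm/us; the pulse covers the round trip to the
// obstacle and back, so the result is halved. Readings are clamped at 40 cm,
// as in the annexure code.
float echoToDistanceCm(float duration_us) {
    float d = duration_us * 0.034f / 2.0f;
    return (d > 40.0f) ? 40.0f : d;
}
```

For example, a 1000 us echo corresponds to 17 cm, and any echo longer than about 2353 us saturates at the 40 cm clamp.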
Initially the robot moves along the wall, maintaining a predetermined distance from it (the threshold distance). As the robot moves, it is supposed to stop at each QR code site once the Android device senses a QR-code-like picture.
The scanning is done with the help of a QR code scanner application running on the Android device. The QR codes are generated through the website www.qr-code-generator.com, so they already have the required information stored in them.
The robot must stop at the exact location where the QR code tags are placed. Stopping the tour guide robot is handled by the Android device mounted on top of it, which detects and scans the QR sticker.
After scanning the QR code, the text is, for the time being, converted to speech.
The detection, scanning, and text-to-speech conversion are done through the application, which was developed using MIT App Inventor 2.
After completing the whole procedure above, the tour guide moves on to the next location on the campus.
3.2 HARDWARE USED
The following components are used in the project:
1. ATmega328-based Arduino Uno R3 microcontroller
The board has 14 digital input/output pins, of which 6 can be used for Pulse Width Modulation (PWM) outputs, and 6 analog inputs.
The operating voltage is 5 V.
Pulse width modulation is a technique for getting analog-like results by digital means: digital control is used to create a square wave, a signal switched between on and off.
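The on/off ratio of that square wave (the duty cycle) determines the effective analog level. As an illustrative helper (not taken from the report), mapping a requested speed percentage to the 8-bit duty value that Arduino's analogWrite() expects:

```cpp
#include <cassert>

// Map a speed request in percent (0-100) to an 8-bit PWM duty value
// (0-255), the range used by Arduino's analogWrite(). Out-of-range
// requests are clamped. Illustrative helper; the name is hypothetical.
int speedPercentToDuty(int percent) {
    if (percent < 0) percent = 0;
    if (percent > 100) percent = 100;
    return percent * 255 / 100;   // integer scaling: 50% -> 127
}
```

On the robot this value would be written to the motor shield's enable pins to set wheel speed.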
Fig. 3.2.1
2. Arduino motor shield based on the L298.
The L293 could also have been used, but after assessment the L298 was chosen because of the higher power requirement.
The output current of the L298 is 2 A, whereas that of the L293 is 1 A.
The motor shield can drive up to two DC motors, controlling their speed, forward and backward motion, and turning.
Speed control is achieved through conventional PWM obtained from Arduino PWM output pins 5 and 6.
Two L298 shields are used to control the 4 DC motors.
Fig. 3.2.2
3. Ultrasonic Range Sensors
The HC-SR04 is used here.
The sensor is powered from a 5 V supply.
It is used to detect and avoid obstacles.
It measures distance over a wide range of 2 cm to 400 cm.
Fig. 3.2.3
4. Bluetooth module
The HC-05 Bluetooth module is used.
The range of the module is about 10 meters.
It is connected directly to the Arduino for wireless communication with the Android device.
Fig. 3.2.4
5. Chassis of the tour guide robot
The size of the chassis was decided according to the components to be mounted on top of it.
Fig. 3.2.5
6. DC motors of 100 rpm
Four geared DC motors of 100 rpm are used, connected to the 4 wheels.
A 60 rpm DC motor was initially considered, but to be on the safe side 100 rpm motors were used.
The speed can be controlled.
Fig. 3.2.6
7. LiPo battery
11.1 V, 2200 mAh.
A lithium polymer battery is a rechargeable battery of lithium-ion technology using a polymer electrolyte instead of a liquid electrolyte.
These batteries provide higher specific energy and are used in applications where weight is a critical feature.
Fig. 3.2.7
3.3 FLOWCHART OF THE TOUR GUIDE ROBOT
[Flowchart: START → obstacle detection (obstacle present → alternate route detection; obstacle absent → continue) → microcontroller unit → driving circuit → DC motors start → QR code detection by smartphone → tour guide stops at the desired QR code location → text-to-speech conversion through the Android device → proceed to next location → END]
As the robot moves forward it will eventually detect an obstacle (if one is present); upon detection the microcontroller unit responds by changing the direction of the robot, otherwise the robot continues to follow its path.
The driving circuit is used to run the DC motors, which control the motion.
Then comes the last phase of the robot's operation, where it detects and scans the QR code placed at the destination using a camera mounted on top of the tour guide robot. Text-to-speech conversion is then performed using an Android application.
3.4 HARDWARE CONNECTIONS
Fig. 3.4.1
The figure above (fig. 3.4.1) shows the internal connections of the tour guide robot. The Arduino is connected to the L298 Arduino motor shield, which drives the four motors of the robot.
Fig. 3.4.2
The figure above (fig. 3.4.2) shows the four drive motors connected to the wheels to support the motion of the robot.
The two motors on the left side are wired together in parallel; the same is done with the motors on the right side.
Fig. 3.4.3
The figure above (fig. 3.4.3) shows the top view of the tour guide robot and the way the three ultrasonic sensors are mounted on it.
3.5 APPLICATION USED
After completing the first two objectives, left-wall following and obstacle detection and avoidance, an application was needed on the phone to complete the remaining objectives:
QR code detection and scanning.
Text-to-speech conversion of the data scanned through the Android device's scanner.
To achieve these objectives, a mobile application was built on the MIT App Inventor 2 platform.
Fig. 3.5.1
The figure above shows how the interface of the application looks on the Android device. It was built with MIT App Inventor 2.
Fig. 3.5.2
The figure above shows the background working of the application, that is, the manner in which the application operates. More about it is explained later.
Fig. 3.5.3
In the figure above, the block elements are used to find the available Bluetooth devices in the vicinity and connect to the required one.
The first block says that when the ListPicker1 button is clicked, all the available Bluetooth devices are listed for connection.
The second block connects to the Bluetooth device picked from that list.
Fig. 3.5.4
In the figure above, the working of the QR code scanner is shown.
The first block says that when the Scan button is clicked, the function calls the barcode scanner to perform QR code scanning.
The second block tells what the application does once a QR code has been scanned. It is divided into three parts.
As soon as the QR code is scanned, a command to stop the robot is sent as a serial 1-byte number '0'. The Arduino is coded so that when '0' arrives over the Bluetooth module it stops the robot at that position for 12 seconds.
In the second part, the application checks its different data categories to determine what data is stored in the code.
In the third part, a 'text to speech' block is used, so the application speaks out the result read by the scanner.
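The stop-command handling described above can be sketched in plain C++. The byte values ('0' = stop at the QR location for 12 seconds) follow the report; the type and function names here are hypothetical, since the annexure does not show the Arduino's complete handler for this branch:

```cpp
#include <cassert>

// Minimal model of the serial command protocol described above.
enum class RobotState { Running, StoppedAtQr };

struct TourGuide {
    RobotState state = RobotState::Running;
    long stop_until_ms = 0;  // time until which the robot stays halted

    // Called for every byte received from the Bluetooth module.
    void onSerialByte(char cmd, long now_ms) {
        if (cmd == '0') {            // phone scanned a QR code: halt 12 s
            state = RobotState::StoppedAtQr;
            stop_until_ms = now_ms + 12000;
        } else if (cmd == '1') {     // resume the tour immediately
            state = RobotState::Running;
        }
    }

    // Called on each control-loop iteration; true when motors may run.
    bool motorsEnabled(long now_ms) {
        if (state == RobotState::StoppedAtQr && now_ms >= stop_until_ms)
            state = RobotState::Running;   // 12-second pause elapsed
        return state == RobotState::Running;
    }
};
```

On the real robot, onSerialByte would be fed from Serial.read() and motorsEnabled would gate the calls to setDirection.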
Fig. 3.5.5
The figure above shows the need for a delay. The current system time is stored in a global variable and 2 seconds is added to it; the application then skips scanning until the current system time reaches the stored value.
This is done to prevent the QR code scanner from scanning the same QR code twice. A delay of 2 seconds is therefore applied after the application scans and speaks a QR code.
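The delay logic in those App Inventor blocks amounts to a simple cooldown check against a clock. A sketch of the same idea (the struct name is hypothetical; a monotonic millisecond clock is assumed):

```cpp
#include <cassert>

// Cooldown guard mirroring the 2-second scan delay described above.
// scan_blocked_until holds "current time + 2000 ms" after each scan.
struct ScanCooldown {
    long scan_blocked_until = 0;

    // Returns true if a code may be processed at time now_ms;
    // on success, re-arms the 2-second blackout window.
    bool tryScan(long now_ms) {
        if (now_ms < scan_blocked_until) return false;  // still waiting
        scan_blocked_until = now_ms + 2000;             // block repeats
        return true;
    }
};
```

A second read of the same QR code within 2 seconds is rejected, which is exactly the double-scan problem the blocks solve.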
3.6 BLOCK DIAGRAM
The block diagram of the tour guide robot is shown on the next page.
[Block diagram of the tour guide robot: three ultrasonic sensors (left side, front, and right side) and the Bluetooth module connect to the controller, which drives the motors through the L298 motor driver; a LiPo battery with a switch supplies the +5 V and GND rails.]
CHAPTER 4
RESULT ANALYSIS
4.1 RESULTS OBTAINED
(a) The tour guide robot achieves wall following with a small precision error.
(b) The objective of obstacle detection has been achieved successfully; the robot turns right every time it detects an obstacle.
(c) QR code generation and detection have been done successfully. The QR codes are generated on a website, and the required data is typed in and saved in each QR code.
(d) Detection and scanning of the QR codes are done successfully through the application made with MIT App Inventor 2.
(e) Text-to-speech conversion is also complete.
4.2 QR CODES
Fig. 4.2.1
Fig. 4.2.2
The figures show the QR code tags.
1. The QR codes are generated via the website www.qr-code-generator.com.
2. The QR code tags are pasted on the doors of different labs or any desired place. When the robot reaches a particular QR location and scans the QR code, it announces the name of the location with the help of the Android application.
3. Upon reaching a destination, the tour guide robot scans the QR code and gives the result via text-to-speech conversion.
4. Figure 4.2.1 stores the data "Hello, how are you?".
5. Figure 4.2.2 stores the data "You have reached Lab 1, the Process Control lab of the Instrumentation and Control department.".
6. Figure 4.2.3 stores the data "You have reached Lab 2, the Control System lab of the Instrumentation and Control department.".
7. Figure 4.2.4 stores the data "You have reached Lab 3, the Microprocessor and Microcontroller lab of the Instrumentation and Control department.".
Fig. 4.2.3
Fig. 4.2.4
CHAPTER 5
CONCLUSION
5.1 WORK CONCLUSION
1. The code for motion control of the tour guide robot to achieve wall following and
obstacle avoidance has been developed and implemented successfully.
2. The robot follows the wall and turns upon obstacle detection, but with a small error in precision.
3. Error correction will be achieved after correctly tuning the robot with PID control.
4. The QR code tags have been obtained and the required data stored in them successfully.
5. An android device is mounted on top of our robot for QR code detection upon reaching
the destination.
6. Text to speech conversion is done using an android application.
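The PID tuning mentioned in point 3 refers to the PID() routine begun in the annexure, which is truncated in this copy of the report. A self-contained sketch of the proportional-derivative computation it starts, using the annexure's proportional band PB, derivative time TD, and 20 cm wall setpoint, might look as follows; the struct name, the use of the loop period for the derivative, and the way the terms are combined into one output are assumptions:

```cpp
#include <cassert>
#include <cmath>

// Illustrative PD controller for left-wall following, modelled on the
// truncated PID() routine in the annexure: setpoint 20 cm from the wall,
// PB = 30 and TD = 30 as declared there. The integral term is omitted
// and the combination into a single correction is an assumption.
struct WallPid {
    float PB = 30.0f;          // proportional band
    float TD = 30.0f;          // derivative time
    float past_error = 0.0f;   // previous distance error

    // left_cm: left ultrasonic reading; dt_ms: loop period in milliseconds.
    // Returns a speed correction steering back toward the 20 cm setpoint.
    float update(float left_cm, float dt_ms) {
        float error = 20.0f - left_cm;                 // PID_error
        float p = error * 100.0f / PB;                 // PID_p (annexure)
        float d = (((error - past_error) / dt_ms)      // PID_d (annexure)
                   * TD * 100.0f) / PB / 1000.0f;
        past_error = error;
        return p + d;                                  // assumed combination
    }
};
```

The correction would be added to one wheel's PWM duty and subtracted from the other's to steer the robot toward the setpoint distance.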
5.2 FUTURE SCOPE
This project can prove to be very beneficial in the future.
A few future applications of such a tour guide robot are:
1. The use of autonomous mobile robots as tour guides in museums or exhibitions.
2. The proposed indoor navigation and product recognition technique can provide a shopping support system for blind people.
3. For routing and direction, the framework can also use RFID and smartphone technology to enable accessible shopping for visually impaired people in supermarkets.
4. With additional technology, this kind of robot can also be used to carry out the daily errands of humans.
PROJECT SCHEDULE
TABLE: PROJECT SCHEDULE:
January 2019
Topic selection and Literature Review.
Synopsis submission
Acquirement of components
February 2019
Hardware design
Testing of Ultrasonic Range Sensors
Connection of DC motor to the driving circuit and further to
the microcontroller unit and testing of each unit.
March 2019
Software design
Designing of Codes on Arduino
Implementing codes on Android device
April 2019
Testing of Tour Guide
May 2019
Final Report
Table A
REFERENCES
1. Gunhee Kim, Woojin Chung, Kyung-Rock Kim, Munsang Kim, "The Autonomous Tour-Guide Robot Jinny", Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 4, pp. 3450-3455, October 2004.
2. D. Rodríguez-Losada, F. Matia, R. Galán, M. Hernando, J. M. Montero and J. M.
Lucas, "Urbano, an Interactive Mobile Tour-Guide Robot". Advances in Service
Robotics. Ed. H. Seok. In-Teh, 2008.
3. J. Beckwith, R. Lefief, S. Sherbrook, M. Williams, and K. Yelamarthi, “CATE:
Central Automated Tour Experience,” Proceedings of the 2012 ASEE North-
Central Section Conference.
4. Kumar Yelamarthi, Stephen Sherbrook, Jonathan Beckwith, Matt Williams, Robert
Lefief, “An RFID Based Autonomous Indoor Tour Guide Robot” Circuits and
Systems (MWSCAS), 2012 IEEE 55th International Midwest Symposium.
5. V. Kulyukin, C. Gharpure, J. Nicholson, and S. Pavithran, "RFID in robot-assisted indoor navigation for the visually impaired", in Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), pp. 1979-1984.
6. Seok Ju Lee, Jongil Lim, Girma Tewolde, Jaerock Kwon, "Autonomous tour guide robot by using ultrasonic range sensors and QR code recognition in indoor environment", IEEE International Conference on Electro/Information Technology, 2014.
ANNEXURES
#define TPL 7 // Arduino pin tied to the trigger pin of the left ping sensor.
#define EPL 6 // Arduino pin tied to the echo pin of the left ping sensor.
#define TPF 9 // Arduino pin tied to the trigger pin of the front ping sensor.
#define EPF 8 // Arduino pin tied to the echo pin of the front ping sensor.
#define TPR 13 // Arduino pin tied to the trigger pin of the right ping sensor.
#define EPR 12 // Arduino pin tied to the echo pin of the right ping sensor.
float lSensor, rSensor, fSensor, duration1, duration2, duration3;
float PB =30;
float TD=30;
float PID_error, PID_p, PID_i, PID_il, PID_d, PID_R, PID_pasterror, Time, elapsedtime,
prevtime, PID1, PIDL, PIDR;
boolean frontwall ;
boolean leftwall ;
boolean rightwall ;
boolean leftwall_1 ;
boolean frontoldwall;
boolean leftoldwall;
boolean rightoldwall ;
int en1 = 2 ;
int en2 = 3 ;
int en3 = 4 ;
int en4 = 5 ;
int enA = 10 ;
int enB = 11 ;
int dir;
#define FORWARD 1
#define STOP 0
#define BACKWARD 2
#define RIGHT 3
#define LEFT 4
int basespeed = 100;
int left_threshold = 40 ;
int right_threshold = 20 ;
int front_threshold = 20 ;
int command = 1;
void setup() {
Serial.begin(9600); // Open serial monitor at 9600 baud to see ping results.
leftwall_1 = true;
for (int i = 2; i <= 5; i++)
{pinMode(i, OUTPUT);}
for (int j = 9; j <= 11; j++)
{pinMode(j, OUTPUT);}
pinMode(7, OUTPUT);
pinMode(13, OUTPUT);
pinMode(6, INPUT);
pinMode(8, INPUT);
pinMode(12,INPUT);
}
void loop() {
elapsedtime = millis();
ReadSensors();
walls();
command = Serial.read();
Serial.println(command);
// Serial.print(" Left : ");
// Serial.print(lSensor);
// Serial.print(" cm ");
// Serial.print(" Front : ");
// Serial.print(fSensor);
// Serial.print(" cm ");
// Serial.print(" Right : ");
// Serial.print(rSensor);
// Serial.print(" cm ");
// Serial.println(leftwall_1);
// Serial.print(frontwall-frontoldwall);
// Serial.print(" ");
// Serial.print(rightoldwall-rightoldwall);
// Serial.print(" ");
// Serial.print(leftoldwall-leftoldwall);
// Serial.println(" ");
if (command == 1 || command == -1){
if ((leftwall-leftoldwall) == 1 || (leftwall-leftoldwall) == -1 || (rightwall-rightoldwall) == 1 ||
(rightwall-rightoldwall) == -1 || (frontwall-frontoldwall) == 1 || (frontwall-frontoldwall) == -
1)
{
setDirection(0);
delay(1000);
36. 36 | P a g e
else
{setDirection(0);
delay(6000);
}
leftoldwall = leftwall;
rightoldwall= rightwall;
frontoldwall=frontwall;
}
void ReadSensors() {
digitalWrite(TPL, LOW); // Clear the trigger pin
delayMicroseconds(2);
digitalWrite(TPL, HIGH); // Hold the trigger pin HIGH for 10 microseconds
delayMicroseconds(10);
digitalWrite(TPL, LOW);
duration1 = pulseIn(EPL, HIGH); // Read the echo pin: sound wave travel time in microseconds
lSensor= duration1*0.034/2; // Calculate the distance in cm
if (lSensor > 40)
{
lSensor= 40;
}
digitalWrite(TPF, LOW); // Clear the trigger pin
delayMicroseconds(2);
digitalWrite(TPF, HIGH); // Hold the trigger pin HIGH for 10 microseconds
delayMicroseconds(10);
digitalWrite(TPF, LOW);
duration2 = pulseIn(EPF, HIGH); // Read the echo pin: sound wave travel time in microseconds
fSensor= duration2*0.034/2; // Calculate the distance in cm
if (fSensor > 40)
{
fSensor= 40;
}
digitalWrite(TPR, LOW); // Clear the trigger pin
delayMicroseconds(2);
digitalWrite(TPR, HIGH); // Hold the trigger pin HIGH for 10 microseconds
delayMicroseconds(10);
digitalWrite(TPR, LOW);
duration3 = pulseIn(EPR, HIGH); // Read the echo pin: sound wave travel time in microseconds
rSensor= duration3*0.034/2; // Calculate the distance in cm
if (rSensor > 40)
{
rSensor= 40;
}
}
void walls() {
if ( lSensor < left_threshold ) {
leftwall = true ;
}
else {
leftwall = false ;
}
if ( rSensor < right_threshold ) {
rightwall = true ;
}
else {
rightwall = false ;
}
if ( fSensor < front_threshold ) {
frontwall = true ;
leftwall_1 =false;
}
else {
frontwall = false ;
}
}
void setDirection(int dir) {
if ( dir == FORWARD ) {
digitalWrite(en1, HIGH); // Left wheel forward
digitalWrite(en2, LOW);
digitalWrite(en3, HIGH); // Right wheel forward
digitalWrite(en4, LOW);
}
else if ( dir == STOP ) {
digitalWrite(en1, LOW );
digitalWrite(en2, LOW );
digitalWrite(en3, LOW );
digitalWrite(en4, LOW );
}
else if ( dir == BACKWARD ) {
digitalWrite(en1, LOW ); // Left wheel backward
digitalWrite(en2, HIGH );
digitalWrite(en3, LOW ); // Right wheel backward
digitalWrite(en4, HIGH );
}
else if ( dir == RIGHT ) {
digitalWrite(en1, HIGH ); // Left wheel forward
digitalWrite(en2, LOW );
digitalWrite(en3, LOW ); // Right wheel backward
digitalWrite(en4, HIGH );
}
else if ( dir == LEFT ) {
digitalWrite(en1, LOW ); // Left wheel backward
digitalWrite(en2, HIGH );
digitalWrite(en3, HIGH ); // Right wheel forward
digitalWrite(en4, LOW );
}
}
void PID() {
ReadSensors();
PID_error = 20 - lSensor;
PID_p = PID_error*100/PB;
PID_d = (((PID_error - PID_pasterror)/elapsedtime)*TD*100)/PB/1000;
41. 41 | P a g e
PROJECT DETAILS
Student Details
Student Name Kushagra Raje
Register Number 150921192 Section / Roll No B/37
Email Address kushraje@gmail.com Phone No (M) 8447014599
Student Name Shweton Kedia
Register Number 140921011 Section / Roll No B/46
Email Address skedia2014@gmail.com Phone No (M) 9591371457
Student Name Bhupender Singh
Register Number 150921214 Section/Roll No B/41
Email Address bhupendrasinghdewra@gmail.com Phone No(M) 9829076751
Project Details
Project Title
Autonomous Campus Tour Guide Robot by
using Ultrasonic Range Sensors and QR code
Recognition in Indoor Environment
Project Duration 4 months Date of reporting 3-01-2019
Expected date of
completion of project 9-5-2019
Organization Details
Organization Name MANIPAL INSTITUTE OF TECHNOLOGY
Full postal address with pin code: Instrumentation & Control Engineering, M.I.T, Manipal - 576104
Website address www.manipal.edu
Supervisor Details
Supervisor Name Dr. Sandra D’Souza
Designation Asst. Professor Selection Grade
Full contact address with pin code: Dept. of Instrumentation & Control Engineering, Manipal Institute of Technology, Manipal – 576 104 (Karnataka State), INDIA
Email address sandra.dsouza@manipal.edu Phone No (M) 9886093003
Internal Guide Details
Faculty Name Ms. Sneha Nayak (Assistant Professor)
Full contact address with pin code: Dept. of Instrumentation & Control Engineering, Manipal Institute of Technology, Manipal – 576 104 (Karnataka State), INDIA
Email address sneha.nayak@manipal.edu