1. SIMULTANEOUS MAPPING AND
NAVIGATION FOR RENDEZVOUS IN SPACE
APPLICATIONS
(Image-Based Tracking)
UNDER THE GUIDANCE OF:
PROF. ANAND HIREMATH, DEPT. OF COMPUTER SCIENCE & ENGG, VIJAYPUR
SUDEESH B, SENIOR SCIENTIST, DEPT. OF CDEG, ISRO SATELLITE CENTRE, BENGALURU
PRESENTED BY
NANDAKISHOR JAHAGIRDAR 2BL12CS041
2. Contents
Abstract
Introduction
Literature Survey
Objectives
Problem Statement
Methodology
Proposed System
System Design
System Initialization
Image Acquisition, Processing and Analysis
Comparison of Operators
Results
Conclusion and Future Work
References
Acknowledgement
3. Abstract
The project is to develop an autonomous navigation system along with
mapping of the path.
The robot senses the edges of objects in its path and moves without
colliding with them. The application is equipped with a camera as its
main component, which captures images that are transmitted to a
workstation through a wireless antenna.
The processing of the image is done on a workstation or computer using
MATLAB-2013a. An IR ranging device senses any objects ahead of the
robot, and the robot changes its direction accordingly to avoid collision.
Thus, even under circumstances that lead to errors in the output of the
image-processing algorithm, a decision can be made using the input
from the IR sensors.
4. Introduction
Line followers and path followers are a class of robots that navigate
autonomously by following a given path or line. An autonomous robot is an
intelligent machine that performs desired tasks with a high degree of
accuracy.
Some of the characteristics of an autonomous navigation robot are:
• Moving along the calculated path.
• Computing safe local paths.
In this project, the robotic hardware used is commercially known as
FIREBIRD V, an omni-directional robotic platform shown in Fig. 1.
It has three omni-directional wheels placed at 120º with respect to
each other. FIREBIRD V supports the ATmega2560 microcontroller through
adapter boards.
Fig. 1
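The 120º omni-wheel arrangement means a desired body velocity must be decomposed into three wheel speeds. A minimal sketch of that standard three-wheel omni kinematic mapping (an illustration of the general model, not taken from the project code):

```python
import math

# Wheel drive directions for a three-wheel omni platform with wheels
# mounted at 0, 120 and 240 degrees around the robot centre.
WHEEL_ANGLES = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]

def wheel_speeds(vx, vy, omega, radius=0.1):
    """Map body velocity (vx, vy) and rotation rate omega to the
    linear speed each wheel must provide (standard omni kinematics)."""
    return [-math.sin(a) * vx + math.cos(a) * vy + radius * omega
            for a in WHEEL_ANGLES]
```

Pure rotation (vx = vy = 0) yields equal wheel speeds, as expected for a symmetric platform.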
5. Literature Survey
This project involves the use of computer vision in the process of Autonomous
Rendezvous [1] and Capture (AR&C) between vehicles for space applications.
Qin Zhang et al. [2] note that a vision system is one of the most powerful and popular
sensing methods used for autonomous navigation.
AnandKrishnan N. et al. [3] extract image features by measuring adjacent pixels in each
of the four directions over each window; these features are used to detect ‘L’ junctions
in the image.
A good description of the various edge detection techniques can be found in [4].
Single image based navigation can be found in [5].
6. Objectives
The main objectives of this project are:
1) To develop an algorithm which can be incorporated on an
omni-directional robotic platform to sense the edges of objects in
the path.
2) To move the robot without colliding with objects.
3) To process the images on a workstation using MATLAB-2013a and
decide the path accordingly.
7. Problem Statement
The Indian Space Research Organisation (ISRO), although well versed in the
dynamics of space, does not presently engage in developing simultaneous
mapping and navigation for AR&C (Autonomous Rendezvous and
Capture) capabilities.
Though line followers are essential for a class of robotic systems, they have
limitations:
I) Drift and sharp turns can make the robot lose the track, and it
cannot recover.
II) The robot can follow a turn only if the angle of the line is small,
so there is a possibility of losing the track if the angle is too high.
8. Methodology to design and
Implement
The images are captured and sent to the workstation wirelessly for
processing.
The decision is taken at the workstation and commands are sent to the
robot using the Wi-Fi module.
Based on the received commands and the response from the IR sensors,
the microcontroller controls the direction of movement.
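The combination described above — a workstation command plus an IR-sensor override on the microcontroller — can be sketched as plain decision logic (the function and argument names are hypothetical, for illustration only):

```python
def choose_move(workstation_cmd, ir_blocked):
    """Pick the robot's next move.

    workstation_cmd : direction decided by the image-processing side
                      ('forward', 'backward', 'left', 'right').
    ir_blocked      : True when the IR ranging sensor reports an
                      obstacle directly ahead.

    The IR reading overrides the vision command, so a collision is
    avoided even if the image-processing result is wrong.
    """
    if ir_blocked and workstation_cmd == 'forward':
        return 'left'   # fallback: steer around the obstacle
    return workstation_cmd
```

The override only triggers on a blocked forward move; lateral commands pass through unchanged.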
9. Proposed System
Fig. 3 shows the system used during
the experimentation; the external
workstation (a PC here) is used to process
the images and provide decision
commands.
Fig. 3
10. System Design
In the proposed project, the robot adopted is commercially known as
“FIREBIRD V”; it has a mounted camera with an AV (Audio-Video)
transmitter which transmits the images to the AV receiver.
The output of the receiver is connected to a TV tuner card, which in turn
is connected to the computer. Fig. 2 shows the block diagram of the
proposed project.
12. System Initialization
This involves initializing the model and all its individual components. The
system initialization involves the following steps:
• Initializing the COM Port.
• Creating and testing Wi-Fi Module.
• Initializing Wi-Fi wireless Module.
• Initializing the Camera for video previewing.
• Initializing MATLAB for real time frame capturing.
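The order of the steps above matters — the COM port and Wi-Fi module must be up before the camera starts streaming frames. A hedged sketch of running such a sequence with fail-fast behaviour (the step callables here are hypothetical placeholders, not the project's MATLAB calls):

```python
def initialize_system(steps):
    """Run initialization steps in order; stop at the first failure.

    steps : list of (name, callable) pairs, e.g. COM port, Wi-Fi
            module, camera, frame capture. Each callable returns
            True on success. Returns the names of completed steps.
    """
    done = []
    for name, step in steps:
        if not step():
            raise RuntimeError("initialization failed at: " + name)
        done.append(name)
    return done
```

A failed step aborts the remaining ones, so the camera is never started on a dead link.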
13. Image Acquisition
The image acquisition process involves acquiring the image/video [2] using
the camera and transferring it wirelessly to the base station using the AV
transmitter and receiver.
The AV transmitter is mounted on the robot and the receiver is connected
to the base station.
• The robot captures the image/video [4].
• Data is transmitted serially using the AV transmitter.
• Data is received wirelessly by the AV receiver.
• Data is sent to the workstation (a PC here).
14. Image Processing & Image analysis
Image Processing
The captured image is then processed to obtain the objects in the image.
Various algorithms are used to restore, enhance and segment the image.
Thresholding and image segmentation are carried out to retain the required
region and remove unwanted regions from the image.
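Thresholding of the kind described above can be illustrated in a few lines — here a simple global threshold on a grayscale image held as a nested list (a sketch of the general technique, not the project's MATLAB code):

```python
def threshold(image, level):
    """Binarize a grayscale image: pixels brighter than `level`
    become 1 (kept region), the rest become 0 (discarded)."""
    return [[1 if px > level else 0 for px in row] for row in image]
```

The retained region is everything above the chosen level; segmentation then operates on the resulting binary mask.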
Image Analysis
After image segmentation, the edge detection algorithm extracts the
edges; here the Prewitt edge detection algorithm is applied.
Obstacles are detected along the path and an appropriate safe local path
(forward, backward, right or left) is sent through the Wi-Fi module.
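The Prewitt step amounts to convolving the image with two 3×3 kernels and combining the responses. A minimal pure-Python sketch (the project uses MATLAB's Prewitt edge detection; this reimplements the idea for illustration):

```python
# Prewitt kernels: horizontal and vertical gradient estimates.
PREWITT_X = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]
PREWITT_Y = [[-1, -1, -1], [0, 0, 0], [1, 1, 1]]

def prewitt_magnitude(image):
    """Gradient magnitude |Gx| + |Gy| at each interior pixel of a
    grayscale image stored as a nested list; borders stay 0."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(PREWITT_X[j][i] * image[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(PREWITT_Y[j][i] * image[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = abs(gx) + abs(gy)
    return out
```

On a vertical step edge (columns 0,0,1,1), every interior pixel responds with |Gx| + |Gy| = 3, marking the edge location.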
15. Comparison of Image Processing
Operator
Fig 4.1 Original Fig 4.2 Canny only Fig 4.3 Sobel only Fig 4.4 Prewitt Only
With morphological structuring:
Fig 4.5 Canny Fig 4.6 Sobel Fig 4.7 Prewitt
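The operators compared in Figs. 4.1–4.7 differ mainly in their kernels: Prewitt weights all three rows equally, Sobel weights the centre row by 2, and Canny adds Gaussian smoothing plus hysteresis thresholding on top of a gradient. The kernel difference can be seen directly (a sketch for illustration; the actual figures were produced in MATLAB):

```python
# Horizontal-gradient kernels for the two simplest operators.
PREWITT_X = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]
SOBEL_X   = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]

def response(kernel, patch):
    """Apply a 3x3 kernel to a 3x3 image patch (correlation)."""
    return sum(kernel[j][i] * patch[j][i]
               for j in range(3) for i in range(3))
```

On the same vertical step patch, Sobel's extra centre-row weight gives a stronger response (4 versus Prewitt's 3), which slightly improves noise behaviour along the edge direction.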
16. Results Of Image Processing
Algorithm
In the proposed project, the image captured from the camera mounted on
the robot is sent to the workstation (a PC here) through the sender Wi-Fi
module, received on the receiving Wi-Fi module connected to the
computer, and processed in MATLAB by applying an edge detection [3]
algorithm (here, the Prewitt edge detection algorithm).
The original image is shown in Fig. 4 and the result obtained after
edge detection is shown in Fig. 5.
18. Conclusion & Future work
The project “Simultaneous Mapping and Navigation for Rendezvous in
Space Applications” is being carried out at ISAC, ISRO, Bangalore. The
project uses robotic hardware commercially known as FIREBIRD V. The
application uses a camera as its main component, mounted on top of
the robot. The processing of the image is done on a computer using
MATLAB-2013a, and the decision for the robot is taken according to the
algorithm.
Future work involves testing the same images using the OpenCV (Open
Source Computer Vision) library for better performance and testing the IR
ranging device, which senses any objects ahead so that the robot changes
its direction to avoid any collision.
19. References
1) Buzz Aldrin, “Line-of-sight guidance techniques for manned orbital rendezvous.” PhD diss.,
Massachusetts Institute of Technology, 1963.
2) Qin Zhang, “Single image-based path planning for a spherical robot,” in 5th IEEE Conference on
Industrial Electronics and Applications (ICIEA), 15-17 June 2010, pp. 1879–1884,
DOI: 10.1109/ICIEA.2010.5515441.
3) AnandKrishnan N. and S. Santosh Baboo, “An Evaluation of Popular Edge Detection Techniques in
Digital Image Processing,” in IEEE International Conference on Intelligent Computing Applications
(ICICA), pp. 213-217, 2014.
4) Alan J. Lipton, Hironobu Fujiyoshi and Raju S. Patil, “Moving Target Classification and Tracking
from Real-time Video,” in Proceedings of the Fourth IEEE Workshop on Applications of Computer
Vision, pp. 8-14, 1998.
5) Aziz N. et al. “Real-Time Tracking using Edge and Color Feature.” in IEEE International Conference
on Computer and Communication Engineering (ICCCE), pp. 247-250, 2014.
20. Acknowledgement
I acknowledge the ISAC (Indian Space Research Organization Satellite
Centre, Bangalore), for providing the excellent opportunity to use their
hardware kit and the required support to carry out this work.