Hazard Detection Algorithm for Safe Autonomous Landing

Xander Masotto, Computer Science, masotto2@illinois.edu
Narender Gupta, Computer Science, ngupta18@illinois.edu
Ayush Jain, Computer Science, ajain42@illinois.edu
Aliya Burkit, Aerospace Engineering, burkit1@illinois.edu
Abstract
This paper describes a novel and reliable algorithm for detecting safe landing zones for a planetary
lander. The algorithm uses a computationally efficient method to extract critical terrain features
from the elevation map, and it improves the accuracy of its solution by refining the rock height
features against the high-resolution image using shadow analysis. Furthermore, the algorithm
employs supervised learning with boosted decision trees, trained on 10% of the pixels using the
true solutions of the safe landing zones.
1 Introduction
The success of planetary missions often hinges on the ability to land a space probe safely in
a moderately hazardous area. Scientifically interesting areas on these planets are often
hazardous to land in. Additionally, the process of selecting a safe landing site during the
descent has to be fully automated. This is primarily because the communication lag between
the space probe and its handler on Earth does not allow real-time control.
Different types of sensors (SONAR/LIDAR) are often deployed on these space probes to
provide a low-resolution digital elevation map (DEM) of the area of interest. Additionally,
during descent, the space probe takes a higher resolution descent image (DI) of the terrain.
We leverage these separate modalities of information to develop a computationally efficient
algorithm that is able to predict safe landing zones with high accuracy.
1.1 Problem Definition
The objective of the algorithm is to detect safe landing zones for a planetary lander. The algorithm
uses a 1000x1000 pixel portable gray map (PGM) image of the terrain with a resolution of
0.1 m/pixel and a 500x500 digital elevation map (DEM) of the terrain with a resolution of
0.2 m/pixel. The DEM is perfectly geo-registered with the terrain image. Each terrain
features rocks, craters (which account for roughness), and slopes. The algorithm must analyze
four terrain types of increasing roughness and slope. Moreover, the
true solutions for the given terrain images were provided.
The planetary lander has the following specifications:
1) 3.4 m diameter base plate
2) four 0.5 m diameter footpads at 90 degree intervals on the outer edge of the base plate
3) 0.39 m height between the bottom of the footpad and the bottom of the base plate.
The criteria for a safe landing zone are:
1) the distance of the terrain above the plane defined by each triplet of lander footpads, for all
possible orientations about the vertical, is less than 0.39 m;
2) the maximum angle of the plane defined by each triplet of lander footpads, for all orientations
about the vertical, is less than 10 degrees.
(A minimal check of these criteria is sketched after Figure 1.)
A visual interpretation of safe landing criteria is illustrated in Figure 1.
Figure 1: Criteria for safe landing zones for a planetary lander.
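To make these criteria concrete, the sketch below (Python with NumPy) checks a single lander placement at one orientation. All function and variable names are ours, the footpad and under-lander terrain points are assumed to be already sampled from the DEM, and this is an illustration of the two constraints only, not the algorithm of Section 2.

```python
import numpy as np

CLEARANCE_M = 0.39    # allowed terrain rise above the footpad plane (criterion 1)
MAX_TILT_DEG = 10.0   # allowed footpad-plane angle (criterion 2)

def plane_through(p0, p1, p2):
    """Upward unit normal n and offset d (n . p = d) of the plane through three points."""
    n = np.cross(p1 - p0, p2 - p0)
    n = n / np.linalg.norm(n)
    if n[2] < 0:
        n = -n
    return n, float(np.dot(n, p0))

def placement_is_safe(pad_xyz, terrain_xyz):
    """pad_xyz: (4, 3) footpad contact points; terrain_xyz: (k, 3) points under the lander."""
    for i, j, m in [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]:
        n, d = plane_through(pad_xyz[i], pad_xyz[j], pad_xyz[m])
        tilt_deg = np.degrees(np.arccos(n[2]))   # angle of the footpad plane to horizontal
        rise = terrain_xyz @ n - d               # distance above the plane (~vertical for small tilts)
        if tilt_deg >= MAX_TILT_DEG or rise.max() >= CLEARANCE_M:
            return False
    return True
```

A site is considered safe only if such a check holds for every sampled orientation about the vertical.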
1.2 Approach
There exist numerous approaches to image-based hazard detection for safe landing. These
methods include, but are not limited to, K-means clustering [1], slope evaluation from
gray-level images [2], and homography-based slope estimation [3]. The aforementioned
methods use only the gray images, without knowledge of the elevation map, so they process
the image alone to obtain the safe landing zone solutions.
As noted in [4], safe landing on Mars was studied using image-based techniques by the Mars
Program. Their approach was simple: split the image into tiles and compute the variance in
each tile, assuming that low variance corresponds to a safe area. Although this approach is
not quantitative and somewhat primitive, it is extremely fast. Some approaches also try to
detect explicit features on the planetary surface; because of their computational cost, these
methods are designed to run offline. Segmentation and texture methods are useful in this
class of algorithms: texture analysis has been proposed in [6], and using shadows to detect
rocks has been implemented in [5].
Integrating data modalities from multiple sensors has also received attention recently. In [7],
the authors present multi-sensor decision fusion methods to handle heterogeneous sensor
inputs. Likewise, a fuzzy logic methodology for fusing sensed data has been proposed in [8].
Our approach takes advantage of the available elevation map and processes it to determine the
critical features of the terrain and to identify hazardous regions. After an initial step of
computing slope and height features, the algorithm uses the 1000x1000 .pgm image to
improve the accuracy of the rock heights using shadow information. Once all necessary
terrain features have been collected, the final step is to use the known true solutions of the
safe landing zones to train the algorithm and increase the confidence of the predicted safe
zones.
2 Algorithm
2.1 Algorithm overview
A flow chart of the algorithm structure is provided in Figure 2. The algorithm primarily consists of
three steps – (1) a preprocessing stage that resizes the elevation map to match the information in
the higher resolution descent image, (2) a feature extraction stage that extracts meaningful features
from the DEM and DI that are able to model the constraints in the problem, and (3) a prediction
stage in which we train a classifier on these features to predict the safety of a particular site.
We next describe each of these stages in detail.
Figure 2: The structure of the algorithm.
2.2 Preprocessing
A. Resizing Elevation Map: The low-resolution elevation map is first upsampled to
1000x1000 using bi-cubic interpolation. Since the boulders are close to spherical in
shape, bilinear interpolation fails to estimate their heights accurately. Various
interpolation techniques were tried out and bi-cubic interpolation was found to work
best.
B. Peak Correction: Since the elevation map is lower in resolution than the image,
there are cases where the height map misses the peaks of boulders because of
aliasing errors. Accurate detection of peaks is important in order to satisfy constraint
(1) (see Section 1.1). To recover the heights and locations of these peaks, we
incorporate information from the image. Specifically, we detect the extremities of a
boulder’s shadow using gradient images; the point of the shadow corresponding to
the boulder’s peak is identified. Given the sun’s elevation, the shadow extent is then
used to calculate the boulder’s height through simple trigonometry. We store these
image-based height estimates alongside the existing DEM-based height estimates.
A brief sketch of both preprocessing steps is given below.
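The following is a minimal sketch of the two preprocessing steps, assuming NumPy/SciPy (the paper does not name its toolchain). The pixel size comes from the problem definition; the shadow-extremity detection via gradient images is omitted, and only the trigonometric height estimate is shown.

```python
import numpy as np
from scipy.ndimage import zoom

def upsample_dem(dem_500):
    """Step A: upsample the 500x500 DEM (0.2 m/px) to 1000x1000 (0.1 m/px).
    order=3 is bi-cubic; order=1 (bilinear) tends to clip the rounded boulder peaks."""
    return zoom(dem_500, 2, order=3)

def boulder_height_from_shadow(shadow_length_px, sun_elevation_deg, pixel_m=0.1):
    """Step B: the shadow tip corresponds to the boulder's peak, so with the sun at
    elevation angle e the peak height is shadow_length * tan(e)."""
    return shadow_length_px * pixel_m * np.tan(np.radians(sun_elevation_deg))

# Example: a 20 px (2.0 m) shadow with the sun 30 degrees above the horizon
# implies a peak roughly 2.0 * tan(30 deg) ~ 1.15 m high.
```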
2.3 Feature Extraction
We extract multiple informative features from the resized elevation map and the image. The
details are provided below:
A. Angle Calculation: For each lander location, there are several possible
orientations. Given a particular location and orientation, we calculate the plane
passing through the lander footpads. The slope of this plane is obtained from the
angle between its normal and the vertical. For each location, the maximum slope
(across all orientations) is taken as a feature. Since there are four footpads, a single
plane may not pass through all four landing points; to handle such cases, we take
the slopes of all possible planes passing through any three of the lander footpads.
(Sketches of the feature computations in (A)-(C) are given at the end of this
subsection.)
B. Overlap Detection: For the area occupied by the lander, we calculate the difference
between the peak heights and the underlying footpad plane (as computed for the
angle calculation). This difference captures constraint (1), including craters on
inclined surfaces, and is therefore used as a feature. Since we have two estimates
of the peak height (DEM-based and image-based), we end up with two versions of
this feature.
C. Surface Features: We extract several surface features for each point: surface angle,
surface roughness, and lander pad roughness. We generate 5 surface angle features
by computing a 5-level Gaussian pyramid of the DEM and computing the angle of
the tangent plane at each point. Surface roughness is calculated as the second
derivative of the DEM. Lander pad roughness takes the maximum roughness value
corresponding to the position of the lander pads.
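As one possible reading of features (A) and (B), the sketch below computes, for a single lander position, the worst footpad-plane tilt over sampled orientations and the corresponding overlap (maximum peak rise above that plane). The number of sampled orientations, the footpad radius in pixels, the border handling, and all names are our own assumptions, not values from the paper.

```python
import numpy as np

PIXEL_M = 0.1
PAD_RADIUS_PX = 17   # footpads on the edge of the 3.4 m base plate (~1.7 m from centre)

def angle_and_overlap(dem, peaks, cy, cx, n_orient=16):
    """Worst footpad-plane tilt (deg) and worst peak rise above that plane (m) at
    pixel (cy, cx). `dem` and `peaks` are 1000x1000 height maps in metres.
    Assumes an interior pixel; border handling is omitted."""
    worst_tilt, worst_rise = 0.0, -np.inf
    yy, xx = np.mgrid[-PAD_RADIUS_PX:PAD_RADIUS_PX + 1,
                      -PAD_RADIUS_PX:PAD_RADIUS_PX + 1]
    inside = yy**2 + xx**2 <= PAD_RADIUS_PX**2            # lander footprint mask
    foot_y, foot_x = cy + yy[inside], cx + xx[inside]
    foot_pts = np.column_stack([foot_x * PIXEL_M, foot_y * PIXEL_M,
                                peaks[foot_y, foot_x]])
    for k in range(n_orient):                             # 90 deg covers all cases by symmetry
        theta = k * (np.pi / 2) / n_orient
        ang = theta + np.arange(4) * np.pi / 2            # four pads at 90-degree intervals
        px = cx + PAD_RADIUS_PX * np.cos(ang)
        py = cy + PAD_RADIUS_PX * np.sin(ang)
        pz = dem[np.round(py).astype(int), np.round(px).astype(int)]
        pads = np.column_stack([px * PIXEL_M, py * PIXEL_M, pz])
        for i, j, m in [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]:
            n = np.cross(pads[j] - pads[i], pads[m] - pads[i])
            n /= np.linalg.norm(n)
            n = n if n[2] > 0 else -n
            worst_tilt = max(worst_tilt, np.degrees(np.arccos(n[2])))
            worst_rise = max(worst_rise, float((foot_pts @ n - n @ pads[i]).max()))
    return worst_tilt, worst_rise
```

Running this twice, once with DEM-based peaks and once with the shadow-based peak estimates, yields the two versions of the overlap feature described in (B).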
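Feature (C) can also be read in several ways. The sketch below is one plausible interpretation with our own parameter choices (Gaussian sigma, the Laplacian as the "second derivative", and a ~3.4 m maximum filter for the lander-pad roughness); it produces five pyramid-scale surface-angle maps plus the two roughness maps.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace, maximum_filter, zoom

def surface_features(dem, pixel_m=0.1, levels=5):
    """Per-pixel stack: 5 pyramid-scale surface angles, roughness, lander-pad roughness."""
    angle_maps, level_dem, level_px = [], dem, pixel_m
    for _ in range(levels):
        gy, gx = np.gradient(level_dem, level_px)
        angle = np.degrees(np.arctan(np.hypot(gx, gy)))   # tangent-plane angle
        # bring every level back to full resolution so each pixel has one value per scale
        angle_maps.append(zoom(angle, dem.shape[0] / angle.shape[0], order=1))
        level_dem = gaussian_filter(level_dem, sigma=1.0)[::2, ::2]
        level_px *= 2
    roughness = np.abs(laplace(dem))                      # second derivative of the DEM
    pad_roughness = maximum_filter(roughness, size=35)    # ~3.4 m footprint at 0.1 m/px
    return np.stack(angle_maps + [roughness, pad_roughness], axis=-1)
```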
2.4 Prediction
We use a decision tree classifier boosted with AdaBoost, because the constraints defined in
Section 1 are essentially decision rules given our choice of features. We use 5 weak learners
with a learning rate of 0.1. Since only four images are provided for both training and testing,
we train on sub-images: from each image, a contiguous sub-image comprising 10% of the
pixels is used for training. We note that the alternative of sampling 10% of the points
randomly from the image is susceptible to over-fitting, as the safety of a site is spatially
correlated with the safety of nearby sites. Additionally, in line with the evaluation metrics,
false positives were penalized more heavily than false negatives during training.
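A sketch of this training stage using scikit-learn follows; the paper does not name its toolchain, and the tree depth, the choice of the top-left corner for the contiguous sub-image, and the exact false-positive penalty are our own illustrative assumptions. Only the number of learners (5) and the learning rate (0.1) come from the text above.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

def train_safety_classifier(features, labels, fp_penalty=5.0):
    """features: (H, W, F) per-pixel features; labels: (H, W) with 1 = safe.

    Trains 5 boosted decision trees (learning rate 0.1) on a contiguous sub-image
    holding ~10% of the pixels; truly unsafe pixels get a larger sample weight so
    that labelling them safe (a false positive) costs more during training.
    """
    H, W, F = features.shape
    h, w = int(np.sqrt(0.1) * H), int(np.sqrt(0.1) * W)   # contiguous ~10% block
    X = features[:h, :w].reshape(-1, F)                    # top-left corner (our choice)
    y = labels[:h, :w].ravel()
    clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=3),
                             n_estimators=5, learning_rate=0.1)
    clf.fit(X, y, sample_weight=np.where(y == 0, fp_penalty, 1.0))
    return clf

# Prediction over the full map:
# safe_map = clf.predict(features.reshape(-1, F)).reshape(H, W)
```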
3 Results
The following figures and tables show the predicted safe landing zones and error rates for
four terrain types of increasing complexity. Safe landing zones are shown in white; false
positives (pixels identified as safe that are unsafe in the true solution) are shown in blue; and
false negatives (pixels identified as unsafe that are safe in the true solution) are shown in
dark red. The false positive rate should be as low as possible, since these pixels carry higher
penalties in the evaluation and significantly increase the chance of mission failure.
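For reference, the accuracy, precision, and recall reported in Tables 1 through 4 appear to be the standard definitions with "safe" as the positive class (consistent with the false positive definition above). A minimal sketch of how such numbers can be computed from a predicted map and a true-solution map:

```python
import numpy as np

def evaluate(pred_safe, true_safe):
    """Both inputs are 0/1 or boolean maps of the same shape, 1 = safe."""
    pred, truth = np.asarray(pred_safe, bool), np.asarray(true_safe, bool)
    tp = np.sum(pred & truth)        # predicted safe, truly safe
    fp = np.sum(pred & ~truth)       # predicted safe, truly unsafe (blue, critical)
    fn = np.sum(~pred & truth)       # predicted unsafe, truly safe (dark red)
    tn = np.sum(~pred & ~truth)
    return {"accuracy": (tp + tn) / pred.size,
            "precision": tp / (tp + fp),
            "recall": tp / (tp + fn)}
```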
3.1 Terrain 1
This terrain has only minor surface roughness. There are no sloped regions or craters – only
small boulders.
Table 1: Terrain 1 algorithm solution.
Accuracy 98.9 %
Precision 99.7 %
Recall 98.8 %
Run time 23.6 s
Figure 3: Terrain 1 visual solution and error rate.
3.2 Terrain 2
In addition to the boulders in Terrain 1, this terrain also has an overall slope.
Table 2: Terrain 2 algorithm solution.
Accuracy 97.7 %
Precision 98.8 %
Recall 90.5 %
Run time 18.1 s
Figure 4: Terrain 2 visual solution and error rate.
3.3 Terrain 3
This terrain has boulders, slopes and craters.
Table 3: Terrain 3 algorithm solution.
Accuracy 97.9 %
Precision 98.7 %
Recall 84.3 %
Run time 16.8 s
Figure 5: Terrain 3 visual solution and error rate.
3.4 Terrain 4
Terrain 4 is the most challenging terrain – with increased roughness, higher slopes and
bigger craters.
Table 4: Terrain 4 algorithm solution.
Accuracy 97.4 %
Precision 97.4 %
Recall 72.8 %
Run time 19.2 s
Figure 6: Terrain 4 visual solution and error rate.
3.5 Unseen terrains
In addition to the terrain images provided with true solutions, the algorithm was tested on
previously unseen images. The four images represent the same four terrain types as those for
which the true solutions were provided. The results for the unseen terrains are shown in
Figure 7; terrain roughness and slope increase from left to right.
Figure 7: Visual solutions of the unseen terrains.
We see that the results on the unseen images are consistent with the results on the four
original images. This means that the algorithm generalizes well to terrains similar to those it
has been trained on. Moreover, we note that training the algorithm on only 10% of the pixels,
taken as a contiguous sub-image, yields accuracy similar to the prediction accuracies for the
known Terrains 1-4; in other words, 10% of the pixels is a well-chosen training fraction for
this type of problem. The results on unseen data demonstrate that the proposed algorithm can
predict safe landing zones for unknown terrains, which makes it a good candidate for an
actual mission, where the landing zones are not known beforehand.
4 Conclusions
As shown in Section 3, our proposed algorithm is an efficient and accurate method for
detecting hazards and predicting safe landing zones for a planetary lander. One of the critical
issues in autonomous safe landing is the speed and memory use of the algorithm. Our
algorithm takes less than 25 seconds to output a solution once it has been trained. The
training stage requires more time and memory; however, it can be performed prior to the
landing stage.
The accuracy of the algorithm is close to 98% for all terrain types, which we consider a
highly accurate result, since no significant hazardous portions of the terrain are identified as
safe. As can be seen in Figures 3-6, the incorrectly identified landing sites generally lie on
the border of truly unsafe regions, so the lander would be expected to avoid such zones
anyway.
A few assumptions were made prior to developing the algorithm. The first is that the
elevation map is trustworthy. In an actual mission, the elevation map would be produced by
onboard sensors, which can introduce uncertainty and errors into the elevation values.
The second assumption is that the true solutions are provided along with the terrain data. In
an actual mission, the true solution might not be known if the planetary body has not been
explored significantly prior to the mission. However, for a planetary landing mission it is
desirable that the body be surveyed in enough detail, with the help of satellites, before
sending a high-cost lander. Overall, the assumptions made during algorithm development are
applicable to real missions.
Acknowledgments
The research described in this report was performed as a part of the 2015 NASA Jet
Propulsion Laboratory Team Space Design Competition. We thank the competition
organizers for the data and eventual recognition of our algorithm.
References
[1] Bajracharya, M. (2002) Single image based hazard detection for a planetary lander. Proceedings
of the 5th Biannual World Automation Congress, pp. 585-590.
[2] Strandmore, T. J-M. & Trinh, S. (1999) Toward a vision based autonomous planetary lander. Proc.
AIAA GN&C conf. Paper #AIAA-99-4154.
[3] Huertas, A., Cheng, Y. & Madison, R. (2006) Passive imaging based multi-cue hazard detection for
spacecraft safe landing. IEEE Aerospace Conf.
[4] Halbrook, Timothy D., Jim D. Chapel, and Joseph J. Witte. "Derivation of hazard sensing and
avoidance maneuver requirements for planetary landers." In Guidance and control 2001, pp. 347-364.
2001.
[5] Gulick, V. C., R. L. Morris, M. Ruzon, and T. L. Roush. "Autonomous science analyses of digital
images for Mars sample return and beyond." (1999).
[6] Castano, Rebecca, Tobias Mann, and Eric Mjolsness. "Texture analysis for Mars rover images."
In SPIE's International Symposium on Optical Science, Engineering, and Instrumentation, pp. 162-
173. International Society for Optics and Photonics, 1999.
[7] Seraji, Homayoun, and Navid Serrano. "A Multi-sensor Decision fusion system for terrain safety
assessment." Robotics, IEEE Transactions on 25, no. 1 (2009): 99-108.
[8] Howard, Ayanna, and Homayoun Seraji. "Multi-sensor terrain classification for safe spacecraft
landing." Aerospace and Electronic Systems, IEEE Transactions on 40, no. 4 (2004): 1122-1131.
