An Efficient K-Nearest Neighbors Based Approach for Classifying Land Cover Regions in Hyperspectral Data via Non-Linear Dimensionality Reduction

ACEEE International Journal on Signal and Image Processing, Vol. 1, No. 2, July 2010

1 K. Perumal and 2 Dr. R. Bhaskaran
1 Department of Computer Science, DDE, Madurai Kamaraj University, Madurai-625021, Tamilnadu, India. Email: perumalmku@yahoo.co.in
2 School of Mathematics, Madurai Kamaraj University, Madurai-625021, Tamilnadu, India. Email: raman.bhaskaran@gmail.com

Abstract—In recent times, researchers in the remote sensing community have been greatly interested in utilizing hyperspectral data for in-depth analysis of the Earth's surface. In general, hyperspectral imaging produces high dimensional data, which creates a pressing need for efficient approaches that can process these high dimensional data effectively. In this paper, we present an efficient approach for the analysis of hyperspectral data that combines non-linear manifold learning and the k-nearest neighbor (k-NN) classifier. Instead of dealing with the high dimensional feature space directly, the proposed approach employs non-linear manifold learning, which determines a low-dimensional embedding of the original high dimensional data by computing the geometric distances between the samples. Initially, the dimensionality of the hyperspectral data is reduced to a pairwise distance matrix by means of Johnson's shortest path algorithm and multidimensional scaling (MDS). Subsequently, the land cover regions in the hyperspectral data are classified on the basis of the k-nearest neighbors. The proposed k-NN based approach is evaluated using hyperspectral data collected by NASA's (National Aeronautics and Space Administration) AVIRIS (Airborne Visible/Infrared Imaging Spectrometer) over the Kennedy Space Center, Florida. The classification accuracies of the proposed k-NN based approach demonstrate its effectiveness in land cover classification of hyperspectral data.

Index Terms—Remote Sensing, Hyperspectral, Non-linear Dimensionality Reduction (NLDR), Mahalanobis distance, Johnson's shortest path algorithm, Multidimensional Scaling (MDS), AVIRIS (Airborne Visible/Infrared Imaging Spectrometer), k-nearest neighbor (k-NN).

I. INTRODUCTION

Data collected from remote sensing serve as a dominant source of information on vegetation parameters that are required in all sorts of models describing the processes at the Earth's surface [1]. In recent times, hyperspectral data have added even more power by presenting spectral information about ground scenes over an enormous number of channels with narrow, contiguous spectral bands. Hence, hyperspectral data can be used to achieve better discrimination of the spectral signatures of land-cover classes that appear alike when viewed by traditional multispectral sensors [2]. Hyperspectral data are collected by sensors that sample the solar radiation reflected from the Earth's surface in the portion of the spectrum extending from the visible region through the near-infrared and mid-infrared (wavelengths between 0.3 and 2.5 μm) in hundreds of narrow (on the order of 10 nm) contiguous bands [4]. These instruments record spectral signatures in much greater detail than traditional multispectral sensors and thus can potentially offer improved discrimination of targets. This high spectral resolution yields enormous amounts of data, placing stringent requirements on communications, storage, and processing [3]. For instance, the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) amasses a 512 (along track) × 614 (across track) × 224 (bands) × 12 (bits) data cube in 43 s, corresponding to more than 700 Mb; Hyperion collects 4 Mb in 3 s, corresponding to 366 kB/km² [5]. Thus, the application of hyperspectral images brings in new capabilities, and with them some difficulties in processing and analysis.
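As a quick sanity check on the AVIRIS figure quoted above (our own arithmetic, not taken from [5]), multiplying out the dimensions of one such data cube gives

$$ 512 \times 614 \times 224 \times 12\ \text{bits} \approx 8.45 \times 10^{8}\ \text{bits} \approx 845\ \text{Mb} \approx 106\ \text{MB}, $$

acquired in 43 s, i.e. a sustained data rate of roughly 20 Mb/s, consistent with the "more than 700 Mb" cited in the text.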
The determination of the land cover types corresponding to the spectral signatures in a hyperspectral image is a typical application of hyperspectral data, for instance to examine changes in the ecosystem over large geographic areas [8]. Hyperspectral images, unlike the extensively used multispectral images, can be utilized to differentiate not only distinct categories of land cover, but also the defining components of every land cover category, such as minerals, soil and vegetation type [7]. With all these advantages over multispectral images, and with enormous quantities of hyperspectral data available, extracting reliable and accurate class labels for each pixel from the hyperspectral images is a non-trivial task, involving either expensive field campaigns or time-consuming manual interpretation [8].
Without doubt, the bulky amount of data involved in hyperspectral imagery dramatically increases the processing complexity and time. A unique but principal task in hyperspectral image analysis is therefore to achieve an effective reduction in the amount of data involved, or to select the bands relevant to a specific application from the entire data set [7]. Some important operations that can be carried out with the information contained in hyperspectral data include characterization, identification, and classification of the land covers with improved accuracy and robustness. Nevertheless, a number of decisive problems must be considered in the classification of hyperspectral data, namely the high number of spectral channels, the spatial variability of the spectral signature, the high cost of true sample labeling, and the quality of the data [6].

Achieving high classification accuracy and good generalization in hyperspectral image analysis remains a difficult problem because of the high dimension of the input space, particularly when the number of classes is large. Moreover, the high dimensionality of the data is challenging for supervised statistical classification techniques that rely on an estimated covariance matrix, since the number of labeled samples is characteristically small relative to the dimension of the data [9]. Earlier studies of supervised methods have revealed that a complex classifier is likely to overtrain in such situations, whereas a weak classifier is often insufficient [10]. With increasing classifier complexity, the generalization error eventually increases because of over-training [11]. This problem can be alleviated using ensemble methods, which work by reducing the model variance. Complex classifiers, in addition, do not characteristically perform well when the characteristics of the training/test data obtained over the study site evolve in a new area. This condition is referred to as the knowledge transfer problem [12]. In land cover classification, the knowledge transfer problem is significant, as it is frequently difficult to acquire labeled samples from a new area. Changes in spectral signatures can be caused by seasonal changes, unknown land cover types, or a different mixture of classes. Hence, it is essential to devise a simple classifier that can adapt to such changes and maintain good classification accuracies on the training/testing data. A number of existing classifiers build their models in accordance with the behavior of labeled samples in the reduced or original feature space.

On the contrary, non-linear manifold learning algorithms presume that the original high dimensional data lie on a low dimensional manifold defined by local geometric differences between samples. Current research has illustrated the potential of manifold learning algorithms for non-linear dimension reduction and for representation of high dimensional observations through non-linear mapping [13]. Good examples of manifold learning techniques include Isometric feature mapping (Isomap) [16], Local Linear Embedding (LLE) [15], Laplacian Eigenmaps, and Semidefinite Embedding. Even though these methods were devised to represent high dimensional non-linear phenomena in lower dimensional spaces, the embedded features are eminently helpful for the classification of hyperspectral data. Lately, Bachmann et al. [13] and Chen et al. [14] have successfully applied the Isomap method to hyperspectral data. Yet the development of a more robust classifier that exploits the advantages of non-linear dimension reduction while remaining computationally efficient is still an area of active research.

This paper proposes an efficient approach based on k-nearest neighbors for hyperspectral data analysis using shortest path computation and multidimensional scaling (MDS). Since the proposed k-nearest neighbors approach deals with high dimensional data, the primary step is to perform dimensionality reduction of the high dimensional hyperspectral data. First, a novel approach devised for non-linear manifold learning is applied to the high dimensional hyperspectral data, which reduces the dimensionality of the input to a pairwise distance matrix with the aid of Johnson's shortest path algorithm. MDS is then employed to estimate the dimension of the manifold created. Lastly, the k-nearest neighbors obtained are used to classify the land cover regions in the hyperspectral data based on the distance measures computed during dimensionality reduction.

The rest of the paper is organized as follows. Section II presents a brief review of some recent significant research. The proposed non-linear manifold learning approach for dimensionality reduction and the k-nearest neighbors based approach for hyperspectral data classification are described in Section III. Experimental results obtained from hyperspectral data collected at the Kennedy Space Center, together with a formal investigation of the classification accuracies of the proposed approach, are presented in Section IV. Section V sums up the paper with the conclusion.

II. REVIEW OF RELATED SIGNIFICANT RESEARCH

The literature presents plentiful research on the analysis of hyperspectral data. Of these, a significant number of studies make use of manifold learning for classifying hyperspectral data. A handful of significant works related to the proposed approach are presented below.

Hongjun Su et al. [17] have devised a novel algorithm named OBI, based on a traditional algorithm and the fractal dimension, for quicker processing of hyperspectral remote sensing data.
To start with, the fractal dimension was utilized as the criterion to prune the noisy bands, and only those bands with better spatial structure, quality and spectral features were preserved. Subsequently, the correlation coefficients and covariance among all bands were used to compute the optimal band index, followed by the selection of the optimum bands. The OBI algorithm has proved superior to other algorithms for band selection in hyperspectral remote sensing data processing.

An algorithm that employs spectral-angle based Support Vector Clustering (SVC) and Principal Component Analysis (PCA) for hyperspectral image analysis was presented by S. Sindhumol and M. Wilscy [18]. Their previous research on hyperspectral dimensionality reduction based on Principal Component Analysis (PCA) did not take into account the meaning or behavior of the spectrum, and moreover, the results were biased by the majority of the components in the scene. A probable solution to this problem is to perform a spectral angle based classification before dimensionality reduction. In their current research, they have proposed a clustering based on support vectors using spectral kernels, which has produced good results in hyperspectral image classification. The algorithm was tested with two hyperspectral image data sets of 210 bands each, recorded with HYperspectral Digital Imagery Collection Experiment (HYDICE) airborne sensors.

Qian Du and Nicolas H. Younan [19] have examined the application of Fisher's linear discriminant analysis (FLDA) to classifying hyperspectral remote sensing images. The core idea of FLDA is to design an optimal transform so that the classes can be separated well in the low-dimensional space. The difficulties of realistically applying FLDA to hyperspectral images include the unavailability of sufficient training samples and indefinite information about all the classes present. Hence, the original FLDA is altered to avoid the requirement of complete class knowledge, for instance the number of actual classes present. They have also investigated the performance of the class of principal component analysis (PCA) techniques applied before FLDA, and have discovered that the interference and noise adjusted PCA (INAPCA) can improve the final classification.

Claudionor Ribeiro da Silva et al. [20] have presented a method for selecting features from hyperspectral images. The experiments conducted proved the feasibility of using genetic algorithms for dimensionality reduction of hyperspectral remote sensing data to improve the accuracy of digital classification. The elitism-based algorithm attained the best results by far. The genetic algorithm has the added advantage of greater flexibility in the search for an optimal solution, since a spectral band discarded during the evolutionary process can be reintroduced into the optimal solution through reproduction or mutation. Moreover, the genetic algorithm was capable of identifying and discarding noisy bands on the basis of the fitness criterion computed from the correlation, transformed divergence and optimal number of bands.

Qian Du et al. [21] have investigated the application of independent-component analysis (ICA) to hyperspectral remote sensing image classification. They focused on the performance of two renowned and commonly utilized ICA algorithms, joint approximate diagonalization of eigenmatrices (JADE) and FastICA, although their proposed method is also applicable to other ICA algorithms. The chief advantage of utilizing ICA is its capability to perform object classification with unknown spectral signatures in an unknown image scene, i.e., unsupervised classification. Nevertheless, ICA is computationally expensive, which restricts its application to high-dimensional data analysis. To make it applicable, or to reduce the computation time in hyperspectral image classification, a data-preprocessing procedure was employed to achieve dimensionality reduction of the data. The experimental results of their proposed approach illustrated that the chief principal components from the NAPC transform can better preserve the object information in the original data than those from PCA. Consequently, an ICA algorithm could offer better object classification.

Yangchi Chen et al. [22] have examined the concept of L-Isomap and its pros and cons when applied to hyperspectral data. Isomap and L-Isomap were evaluated by conducting experiments on dimensionality reduction and representation of high dimensional observations. Moreover, they studied L-Isomap in conjunction with hyperspectral data classification. Their proposed MST-cut landmark selection approach was judged against random selection and k-means cluster centers.

Berge et al. [23] have proposed a simple algorithm for reducing the complexity of Gaussian ML-based classifiers for hyperspectral data. The core idea of this research is to determine a sparse approximation to the inverse covariances of the component distributions utilized in their classification model. One inspiration for devising this approach was to combat the problems conventional classifiers face because of sample sparsity. The proposed approach reduces the number of parameters when the number of available ground-truthed samples is low, whilst sacrificing a little accuracy in modeling the density. The experiments conducted show that their method performed comparably to or better than state-of-the-art conventional classifiers such as SVM, using only a fraction of the full covariance matrices. The performance compared to covariance regularization strategies, represented in their paper by LOOC, seems more than adequate. Their method also performs well in cases where QDA collapses due to sample sparsity, and for modeling sparse covariances it is also directly applicable to covariance estimates of component distributions in mixture models.
In general, hyperspectral images contain hundreds of bands, leading to covariance matrices with tens of thousands of elements. Of late, the time-series literature has witnessed the use of general linear regression models in the estimation of the inverse covariance matrix. Jensen et al. [24] have adopted and applied those ideas to the problems identified in ill-posed hyperspectral image classification. The experimental results show that at least some of the approaches can give a lower classification error than traditional methods such as linear discriminant analysis (LDA) and regularized discriminant analysis (RDA). Moreover, the results show, in contrast to earlier beliefs, that long-range correlation coefficients are essential to build an effectual hyperspectral classifier, and that the high correlations between neighboring bands appear to permit differing sparsity configurations of the covariance matrix to attain similar classification results.

III. PROPOSED APPROACH FOR LAND COVER CLASSIFICATION OF HYPERSPECTRAL IMAGES USING NON-LINEAR MANIFOLD LEARNING AND K-NEAREST NEIGHBORS

This section details the efficient approach proposed for achieving land cover classification on hyperspectral data. The proposed approach integrates non-linear manifold learning and the concepts of k-NN for effective classification of land cover in hyperspectral data. Generally, effective classification of land cover could be accomplished using the classical k-NN, which classifies novel observations based on the class labels of their k nearest neighbors under some distance measure. But classical k-NN alone does not perform exceptionally well on high dimensional data. As the proposed approach deals with hyperspectral data, it necessitates some add-ons to the classical k-NN classifier to exploit its advantages. Some advantages of the classical k-NN classifier that can be exploited include:
1. It is easy to implement,
2. It gives very good classification accuracy on low dimensional problems,
3. It provides non-linear decision boundaries.
The classical k-NN classifier has achieved very good classification accuracies on low dimensional problems. So, an effective solution proposed for hyperspectral image classification is to combine the concepts of non-linear manifold learning and k-NN. Hence, in the proposed approach, we incorporate non-linear manifold learning as a preprocessing step for hyperspectral image classification. Thereby, the proposed approach for hyperspectral land cover classification is composed of two phases, namely:
• Novel manifold learning approach for NLDR
• Land cover classification using the k-nearest neighbors

A. Proposed Novel Manifold Learning Approach for NLDR

Dimensionality reduction aims at maintaining only the most significant dimensions, i.e. the dimensions that encompass the most valuable information for the task at hand, or at providing a mapping from the high dimensional space to a low dimensional embedding. Manifold learning is a popular approach to NLDR. Generally, the process of estimating a low-dimensional embedding of a high-dimensional space, which underlies the data of interest, is called manifold learning. Manifold learning algorithms are based on the assumption that most data sets have an artificially high dimensionality: although every data point comprises possibly thousands of features, it can be described as a function of only a few underlying parameters. Specifically, a non-linear manifold is an abstract mathematical space that is locally Euclidean (i.e., around every point there is a neighborhood that is topologically the same as described by Euclidean geometry). For any two data points lying on a non-linear manifold, the "true distance" between them is the geodesic distance on the manifold, i.e. the distance along the surface of the manifold, rather than the straight-line Euclidean distance. Researchers have proposed a number of algorithms for manifold learning, including Stochastic Neighbor Embedding (SNE), Isomap, Locally Linear Embedding (LLE), Laplacian Eigenmaps, Semidefinite Embedding, and a host of variants of these algorithms. These are perhaps the finest, and the most widely applied, among the multitude of procedures existing for NLDR. In spite of their common usage and good behavior, it is still possible to improve the classical manifold learning approaches so as to achieve better results in their field of application.

One possible extension to the existing NLDR approaches is to tailor them specifically for handling very large datasets. As the proposed approach for hyperspectral image classification is meant to deal with large datasets, we devise a novel approach for NLDR that aims to discover the significant underlying parameters so as to determine a low-dimensional representation of the data. The proposed approach for non-linear manifold learning is based on shortest path network updating and multidimensional scaling (MDS). The steps involved in the proposed approach for NLDR are the following (a sketch of the first step is given after this list):
• Initial computation of the pairwise distance matrix among the neighborhood 'k' using the Mahalanobis distance.
• Application of Johnson's shortest path algorithm to compute the geodesic distance between all pairs of points in the k-nearest neighbor graph constructed.
• The N×N distance matrix computed is fed for dimensionality reduction to the classical MDS algorithm.
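To make the first step concrete, the following Python sketch builds the Mahalanobis-weighted k-nearest-neighbor graph. This is our own illustration and not the authors' code (the paper reports a MATLAB 7.4 implementation); the function and variable names are our own.

    import numpy as np
    from scipy.spatial.distance import cdist
    from scipy.sparse import lil_matrix

    def mahalanobis_knn_graph(Y, k):
        """Weighted k-NN graph of the samples in Y (n x D), with
        Mahalanobis edge weights d(x, y) = sqrt((x-y)^T S^-1 (x-y))."""
        n = Y.shape[0]
        VI = np.linalg.pinv(np.cov(Y, rowvar=False))   # inverse covariance S^-1
        D = cdist(Y, Y, metric='mahalanobis', VI=VI)   # all pairwise distances
        G = lil_matrix((n, n))
        for i in range(n):
            # keep edges to the k nearest neighbors of point i (skip i itself)
            nbrs = np.argsort(D[i])[1:k + 1]
            G[i, nbrs] = D[i, nbrs]
        return G.tocsr()

The sparse output matters: the efficiency argument for Johnson's algorithm in the next step rests on the graph having only O(nk) edges rather than O(n²).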
Given data points y_1, y_2, ..., y_n ∈ R^D, we assume that the data lie on a d-dimensional manifold M embedded within it, where d < D. Moreover, we assume the manifold M is described by a single coordinate chart f : M → R^d. The manifold learning consists of finding z_1, z_2, ..., z_n ∈ R^d, where z_i = f(y_i). The processes involved in the proposed non-linear manifold learning approach are as follows:

1. Construct neighborhood graph: First, we determine a user-defined neighborhood given by 'k'. Subsequently, for every point i in R^D, we employ the Mahalanobis distance measure to estimate the 'k' neighborhood points on the manifold. The Mahalanobis distance computed is denoted d_Y(i, j) between pairs of points i, j in the input space Y. A weighted graph G is constructed over the neighborhood points with edge weights d_Y(i, j). The Mahalanobis distance [28] between observations, with random vectors x and y and covariance matrix S, is given by

$$ d(\vec{x}, \vec{y}) = \sqrt{(\vec{x} - \vec{y})^{T} S^{-1} (\vec{x} - \vec{y})} $$

2. Shortest path updating using Johnson's shortest path algorithm: A shortest path algorithm is generally employed in non-linear manifold learning to compute the geodesic distances among the points that lie beyond the neighborhood 'k'. Here, the proposed approach estimates the geodesic distances between all pairs of points on the manifold M by computing the shortest path distances among points beyond the neighborhood over the k-nearest neighbor graph, with the aid of the computationally efficient Johnson's all-pairs shortest path algorithm. Johnson's algorithm finds the shortest paths between all pairs of vertices by taking advantage of the sparse nature of directed graphs with no negative cycles. It finds a cost c(v) for each vertex such that, after reweighting the graph, every edge has a non-negative weight. Suppose the graph has a vertex s that has a path to every other vertex [25]. Johnson's algorithm computes the shortest paths from s to every other vertex using Shimbel's algorithm (which does not care if the edge weights are negative), and then sets c(v) = dist(s, v). The new weight of every edge is

$$ w'(u \to v) = \mathrm{dist}(s, u) + w(u \to v) - \mathrm{dist}(s, v) $$

    JOHNSONAPSP(V, E, w):
        create a new vertex s
        for every vertex v ∈ V
            w(s→v) ← 0
            w(v→s) ← ∞
        dist[s, ·] ← SHIMBEL(V, E, w, s)
        abort if SHIMBEL found a negative cycle
        for every edge (u, v) ∈ E
            w'(u→v) ← dist[s, u] + w(u→v) − dist[s, v]
        for every vertex u ∈ V
            dist[u, ·] ← DIJKSTRA(V, E, w', u)
            for every vertex v ∈ V
                dist[u, v] ← dist[u, v] − dist[s, u] + dist[s, v]

The algorithm spends Θ(V) time adding the artificial start vertex s, Θ(VE) time running SHIMBEL, O(E) time reweighting the graph, and then Θ(VE + V² log V) time running V passes of Dijkstra's algorithm. Thus, the overall running time is Θ(VE + V² log V), compared to O(n³) for the naive Dijkstra approach and O(n⁴) for the Bellman-Ford-Moore algorithm.
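For this step, SciPy ships an implementation of Johnson's algorithm for sparse graphs, so a minimal version of the geodesic-distance computation can be sketched as follows (again our own illustrative code, not the paper's implementation). Note that because the edge weights here are Mahalanobis distances, and hence non-negative, the reweighting phase of Johnson's algorithm is trivial in this setting.

    import numpy as np
    from scipy.sparse.csgraph import johnson

    def geodesic_distances(G):
        """All-pairs shortest-path (approximate geodesic) distances on the
        k-NN graph G returned by mahalanobis_knn_graph()."""
        # treat edges as undirected so the graph is better connected
        D_geo = johnson(G, directed=False)
        if np.isinf(D_geo).any():
            raise ValueError("k-NN graph is disconnected; increase k")
        return D_geo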
3. Construct d-dimensional embedding: Classical MDS is applied to the matrix of graph distances D_G = { d_G(i, j) }, constructing an embedding D_stp of the data in a d-dimensional Euclidean space Z that best preserves the manifold's estimated intrinsic geometry. MDS based on the updated distance matrix evaluates the true dimension of the manifold. Given an n×n matrix D of dissimilarities, MDS constructs a set of points whose interpoint Euclidean distances match those in D closely, and it is used to evaluate the true dimension of the non-linear manifold. Generally, MDS makes use of a stress function to evaluate the true dimension of the manifold. The stress function inversely measures the degree of correspondence between the distances among points implied by the MDS map and the matrix input by the user. The general form of the stress function corresponding to MDS is

$$ \mathrm{stress} = \sqrt{\frac{\sum_{i} \sum_{j} \big( f(x_{ij}) - d_{ij} \big)^{2}}{\mathrm{scale}}} $$

In this equation, d_ij refers to the Mahalanobis distance, across all dimensions, between points i and j on the map, f(x_ij) is some function of the input data, and scale refers to a constant scaling factor used to keep stress values between 0 and 1. When the MDS map perfectly reproduces the input data, f(x_ij) = d_ij for all i and j, so the stress is zero. Thus, the smaller the stress, the better the representation produced by the MDS map.
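A compact way to realize this step is classical MDS via eigendecomposition of the double-centered squared-distance matrix. The sketch below is our own, under the assumption that classical (metric) MDS is meant, as the paper states; it also returns a normalized stress so that the embedding dimension d can be chosen where the stress levels off.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    def classical_mds(D_geo, d):
        """Classical MDS embedding of an n x n geodesic distance matrix
        into d dimensions, plus the stress of the resulting map."""
        n = D_geo.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
        B = -0.5 * J @ (D_geo ** 2) @ J           # double-centered Gram matrix
        w, V = np.linalg.eigh(B)                  # eigenvalues in ascending order
        idx = np.argsort(w)[::-1][:d]             # top-d eigenpairs
        L = np.sqrt(np.maximum(w[idx], 0.0))
        Z = V[:, idx] * L                         # n x d embedding
        D_emb = squareform(pdist(Z))              # distances implied by the map
        # normalized stress: scale = sum of squared input distances
        stress = np.sqrt(np.sum((D_emb - D_geo) ** 2) / np.sum(D_geo ** 2))
        return Z, stress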
B. Hyperspectral Land Cover Classification Using the k-Nearest Neighbors

The proposed approach for the classification of land cover regions in the hyperspectral data makes use of the concepts of k-NN. The classical k-NN classifier labels a sample according to the class labels of its k nearest neighbors, in the distance sense. In spite of being a competitive algorithm for classification, k-NN, like most classification methods dealing with high dimensional input data [26, 27], suffers from the curse of dimensionality and highly biased estimates. In the proposed approach, the difficulty of high dimensional data classification is resolved by initially mapping the original data into a lower dimensional space by non-linear manifold learning (which can be viewed as a preprocessing task) and then performing classification on the k-nearest neighbors. This process is applicable because high dimensional data often represent phenomena that are intrinsically low dimensional.

The updated distance matrix computed using Johnson's shortest path algorithm and MDS represents the low dimensional manifold corresponding to the high dimensional hyperspectral data. Moreover, the updated distance matrix used for classification preserves the local information of the graph whilst increasing the distance between non-neighbor samples. The input to the classifier devised is the non-linear embedding of the high dimensional data, D_stp, which is potentially useful for land cover classification. Given the non-linear manifold D_stp, the classifier assigns the rows of the data matrix to groups, based on the grouping of the rows of the training data. Every training instance is associated with a class label, which defines the grouping. Initially, for every row in D_stp, a similarity measure is computed by comparing it with the training instances. The distance metric used in the proposed approach for land cover classification is the Mahalanobis distance, given by

$$ d(\vec{x}, \vec{y}) = \sqrt{(\vec{x} - \vec{y})^{T} S^{-1} (\vec{x} - \vec{y})} $$

where x and y are random vectors and S⁻¹ is the inverse of the covariance matrix. Based on the similarity measure computed and the value of 'k', the classifier selects the 'k' nearest neighbors associated with the input matrix. Then, based on majority voting among the k-nearest neighbors, the algorithm assigns the unlabeled samples projected in the space of the distance matrix D_stp to their associated groups.
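This classification stage can be sketched with scikit-learn's k-NN classifier, using the Mahalanobis metric on the embedded coordinates as described above. The snippet is our own hedged illustration (names such as Z_train are hypothetical), not the authors' implementation.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def knn_land_cover(Z_train, labels_train, Z_test, k=5):
        """Majority-vote k-NN on the d-dimensional MDS embedding, with the
        Mahalanobis metric as in the paper's distance definition."""
        VI = np.linalg.pinv(np.cov(Z_train, rowvar=False))  # S^-1 in the embedding
        clf = KNeighborsClassifier(
            n_neighbors=k,
            metric='mahalanobis',
            metric_params={'VI': VI},
            algorithm='brute',   # brute-force search supports this metric
        )
        clf.fit(Z_train, labels_train)
        return clf.predict(Z_test)  # majority vote among the k neighbors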
IV. EXPERIMENTAL RESULTS

This section presents the results obtained from experimentation on the proposed k-nearest neighbors based approach for classifying land cover regions in hyperspectral data via non-linear dimensionality reduction. The proposed approach is programmed in MATLAB (Matlab 7.4). The performance (classification accuracy) of the proposed approach for hyperspectral data classification is evaluated using the data collected at the Kennedy Space Center (KSC). To depict the performance clearly, the input KSC data is sampled, and the individual samples are further divided into training instances and test instances. As for any classification technique, the classification accuracy of the proposed approach depends on the level of training given. So, we analyze the effectiveness of the proposed approach in classifying the hyperspectral data by means of the classification accuracies obtained at different levels of training.

A. Data Acquisition

The experimentation of the proposed approach was performed on the hyperspectral data collected by the NASA AVIRIS (Airborne Visible/Infrared Imaging Spectrometer) instrument over the Kennedy Space Center (KSC), Florida. AVIRIS records data in 224 bands of 10 nm width with center wavelengths from 400 to 2500 nm. The KSC data, acquired from an altitude of approximately 20 km, have a spatial resolution of 18 m. The hyperspectral data analysis was performed on 176 bands, after eliminating water absorption and low-SNR bands. Selection of the training data was done using the land cover maps derived from color infrared photography provided by the Kennedy Space Center and Landsat Thematic Mapper (TM) imagery. Moreover, the KSC personnel have developed a vegetation classification scheme to define functional types that are discernable at the spatial resolution of the Landsat and AVIRIS data. The similarity among the spectral signatures of certain vegetation types makes the discrimination of land cover in this environment very difficult. For classification, 13 classes representing the various land cover types that occur in this environment were defined for the site (see Table I). Here, Classes 4 and 6 represent mixed classes.
TABLE I.
LIST OF THE 13 CLASSES (LAND COVER TYPES) USED IN THE PROPOSED APPROACH

Class | Land cover type           | No. samples
1     | Scrub                     | 761 (14.6%)
2     | Willow swamp              | 243 (4.66%)
3     | Cabbage palm hammock      | 256 (4.92%)
4     | Cabbage palm/oak hammock  | 252 (4.84%)
5     | Slash pine                | 161 (3.07%)
6     | Oak/broadleaf hammock     | 229 (4.38%)
7     | Hardwood swamp            | 105 (2.0%)
8     | Graminoid marsh           | 431 (8.27%)
9     | Spartina marsh            | 520 (9.99%)
10    | Cattail marsh             | 404 (7.76%)
11    | Salt marsh                | 419 (8.04%)
12    | Mud flats                 | 503 (9.66%)
13    | Water                     | 927 (17.8%)

B. Classification Results

The KSC data chosen for experimentation is first divided into 10 random samples. Subsequently, for every sample of the KSC data, 75% is chosen for training and 25% for testing. The results are recorded at distinct levels of training, namely 5%, 15%, 30%, 50% and 75% (the whole of the data chosen for training), always testing on the 25% of test data. The experimentation is repeated for all ten random samples chosen, and the average classification accuracy of the proposed approach for hyperspectral classification is calculated (a sketch of this evaluation protocol is given after Table III). Table II shows the classification accuracies corresponding to the 13 different classes found in the KSC data with 75% training and 25% testing. Table III depicts the average classification accuracies obtained with training on 5%, 15%, 30%, 50% and 75% of the data and testing on 25% of the data.

TABLE II.
CLASSIFICATION ACCURACY CORRESPONDING TO THE 13 DIFFERENT CLASSES IN A SINGLE SAMPLE OF DATA WITH 75% TRAINING

Class | Classification Accuracy (%)
1     | 96.335
2     | 88.525
3     | 92.188
4     | 66.667
5     | 60.976
6     | 46.552
7     | 85.185
8     | 87.963
9     | 95.385
10    | 97.030
11    | 99.048
12    | 85.714
13    | 98.707

TABLE III.
KSC TEST DATA: AVERAGE CLASSIFICATION ACCURACY AND STANDARD DEVIATION

Training % | Average Classification Accuracy (%)
5%         | 82.020
15%        | 85.692
30%        | 87.299
50%        | 88.370
75%        | 89.671
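The repeated random-sampling protocol described above can be sketched as follows. This is our own reconstruction of the evaluation loop (the paper's implementation is in MATLAB, and the helper run_pipeline standing in for the NLDR + k-NN stages is hypothetical):

    import numpy as np

    def evaluate(X, y, run_pipeline, levels=(0.05, 0.15, 0.30, 0.50, 0.75),
                 n_samples=10, seed=0):
        """Average accuracy over n_samples random 75/25 splits, training on a
        fraction `level` of the data and always testing on the 25% hold-out."""
        rng = np.random.default_rng(seed)
        acc = {level: [] for level in levels}
        n = len(y)
        for _ in range(n_samples):
            perm = rng.permutation(n)
            train, test = perm[: int(0.75 * n)], perm[int(0.75 * n):]
            for level in levels:
                sub = train[: int(level * n)]   # 5% .. 75% of the data for training
                pred = run_pipeline(X[sub], y[sub], X[test])
                acc[level].append(np.mean(pred == y[test]))
        return {level: float(np.mean(v)) for level, v in acc.items()}

At the 75% level the training subset coincides with the whole 75% partition, matching the "whole data chosen for training" case in the text.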
V. CONCLUSION

In this paper, we have proposed an efficient approach for the analysis of hyperspectral data by integrating non-linear manifold learning and the concepts of the k-nearest neighbor (k-NN) classifier. The proposed k-NN based approach employs non-linear manifold learning to determine a low-dimensional embedding of the original high dimensional data by computing the geometric distances between the samples. To start with, the proposed approach employs Johnson's shortest path algorithm and classical MDS to reduce the dimensionality of the hyperspectral data to a pairwise distance matrix. Then, the classifier is applied to the k-nearest neighbors for the classification of the land cover regions in the hyperspectral data. The proposed k-NN based approach has been assessed using the hyperspectral data collected by NASA's (National Aeronautics and Space Administration) AVIRIS (Airborne Visible/Infrared Imaging Spectrometer) over the Kennedy Space Center (KSC), Florida. The classification accuracies of the proposed k-NN based approach illustrate its efficacy in land cover classification of hyperspectral data.

REFERENCES

[1] E. A. Addink, S. M. de Jong, and E. J. Pebesma, "Spatial Object Definition for Vegetation Parameter Estimation from HYMAP Data", ISPRS Commission VII Mid-term Symposium "Remote Sensing: From Pixels to Processes", Enschede, The Netherlands, 8-11 May 2006.
[2] C. Lee and D. A. Landgrebe, "Analyzing high-dimensional multispectral data", IEEE Trans. Geosci. Remote Sens., Vol. 31, pp. 792-800, 1993.
[3] J. M. Bioucas-Dias and J. M. P. Nascimento, "Hyperspectral Subspace Identification", IEEE Transactions on Geoscience and Remote Sensing, Vol. 46, No. 8, August 2008.
[4] T. M. Lillesand, R. W. Kiefer, and J. W. Chipman, "Remote Sensing and Image Interpretation", 5th ed., Hoboken, NJ: Wiley, 2004.
[5] J. P. Kerekes and J. E. Baum, "Spectral imaging system analytical model for subpixel object detection", IEEE Trans. Geosci. Remote Sens., Vol. 40, No. 5, pp. 1088-1101, May 2002.
[6] G. Camps-Valls and L. Bruzzone, "Kernel-Based Methods for Hyperspectral Image Classification", IEEE Transactions on Geoscience and Remote Sensing, Vol. 43, No. 6, pp. 1351-1362, June 2005.
[7] C. Rodarmel and J. Shan, "Principal Component Analysis for Hyperspectral Image Classification", Surveying and Land Information Systems, Vol. 62, No. 2, pp. 115-000, 2002.
[8] S. Rajan and J. Ghosh, "Exploiting Class Hierarchies for Knowledge Transfer in Hyperspectral Data", Lecture Notes in Computer Science, Springer Berlin/Heidelberg, Vol. 3541, pp. 417-427, 2005.
[9] D. Landgrebe, "Hyperspectral image data analysis as a high dimensional signal processing problem" (Invited), Special Issue of the IEEE Signal Processing Magazine, Vol. 19, No. 1, pp. 17-28, 2002.
[10] B. E. Boser, I. Guyon, and V. Vapnik, "A Training Algorithm for Optimal Margin Classifiers", in Computational Learning Theory, pp. 144-152, 1992.
[11] R. E. Schapire, Y. Freund, P. Bartlett, and W. S. Lee, "Boosting the margin: a new explanation for the effectiveness of voting methods", in Proc. 14th International Conference on Machine Learning, Morgan Kaufmann, pp. 322-330, 1997.
[12] J. Ham, Y. Chen, M. M. Crawford, and J. Ghosh, "Investigation of the random forest framework for classification of hyperspectral data", IEEE Trans. Geosci. Remote Sens., Vol. 43, No. 3, pp. 492-501, March 2005.
[13] C. M. Bachmann, T. L. Ainsworth, and R. A. Fusina, "Exploiting manifold geometry in hyperspectral imagery", IEEE Trans. Geosci. Remote Sens., Vol. 43, No. 3, pp. 441-454, March 2005.
[14] Y. Chen, M. M. Crawford, and J. Ghosh, "Applying non-linear manifold learning to hyperspectral data for land cover classification", in 2005 International Geoscience and Remote Sensing Symposium, Seoul, South Korea, 24-29 July 2005.
[15] S. T. Roweis and L. K. Saul, "Non-linear dimensionality reduction by locally linear embedding", Science, Vol. 290, No. 5500, pp. 2323-2326, 2000.
[16] J. B. Tenenbaum, V. de Silva, and J. C. Langford, "A global geometric framework for non-linear dimensionality reduction", Science, Vol. 290, No. 5500, pp. 2319-2323, 2000.
[17] H. Su, Y. Sheng, and P. Du, "A New Band Selection Algorithm for Hyperspectral Data Based on Fractal Dimension", The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXVII, Part B7, Beijing, 2008.
[18] S. Sindhumol and M. Wilscy, "Hyperspectral Image Analysis - A Robust Algorithm Using Support Vectors and Principal Components", International Conference on Computing: Theory and Applications (ICCTA '07), pp. 389-395, 2007.
[19] Q. Du and N. H. Younan, "Dimensionality Reduction and Linear Discriminant Analysis for Hyperspectral Image Classification", Lecture Notes in Computer Science, Vol. 5179, pp. 392-399, 2008.
[20] C. R. da Silva, J. A. S. Centeno, and S. R. A. Ribeiro, "Reduction of the Dimensionality of Hyperspectral Data for the Classification of Agricultural Scenes", 13th FIG Symposium on Deformation Measurement and Analysis, LNEC, Lisbon, 12-15 May 2008.
[21] Q. Du, I. Kopriva, and H. Szu, "Independent-component analysis for hyperspectral remote sensing imagery classification", Optical Engineering, Vol. 45, No. 1, pp. 017008(1-13), January 2006.
[22] Y. Chen, M. M. Crawford, and J. Ghosh, "Improved non-linear manifold learning for land cover classification via intelligent landmark selection", IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2006), pp. 545-548, 31 July - 4 August 2006.
[23] A. Berge, A. C. Jensen, and A. H. S. Solberg, "Sparse Inverse Covariance Estimates for Hyperspectral Image Classification", IEEE Transactions on Geoscience and Remote Sensing, Vol. 45, No. 5, Part 2, pp. 1399-1407, May 2007.
[24] A. C. Jensen, A. Berge, and A. H. S. Solberg, "Regression Approaches to Small Sample Inverse Covariance Matrix Estimation for Hyperspectral Image Classification", IEEE Transactions on Geoscience and Remote Sensing, Vol. 46, No. 10, pp. 2814-2822, October 2008.
[25] D. B. Johnson, "Efficient algorithms for shortest paths in sparse networks", Journal of the ACM, Vol. 24, No. 1, pp. 1-13, 1977.
[26] T. K. Ho, "Nearest Neighbors in random subspaces", in Lecture Notes in Computer Science: Advances in Pattern Recognition, Berlin: Springer, pp. 640-648, 1998.
[27] D. Lowe, "Similarity metric learning for a variable-kernel classifier", Neural Computation, Vol. 7, No. 1, pp. 72-85, 1995.
[28] "Mahalanobis distance", http://www.aiaccess.net/English/Glossaries/GlosMod/e_gm_mahalanobis.htm

© 2010 ACEEE. DOI: 01.ijsip.01.02.01
