Preprocessing is a major area of interest in hyperspectral endmember extraction because it can provide a small set of high-quality candidate pixels for fast endmember extraction without sacrificing endmember accuracy. We propose a superpixel-guided preprocessing (SGPP) algorithm to accelerate endmember extraction based on spatial compactness and spectral purity analysis. The proposed SGPP first transforms a hyperspectral image into low-dimensional data using principal component analysis. SGPP then applies a superpixel method, which typically has linear complexity, to segment the first three principal components into a set of superpixels. Next, SGPP transforms the low-dimensional superpixels into noise-reduced superpixels and calculates their spatial compactness and spectral purity based on Tukey's test and data convexity. SGPP finally retains a few high-quality pixels from each superpixel with high spatial compactness and spectral purity indices for subsequent endmember identification. Experiments based on the spectral angle distance, root-mean-square error, and speedup are conducted on synthetic and real hyperspectral datasets, and they indicate that SGPP is superior to current state-of-the-art preprocessing techniques.
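The sketch below illustrates the general shape of such a superpixel-guided candidate-selection pipeline, assuming scikit-learn PCA and SLIC superpixels. The per-pixel purity score used here (distance from the superpixel mean spectrum) is a deliberately simplified stand-in for the paper's Tukey's-test and data-convexity analysis, and all function names, segment counts, and thresholds are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of a superpixel-guided preprocessing pipeline (not the exact SGPP algorithm).
import numpy as np
from sklearn.decomposition import PCA
from skimage.segmentation import slic

def select_candidates(cube, n_segments=200, keep_per_segment=5):
    """cube: (rows, cols, bands) hyperspectral image -> (n_candidates, bands) retained pixels."""
    rows, cols, bands = cube.shape
    flat = cube.reshape(-1, bands)

    # Step 1: reduce dimensionality; the first three components drive segmentation.
    pcs = PCA(n_components=3).fit_transform(flat).reshape(rows, cols, 3)

    # Step 2: linear-complexity superpixel segmentation (SLIC assumed here).
    # Note: SLIC's `compactness` parameter is its own spatial/spectral trade-off,
    # not the paper's spatial compactness index.
    labels = slic(pcs, n_segments=n_segments, compactness=10,
                  start_label=0, convert2lab=False, channel_axis=-1)

    candidates = []
    for lab in np.unique(labels):
        idx = np.flatnonzero(labels.ravel() == lab)
        spectra = flat[idx]
        # Step 3 (simplified): rank pixels by closeness to the superpixel mean
        # spectrum as a crude purity proxy, then retain a few representatives.
        dist = np.linalg.norm(spectra - spectra.mean(axis=0), axis=1)
        keep = idx[np.argsort(dist)[:keep_per_segment]]
        candidates.append(flat[keep])
    return np.vstack(candidates)

# Example: a small random cube stands in for a real hyperspectral image.
cands = select_candidates(np.random.rand(60, 60, 50))
print(cands.shape)
```

The retained candidate set would then be handed to any standard endmember extraction algorithm in place of the full image, which is where the speedup comes from.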
Sparse unmixing (SU) can represent an observed image using pure spectral signatures and their corresponding fractional abundances drawn from a large spectral library and is an important technique in hyperspectral unmixing. However, existing SU algorithms mainly exploit spatial information from a fixed neighborhood system, which is insufficient. To solve this problem, we propose a nonlocal weighted SU algorithm based on global search (G-NLWSU). By exploiting the nonlocal similarity of the hyperspectral image, per-pixel weights are calculated to form a matrix that weights the abundance matrix. Specifically, G-NLWSU first searches the entire image for the group of pixels most similar to each pixel, then uses singular value decomposition to denoise each group, and finally obtains the weight matrix by considering the correlations between similar pixels. To reduce the execution time of G-NLWSU, we propose a parallel version, named PG-NLWSU, which integrates compute unified device architecture-based parallel computing into G-NLWSU to accelerate the search for groups of nonlocally similar pixels. Our proposed algorithms shed new light on SU by combining a new way of exploiting spatial information with a parallel computing scenario. Experimental results on simulated and real datasets show that PG-NLWSU is superior to the comparison algorithms.
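The following sketch illustrates the nonlocal-weighting idea: for every pixel, search the whole image for its most similar spectra, denoise the group with a truncated SVD, and derive a weight from the within-group correlations. The group size, truncation rank, and weighting rule are illustrative assumptions, not the paper's exact formulation; the exhaustive global search in the loop is exactly the part that PG-NLWSU offloads to the GPU.

```python
# Minimal sketch of nonlocal weight computation via global search (simplified from G-NLWSU).
import numpy as np

def nonlocal_weights(Y, K=20, rank=3):
    """Y: (bands, pixels) image matrix -> (pixels,) nonlocal weight vector."""
    bands, n = Y.shape
    Yn = Y / np.maximum(np.linalg.norm(Y, axis=0, keepdims=True), 1e-12)
    weights = np.empty(n)
    for i in range(n):
        # Global (non-windowed) search: cosine similarity of pixel i to every pixel.
        sim = Yn.T @ Yn[:, i]
        group = Y[:, np.argsort(-sim)[:K]]  # first column is (typically) pixel i itself

        # SVD denoising of the similar-pixel group via rank truncation.
        U, s, Vt = np.linalg.svd(group, full_matrices=False)
        denoised = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]

        # Weight from the mean correlation between the pixel and its denoised group.
        corr = denoised.T @ denoised[:, 0] / (
            np.linalg.norm(denoised, axis=0) * np.linalg.norm(denoised[:, 0]) + 1e-12)
        weights[i] = corr.mean()
    return weights

# Example on a tiny random "image" of 100 pixels with 30 bands.
w = nonlocal_weights(np.random.rand(30, 100))
print(w.shape, float(w.min()), float(w.max()))
```

In a full SU solver these weights would multiply the abundance matrix inside the sparsity-promoting penalty, so that pixels with strong nonlocal support are regularized more consistently than isolated, noisy ones.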
To address the insufficiency of texture-based classification features for classifying samples, we propose two methods for spatial information-enhanced hyperspectral imagery classification based on joint spatial-aware collaborative representation (JSaCR). First, we introduce a texture-regularized joint spatial-aware collaborative representation (TRJSaCR) method, in which prior texture information is incorporated as a regularization term that constrains the coefficients of the JSaCR objective function, and a closed-form solution is obtained to reconstruct the test sample. Second, we introduce a spatial information-assisted discrimination rule (SIDR) method coupled with TRJSaCR (TRJSaCR-SIDR) for classification. More precisely, the labels of the test samples and their corresponding neighborhoods are first assigned by TRJSaCR-SIDR, and the final labels are then determined by considering the label distribution of each neighborhood. Our work aims to broaden the knowledge of how spatial information can be utilized in hyperspectral classification. Experimental results on two benchmark hyperspectral datasets, Indian Pines and Pavia University, indicate that the proposed algorithms are superior to other state-of-the-art classifiers.
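As a point of reference for the closed-form solution mentioned above, the sketch below shows a plain collaborative-representation classifier with a ridge (l2) penalty, which is the building block that TRJSaCR extends with its texture-based regularizer. The simple l2 term here stands in for the paper's texture regularization, and the data, labels, and parameter values are purely illustrative.

```python
# Minimal sketch of a collaborative-representation classifier with a closed-form
# ridge solution (the texture regularizer of TRJSaCR is replaced by a plain l2 penalty).
import numpy as np

def cr_classify(X_train, y_train, x_test, lam=1e-2):
    """X_train: (bands, n) training dictionary, y_train: (n,) labels, x_test: (bands,)."""
    # Closed-form coefficients: alpha = (X^T X + lam * I)^{-1} X^T x_test.
    G = X_train.T @ X_train + lam * np.eye(X_train.shape[1])
    alpha = np.linalg.solve(G, X_train.T @ x_test)

    # Assign the class whose training atoms reconstruct the test pixel best.
    best, best_res = None, np.inf
    for c in np.unique(y_train):
        mask = (y_train == c)
        res = np.linalg.norm(x_test - X_train[:, mask] @ alpha[mask])
        if res < best_res:
            best, best_res = c, res
    return best

# Example with two synthetic classes of 10 training spectra each (30 bands).
X = np.random.rand(30, 20)
y = np.repeat([0, 1], 10)
print(cr_classify(X, y, X[:, 3] + 0.01 * np.random.rand(30)))
```

The SIDR step described in the abstract would then refine such per-pixel decisions by voting over the labels assigned within each test pixel's spatial neighborhood.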