Multi-sensor image fusion can produce a more comprehensive, accurate, and reliable image, making it easier to understand a scene or recognize a target. However, most existing algorithms are based mainly on optical remote sensing images, which are highly susceptible to atmospheric interference, with SAR images used only as a supplement. Moreover, fusion of SAR and PAN images often fails to preserve textural features and color information effectively at the same time. To address these problems, this paper presents a multi-sensor image fusion algorithm based on region-based selection and the IHS transform. The SAR and PAN images are first IHS-transformed to obtain their intensity (I), hue (H), and saturation (S) components. The I components of the SAR and PAN images are separately decomposed with the SIDWT to extract wavelet coefficients. The I component of the SAR image is then divided into regular and irregular areas by a new adaptive segmentation method, and a new fusion rule based on local features is applied to fuse the corresponding wavelet coefficients of the two I components. The inverse SIDWT is carried out on the fused wavelet coefficients to obtain the intensity (I') of the fused image. Finally, the fused image is obtained by the inverse IHS transform of I' together with the H and S components of the PAN image. Experiments on real images validate the effectiveness of the proposed algorithm under objective metrics such as standard deviation, entropy, and average gradient.
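The IHS component-substitution pipeline outlined above can be sketched as follows. This is a minimal illustration only: the paper's SIDWT decomposition and region-based fusion rule are replaced by a simple weighted average of intensities, and the linear IHS variant, the 0.5/0.5 weights, and all function names are assumptions rather than the paper's implementation.

```python
import numpy as np

def rgb_to_ihs(rgb):
    # One common linear IHS variant; rgb is an HxWx3 float array in [0, 1].
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0                      # intensity
    v1 = (2.0 * b - r - g) / np.sqrt(6.0)      # first chromatic axis
    v2 = (r - g) / np.sqrt(2.0)                # second chromatic axis
    return i, v1, v2

def ihs_to_rgb(i, v1, v2):
    # Exact inverse of rgb_to_ihs above.
    r = i - v1 / np.sqrt(6.0) + v2 / np.sqrt(2.0)
    g = i - v1 / np.sqrt(6.0) - v2 / np.sqrt(2.0)
    b = i + 2.0 * v1 / np.sqrt(6.0)
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

def ihs_fusion(ms_rgb, sar):
    # Component substitution: the intensity of the multispectral image is
    # replaced by a fused intensity. The wavelet-domain rule of the paper
    # is stood in for by a plain weighted average, purely for illustration.
    i, v1, v2 = rgb_to_ihs(ms_rgb)
    i_fused = 0.5 * i + 0.5 * sar
    return ihs_to_rgb(i_fused, v1, v2)
```

Because the chromatic components v1 and v2 pass through the fusion untouched, the spectral (color) information of the multispectral input is preserved while spatial detail from the SAR intensity is injected.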
As a recent multiscale geometric analysis tool, the nonsubsampled contourlet transform (NSCT) offers multiscale, localized, and multidirectional decomposition and can efficiently capture the geometric information of images. When the NSCT is applied to image fusion, the characteristics of the original images are therefore better preserved and more information is available for fusion. This paper proposes a novel algorithm for fusing synthetic aperture radar (SAR) and multispectral images that jointly uses the intensity-hue-saturation (IHS) transform and the NSCT. In the proposed method, the à trous wavelet is adopted to extract detail information when fusing the low-frequency parts, and a new salience measure, named the local inner product, is introduced to select the high-frequency coefficients. A PALSAR HH image from the ALOS satellite, despeckled with the Lee-sigma filter, and HJ-1 multispectral images are used to evaluate the performance and efficiency of the proposed method. The fused images are compared qualitatively and quantitatively with those produced by several traditional fusion rules. The experimental results indicate that the proposed method better preserves image definition while losing less spectral information.
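The high-frequency selection step described above can be sketched as follows. Since the paper's local inner product measure is not defined here, a common local-energy salience computed over a small window stands in for it; the window size and function names are assumptions for illustration only.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_energy(coeffs, win=3):
    # Sum of squared subband coefficients over a win x win neighborhood,
    # used as a salience measure (a stand-in for the paper's local inner
    # product). Reflect padding keeps the output the same size as the input.
    pad = win // 2
    padded = np.pad(coeffs ** 2, pad, mode="reflect")
    windows = sliding_window_view(padded, (win, win))
    return windows.sum(axis=(-1, -2))

def select_high_freq(c_sar, c_ms, win=3):
    # Per-pixel selection: keep the high-frequency coefficient whose local
    # neighborhood is more salient, i.e. carries more detail energy.
    e_sar = local_energy(c_sar, win)
    e_ms = local_energy(c_ms, win)
    return np.where(e_sar >= e_ms, c_sar, c_ms)
```

Selecting by neighborhood salience rather than by the raw coefficient magnitude at each pixel makes the rule more robust to isolated noise spikes, which matters for speckle-affected SAR subbands.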