Implementing and validating pan-sharpening algorithms in open-source software

Open Access Paper, 4 October 2017
Abstract
Several approaches have been used in remote sensing to integrate images with different spectral and spatial resolutions in order to obtain fused, enhanced images. The objective of this research is three-fold: to implement three image fusion techniques (High Pass Filter, Principal Component Analysis and Gram-Schmidt) in R; to apply these techniques to merge the multispectral and panchromatic bands of five images with different spatial resolutions; and to evaluate the results using the universal image quality index (Q index) and the ERGAS index. Regarding the qualitative analysis, the Landsat 7 and Landsat 8 images show greater colour distortion with the three pan-sharpening methods, whereas the results for the other images were better. The Q index revealed that the HPF fusion performs better for the QuickBird, IKONOS and Landsat 7 images, followed by the GS fusion, whereas for the Landsat 8 and Natmur-08 images the results were more even. Regarding the spatial ERGAS index, the PCA algorithm performed better for the QuickBird, IKONOS, Landsat 7 and Natmur-08 images, followed closely by the GS algorithm; only for the Landsat 8 image did the GS fusion present the best result. In the evaluation of the spectral components, HPF results tended to be better and PCA results worse; the opposite was the case for the spatial components. Better quantitative results are obtained for the Landsat 7 and Landsat 8 images with the three fusion methods than for the QuickBird, IKONOS and Natmur-08 images. This contrasts with the qualitative evaluation, reflecting the importance of keeping the two evaluation approaches (qualitative and quantitative) separate. Significant disagreement may arise when different methodologies are used to assess the quality of an image fusion. Moreover, it is not possible to designate a given algorithm as the best a priori, not only because of the different characteristics of the sensors, but also because of the different atmospheric conditions and the peculiarities of the different study areas, among other reasons.

1. INTRODUCTION

Image fusion has been described as a set of techniques that combine images of different spatial resolutions or containing different types of information, with the objective of generating new images that enhance the properties of the originals [1]. The aim is to improve data interpretability, either by improving their visual quality (by facilitating the discrimination of certain categories) or by demonstrating the robustness of a given analysis method [2]. In the latter case, multispectral images, generally of low spatial resolution, are combined with a panchromatic image of greater spatial resolution, so the two sets of images must be properly registered to allow their integration.

Another reason for image fusion is that more than 70% of terrestrial observation satellites, and a large number of digital aerial cameras, are simultaneously equipped with panchromatic and multispectral sensors [3, 4]; the latter have lower spatial resolution but higher spectral resolution, while the former have just the opposite characteristics, pointing to the complementarity of the two data sets.

Several pan-sharpening algorithms have been proposed, and some attempts have been made to classify them, while effectively evaluating the quality of the fused products remains a challenge for researchers and users. Two approaches have been most widely used in research [5]:

  • Qualitative approaches, involving the visual comparison of the original multispectral image with the fused image, to verify colour coherence, and the original panchromatic image with the fused image, to verify that spatial detail is maintained.

  • Quantitative approaches, which involve a set of predefined quality indicators to measure the spectral and spatial similarities between the fused image and the original (panchromatic and multispectral) images.

Although traditional remote sensing and GIS programs provide very good tools for the visualisation of spatial data, their analytical capacities are relatively limited and, in many cases, they are not sufficiently flexible and do not represent the state of the art [6]. R [7] is an open-source data analysis program and language in which many of the new image processing developments are being implemented because of its power, flexibility, and community of developers and users, among other reasons. Brunsdon and Comber (2015) argue that R is probably the best environment for spatial data analysis and manipulation [6], and remote sensing is undoubtedly included in this category. Here, we use R to program all the fusion and validation algorithms.

The general objective of this work is to compare the results of applying three image fusion techniques (High Pass Filter, Principal Components Analysis, and Gram-Schmidt) to images from four different satellite sensors (Landsat 7, Landsat 8, IKONOS and QuickBird) and an airborne sensor (Intergraph Z/I-Imaging Digital Mapping Camera), with algorithms implemented in R, and to evaluate them quantitatively through quality indices, also implemented in R. This objective is divided into four specific objectives:

  • To implement in R three image fusion algorithms: High Pass Filter, Principal Component Analysis and Gram-Schmidt.

  • To implement in R three algorithms to obtain quality indices of the fused image: the universal image quality index (Q index), the spectral ERGAS (erreur relative globale adimensionnelle de synthèse) index, and the spatial ERGAS index.

  • To use the algorithms with five images from different technologies: Landsat 7, Landsat 8, IKONOS, QuickBird and an Intergraph Z/I-Imaging Digital Mapping Camera.

  • To compare the results of the different fusion techniques to determine which provides the best results in each of the study areas and the sensors used.

2. MATERIALS AND METHODS

2.1 Analysed images

QuickBird is a commercial satellite launched on October 18, 2001 into a sun-synchronous orbit (between 450 km and 482 km altitude). It carries two CCD cameras, one panchromatic and one multispectral (blue, green, red and near infrared), with spatial resolutions of approximately 0.61 m and 2.50 m, respectively. The swath width of these images is between 16.8 km and 18 km, depending on the orbital height. The radiometric resolution is 11 bits [8].

The analysed image covers an area of 4.63 km2 and corresponds to the city of Azogues (Ecuador), including part of the Burgay river, which runs north-south. Several landmarks, such as the Central Plaza, the Cuenca-Azogues highway, the bus station and the municipal stadium, can be distinguished in the image. The size of the panchromatic image is 5677 rows by 2267 columns.

IKONOS is a commercial Earth observation satellite launched on September 24, 1999. It was the first satellite to make high-resolution images available to the public, constituting a milestone in remote sensing. In orbit at 681 km altitude, its spatial resolution is one meter (panchromatic) and four meters (multispectral), with four bands (blue, green, red and near infrared). The swath width is 11 km. The radiometric resolution is 11 bits [9]. In January 2015, DigitalGlobe, the owner of the satellite, announced that the satellite had been deactivated due to problems with quality standards.

The analysed image covers the western part of the city and several rural villages. Land use basically corresponds to urban areas, crops, forests and shrubs. The size of the image is 12,217 rows by 10,599 columns, covering an area of 129.49 km2.

Landsat 7 and 8 are part of a series of eight satellites that have provided Earth surface information since 1972. The Landsat project has been the most successful space remote sensing project developed so far [2]. The images used in this work are:

  • Landsat 7 image corresponding to the continental part of scene P011R063, acquired on October 25, 2001. It covers an area of 6555.41 km2 and includes the south-western part of the Province of Loja (cantón Zapotillo, Ecuador) and part of the departments of Tumbes and Piura in Peru. There are no major urban centres such as provincial or departmental capitals. Most of the region is made up of dry forests, arid zones and small cultivated areas. The Pozos Dam, part of the Chira-Piura Irrigation Project in Peru, can be distinguished in this image. The size of the image is 5544 rows by 5823 columns.

  • Landsat 8 image corresponding to a section of scene P010R062, acquired on October 30, 2014. It covers the cities of Cuenca and Azogues, as well as the Cajas National Park. It is possible to distinguish an important area of Andean páramo in the Ecuadorian Western Cordillera, as well as urban zones, crops and forests. The image has a size of 2977 rows by 3736 columns and covers an area of 2502.47 km2.

The Natmur-08 project was a technical assistance contracted by the Murcia Regional Administration (Region of Murcia, Spain), consisting of digital photogrammetric images taken by airborne panchromatic and multispectral (R, G, B, NIR bands) sensors, plus a LiDAR survey for the generation of digital terrain models. The project generated panchromatic and multispectral images with spatial resolutions of 0.45 m and 2 m, respectively. The image used has a size of 5451 rows by 8401 columns (9.27 km2) and covers the hamlet of Archivel, belonging to the Municipality of Caravaca de la Cruz, in the Region of Murcia (Spain).

2.2 Image fusion methods

Ideally, a good image fusion method should not only increase the spatial resolution of the multispectral data, but also preserve its spectral integrity as far as possible [10-13].

In the present study, three fusion algorithms have been implemented: High Pass Filter, Principal Component Analysis and Gram-Schmidt. The reasons for choosing them are the good results reported in previous studies and the fact that they represent the main types of image fusion techniques [4, 14-18].

Despite the increasing use of image fusion techniques in remote sensing and the increasing use of R as data analysis software, there is no R package that implements such techniques. The algorithms were therefore implemented as R [7] functions. We are currently working on an R package that will include image fusion techniques, quality assessment methods, test images and a manual.

High Pass Filter (HPF), which is counted among the spatial-domain image fusion techniques, inserts high-frequency components into images of low spatial resolution.

The HPF methodology was introduced by Schowengerdt (1980) [19] as a data reconstruction and compression technique, and has recently been extended to new datasets to fuse images of different spatial and spectral resolutions [10, 20, 21].

According to Gangkofner et al. (2008) [15], this technique has generally been implemented in a simplistic manner because the parameters used have not been optimized to achieve satisfactory spatial and radiometric results. The same authors propose an optimization and standardization of the method to guarantee its applicability to a wide range of images with different ratios between the multispectral and panchromatic spatial resolutions, deriving different parameter values in the process. This standardised method was applied in the present research. The algorithm implemented in R can be summarised in three simple steps [15], illustrated by the sketch after the list:

  • 1. Apply a high-pass filter to the panchromatic image

  • 2. Add the filtered panchromatic image to each band of the multispectral image

  • 3. Linearly expand the histogram.
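
The following is a minimal sketch of these steps in R, assuming the 'terra' package and illustrative file names; the kernel size and weighting factor below are examples, not the optimized parameter values derived by Gangkofner et al.:

```r
library(terra)

pan <- rast("pan.tif")   # high-resolution panchromatic band
ms  <- rast("ms.tif")    # low-resolution multispectral bands

## 1. High-pass filter the panchromatic band (example 5x5 kernel; the
##    optimized method derives kernel size and centre value from the
##    multispectral/panchromatic resolution ratio).
k <- matrix(-1, 5, 5); k[3, 3] <- 24
pan_hp <- focal(pan, w = k)

## 2. Add the filtered band to each multispectral band (resampled to the
##    panchromatic resolution), weighted by the ratio of standard
##    deviations times a modulating factor (0.25 used here as an example).
ms_hr <- resample(ms, pan, method = "bilinear")
fused <- ms_hr
for (i in 1:nlyr(ms_hr)) {
  w <- 0.25 * global(ms_hr[[i]], "sd", na.rm = TRUE)[1, 1] /
              global(pan_hp,     "sd", na.rm = TRUE)[1, 1]
  fused[[i]] <- ms_hr[[i]] + w * pan_hp
}

## 3. Linear stretch: return each fused band to the mean and standard
##    deviation of the corresponding input band.
for (i in 1:nlyr(fused)) {
  m1 <- global(fused[[i]], "mean", na.rm = TRUE)[1, 1]
  s1 <- global(fused[[i]], "sd",   na.rm = TRUE)[1, 1]
  m0 <- global(ms_hr[[i]], "mean", na.rm = TRUE)[1, 1]
  s0 <- global(ms_hr[[i]], "sd",   na.rm = TRUE)[1, 1]
  fused[[i]] <- (fused[[i]] - m1) * s0 / s1 + m0
}
```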

Principal Components Analysis (PCA) is considered a component substitution technique. It involves a linear transformation of the multispectral bands, the substitution of a variable in the transformed space, and the inverse transformation to the original space [22]. The justification for this substitution is that the panchromatic image is approximately equal to the first principal component, which contains the information that is common to all the bands used as input in the PCA procedure, whereas the unique spectral information of each band is represented in the other components [10]. This substitution maximizes the effect of the high-resolution panchromatic band on the fused bands resulting from the process [22]. In summary, the PCA image fusion is calculated as follows [23], as illustrated by the sketch after the list:

  • 1. Rescale the low-resolution multispectral bands to the spatial resolution of the panchromatic band.

  • 2. Calculate the PCA on the rescaled bands.

  • 3. Adjust the panchromatic band according to the mean and standard deviation of the first principal component.

  • 4. Replace the first principal component with the adjusted panchromatic band and inverse transformation to obtain high resolution fused multispectral bands.
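
A sketch of these steps in R, using 'terra' and base R's prcomp; the file names are illustrative and the image is assumed to fit in memory:

```r
library(terra)

pan <- rast("pan.tif")
ms  <- rast("ms.tif")

## 1. Rescale the multispectral bands to the panchromatic resolution.
ms_hr <- resample(ms, pan, method = "bilinear")

## 2. PCA on the rescaled bands (pixels x bands matrix).
v   <- values(ms_hr)
ok  <- complete.cases(v)
pca <- prcomp(v[ok, ], center = TRUE, scale. = FALSE)

## 3. Adjust the panchromatic band to the mean and standard deviation
##    of the first principal component.
pc    <- pca$x
p     <- values(pan)[ok]
p_adj <- (p - mean(p)) * sd(pc[, 1]) / sd(p) + mean(pc[, 1])

## 4. Replace PC1 with the adjusted panchromatic band and apply the
##    inverse transformation: scores %*% t(rotation) + centre.
pc[, 1] <- p_adj
back <- sweep(pc %*% t(pca$rotation), 2, pca$center, "+")

v[ok, ] <- back
fused <- setValues(ms_hr, v)
```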

Gram-Schmidt (GS), which is also considered a component substitution method [24], was invented by Laben and Brower in 1998 and patented by Eastman Kodak [12]. It is based on the Gram-Schmidt algorithm, a vector orthogonalization process. In the case of images, each band corresponds to a high-dimensional vector (of length equal to the number of pixels in the image), and these vectors are rotated to produce a new set of uncorrelated vectors (GS1, …, GSn) [25].

The procedure for performing the GS fusion is summarized in five steps, which can be consulted in detail in Laben and Brower (2000) [12] and are illustrated by the sketch after the list:

  • 1. Calculate a low resolution simulated panchromatic band.

  • 2. Implement the transformation of Gram-Schmidt with the modification of Laben and Brower (2000) [12].

  • 3. Adjust the high resolution panchromatic band, so that its mean and standard deviation match those of GS1.

  • 4. Replace GS1 with the adjusted panchromatic band.

  • 5. Reverse the data transformation.
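
A minimal sketch of these steps in R, following a simplified reading of Laben and Brower (2000) [12]: the file names are illustrative, the simulated low-resolution panchromatic band is taken here to be the plain band mean, and the image is assumed to fit in memory:

```r
library(terra)

pan   <- rast("pan.tif")
ms    <- rast("ms.tif")
ms_hr <- resample(ms, pan, method = "bilinear")

v  <- values(ms_hr)
ok <- complete.cases(v)
X  <- v[ok, , drop = FALSE]
nb <- ncol(X)
mu <- colMeans(X)

## 1. Simulated low-resolution panchromatic band (GS1), centred.
sim <- rowMeans(X)
G   <- matrix(sim - mean(sim), ncol = 1)

## 2. Gram-Schmidt: orthogonalize each centred band against the components
##    already obtained, keeping the projection coefficients (phi) that the
##    inverse transformation needs.
phi <- matrix(0, nb, nb)
for (k in 1:nb) {
  b <- X[, k] - mu[k]
  for (j in 1:ncol(G)) {
    phi[k, j] <- sum(b * G[, j]) / sum(G[, j]^2)
    b <- b - phi[k, j] * G[, j]
  }
  G <- cbind(G, b)
}

## 3.-4. Adjust the panchromatic band to the statistics of GS1 and
##       substitute it for GS1.
p     <- values(pan)[ok]
p_adj <- (p - mean(p)) * sd(G[, 1]) / sd(p) + mean(G[, 1])

## 5. Reverse the transformation: band k is reconstructed as
##    GS_{k+1} + mu_k + sum_j phi[k, j] * GS_j, with GS1 replaced.
fusedX <- X
for (k in 1:nb) {
  rec <- G[, k + 1] + mu[k] + phi[k, 1] * p_adj
  if (k > 1) for (j in 2:k) rec <- rec + phi[k, j] * G[, j]
  fusedX[, k] <- rec
}
v[ok, ] <- fusedX
fused <- setValues(ms_hr, v)
```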

2.3 Image fusion evaluation

A visual comparison is made between the original and fused images. There are different criteria within visual analysis [26]; our work takes into account spectral and spatial criteria. As spectral criteria we considered:

  • Brightness: judging the perceptible intensity differences of a certain colour between the original and the fused image.

  • Anomalous colours: taking into account variations of colour between both images.

The spatial criteria taken into account were as follows: the fused image should keep the sharpness of object edges and the spatial contrast between different elements, without producing the veined textures (small elongated distortions) that can appear when a fusion algorithm is applied.

Five mosaics (Fig. 1 to 5), one for each platform used, each containing a clip of the image in its original version and in its HPF, PCA and GS fused versions, were composed to perform the evaluation presented in Tab. 1.

Figure 1.

Clip of the QuickBird image showing a highway roundabout giving access to the Azogues bus station. Original image (a); HPF fused (b); PCA fused (c) and GS fused (d).

Figure 2.

Clip of the IKONOS image showing the Cuenca University campus. Original image (a); HPF fused (b); PCA fused (c) and GS fused (d).

Figure 3.

Clip of the Landsat 7 image showing a segment of the River Chira and one of its tributaries, 10 km downstream from Zapotillo. Original image (a); HPF fused (b); PCA fused (c) and GS fused (d).

Figure 4.

Clip of the Landsat 8 image showing the Mariscal La Mar Airport (Cuenca, Ecuador). Original image (a); HPF fused (b); PCA fused (c) and GS fused (d).

Figure 5.

Clip of the Natmur-08 image showing agricultural plots in Archivel (Caravaca de la Cruz, Murcia, Spain). Original image (a); HPF fused (b); PCA fused (c) and GS fused (d).

Table 1.

Qualitative evaluation according to some visual interpretation criteria for Fig. 1 to 5: 1=very bad; 2=bad; 3=acceptable; 4=good; 5=very good.

Spectral criteria
Image      Spatial ratio  Colour composite  Figure      HPF  PCA  GS
QuickBird  4              NIR-R-G           Fig. 1 a-d  5    4    5
IKONOS     4              R-G-B             Fig. 2 a-d  5    4    5
Landsat 7  2              R-G-B             Fig. 3 a-d  4    3    3
Landsat 8  2              R-G-B             Fig. 4 a-d  4    4    5
Natmur-08  4.4            NIR-R-G           Fig. 5 a-d  5    4    5

Spatial criteria
Image      Spatial ratio  Colour composite  Figure      HPF  PCA  GS
QuickBird  4              NIR-R-G           Fig. 1 a-d  5    5    5
IKONOS     4              R-G-B             Fig. 2 a-d  5    5    5
Landsat 7  2              R-G-B             Fig. 3 a-d  4    4    4
Landsat 8  2              R-G-B             Fig. 4 a-d  5    5    5
Natmur-08  4.4            NIR-R-G           Fig. 5 a-d  5    5    5

The images used in the mosaics correspond, in each case, to the best colour composite for discriminating land cover. These composites are indicated in Tab. 1.

The quantitative evaluation was carried out using three algorithms: the universal image quality index (Q index), the spectral ERGAS index and the spatial ERGAS index.

The Q index, although initially proposed for other applications such as image compression, is useful in image fusion to verify that there has been no significant alteration of the original radiometric values. It allows a quantitative evaluation of the quality of the fused images with respect to the original multispectral images, while quantifying the distortions produced. It is expressed as a combination of three factors [27]: loss of correlation, luminance distortion (the difference between the means of the two images) and contrast distortion [28]. The range of Q is [-1, 1], the best value being 1, which is obtained only when the two images are identical.
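As a reference for our implementation, the Q index of Wang and Bovik (2002) [27] can be transcribed directly into R. The sketch below (function and variable names are ours) computes it globally for two bands given as numeric vectors; the original formulation applies the index in a sliding window and averages the results:

```r
## Universal image quality index:
##   Q = 4 * cov(x, y) * mean(x) * mean(y) /
##       ((var(x) + var(y)) * (mean(x)^2 + mean(y)^2)),
## which combines the correlation, luminance and contrast factors.
q_index <- function(x, y) {
  x <- as.numeric(x); y <- as.numeric(y)
  mx <- mean(x); my <- mean(y)
  4 * cov(x, y) * mx * my / ((var(x) + var(y)) * (mx^2 + my^2))
}

## Illustrative use: q_index(values(fused_band), values(original_band)),
## with the original band previously resampled to the fused grid.
```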

The ERGAS index, proposed by Wald (2000) [29], was used to compare the spectral quality of the fused images. It seeks to satisfy three main requirements:

  • Independence from the units, i.e., radiance values or quantities without units.

  • Independence from the number of bands in the image to be fused.

  • Independence from the spatial resolution ratio between the multispectral and the panchromatic images.

To calculate this index, the original multispectral bands are rescaled to the spatial resolution of the fused bands. The value of ERGAS shows a strong tendency to decrease as the quality of the fused product increases. Values of less than 3 indicate good fusion quality [29, 30], which improves as the value approaches zero. Since the ERGAS index only considers the spectral characteristics of the image, Lillo-Saavedra et al. (2005) [31] proposed a complementary index, the spatial ERGAS index, which introduces a spatial RMSE.
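Under these definitions, the spectral ERGAS can be sketched in R as follows (function and argument names are ours; the spatial variant is obtained by replacing the reference bands with the panchromatic image and using the spatial RMSE of Lillo-Saavedra et al.):

```r
## Spectral ERGAS (Wald, 2000):
##   ERGAS = 100 * (h / l) * sqrt((1 / N) * sum_k (RMSE_k / mu_k)^2),
## where h and l are the pixel sizes of the panchromatic and multispectral
## images, RMSE_k is the root mean square error between fused band k and
## the rescaled original band k, and mu_k is the mean of the original band.
ergas <- function(fused, reference, h, l) {
  stopifnot(length(fused) == length(reference))
  terms <- mapply(function(f, r) {
    rmse <- sqrt(mean((as.numeric(f) - as.numeric(r))^2))
    (rmse / mean(as.numeric(r)))^2
  }, fused, reference)
  100 * (h / l) * sqrt(mean(terms))
}

## Illustrative use for QuickBird (0.61 m pan, 2.50 m multispectral), with
## the bands passed as lists of numeric vectors:
## ergas(fused_bands, original_bands, h = 0.61, l = 2.50)
```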

3. RESULTS

With respect to the qualitative evaluation, all the fused images are clearly more helpful for visual interpretation than the originals (Fig. 1 to 5). The results presented in Tab. 1 point to better evaluations for the images with a higher spatial resolution ratio between the multispectral and panchromatic images (QuickBird and IKONOS with ratio 4 and Natmur-08 with ratio 4.4). However, as will be seen later, these images obtain lower quantitative values than the Landsat 7 and Landsat 8 images, both with a ratio equal to two.

The high visual assessment of the images with ratios between 4 and 4.4 could be partly explained by the perceived degree of improvement in the fused images increasing with the spatial ratio. This, however, raises the question of how reliable the fusion between multispectral and panchromatic images with larger spatial ratios would be (the present research has used images with ratios no higher than 4.4).

Landsat 7 and Landsat 8 images show greater colour distortion with the three fusion methods. In the case of the Landsat 7 image, distortions are smaller when the HPF fusion is used (Fig. 3); for the Landsat 8 image, the GS fusion produces the lowest distortion (Fig. 4). For the QuickBird, IKONOS and Natmur-08 images, the qualitative evaluation reveals better results, especially with HPF and GS, in both spectral and spatial terms.

The quantitative evaluation gives good results for the three fusion methods (Tab. 2), with a high degree of correlation between the Q Index and the spectral ERGAS index.

Table 2.

Quantitative evaluation of the fused images.

PLATFORM: QuickBird
Band     Q Index            Spectral ERGAS I.    Spatial ERGAS I.    Mean ERGAS I.
         HPF   PCA   GS     HPF   PCA   GS       HPF   PCA   GS      HPF   PCA   GS
1        0.92  0.87  0.89   3.12  3.86  3.59     3.84  2.76  3.14    3.48  3.30  3.37
2        0.93  0.86  0.89   3.51  4.55  4.25     3.79  2.18  2.73    3.65  3.36  3.50
3        0.92  0.86  0.89   4.46  5.68  5.30     4.75  3.27  3.94    4.61  4.47  4.62
4        0.93  0.96  0.96   3.10  2.27  2.55     4.44  6.20  5.87    3.77  4.24  4.21
Global   0.92  0.89  0.90   3.59  4.27  4.05     4.22  3.92  4.10    3.91  4.10  4.07

PLATFORM: IKONOS
Band     Q Index            Spectral ERGAS I.    Spatial ERGAS I.    Mean ERGAS I.
         HPF   PCA   GS     HPF   PCA   GS       HPF   PCA   GS      HPF   PCA   GS
1        0.92  0.84  0.87   5.76  7.34  6.92     8.49  6.00  6.62    7.12  6.67  6.77
2        0.92  0.83  0.86   4.06  5.37  5.11     5.43  3.37  3.74    4.75  4.37  4.42
3        0.92  0.85  0.88   3.40  4.18  3.94     5.26  3.95  4.19    4.33  4.37  4.06
4        0.93  0.96  0.96   3.13  2.21  2.52     4.49  6.28  6.06    3.81  4.25  4.29
Global   0.92  0.87  0.89   4.21  5.12  4.89     6.11  5.06  5.29    5.16  5.09  5.09

PLATFORM: Landsat 7
Band     Q Index            Spectral ERGAS I.    Spatial ERGAS I.    Mean ERGAS I.
         HPF   PCA   GS     HPF   PCA   GS       HPF   PCA   GS      HPF   PCA   GS
1        0.97  0.95  0.96   1.64  2.19  1.97     3.03  1.97  2.05    2.34  2.08  2.01
2        0.97  0.95  0.96   2.51  3.44  3.12     3.62  1.95  2.05    3.06  2.70  2.58
3        0.97  0.95  0.96   3.19  4.40  3.99     4.61  2.03  2.62    3.90  3.21  3.30
4        0.97  0.97  0.97   1.92  2.15  2.01     3.48  4.91  4.66    2.70  3.53  3.33
Global   0.97  0.95  0.96   2.39  3.19  2.89     3.73  3.00  3.04    3.06  3.09  2.97

PLATFORM: Landsat 8
Band     Q Index            Spectral ERGAS I.    Spatial ERGAS I.    Mean ERGAS I.
         HPF   PCA   GS     HPF   PCA   GS       HPF   PCA   GS      HPF   PCA   GS
1        0.98  0.99  0.99   2.28  1.97  1.99     1.70  1.41  1.34    1.99  1.69  1.66
2        0.98  0.99  0.99   2.39  2.08  2.12     1.40  1.04  1.01    1.90  1.56  1.56
3        0.98  0.99  0.99   2.74  2.39  2.43     1.88  1.56  1.64    2.31  1.97  2.03
Global   0.98  0.99  0.99   2.48  2.15  2.19     1.67  1.35  1.35    2.07  1.75  1.77

PLATFORM: Airborne sensor (Natmur-08)
Band     Q Index            Spectral ERGAS I.    Spatial ERGAS I.    Mean ERGAS I.
         HPF   PCA   GS     HPF   PCA   GS       HPF   PCA   GS      HPF   PCA   GS
1        0.91  0.91  0.91   3.36  3.33  3.37     1.82  1.06  1.03    2.59  2.19  2.20
2        0.91  0.91  0.91   3.78  3.73  3.77     2.04  1.30  1.44    2.91  2.52  2.60
3        0.91  0.92  0.92   3.20  3.07  3.14     2.42  2.10  2.07    2.81  2.58  2.61
4        0.91  0.92  0.92   2.62  2.42  2.52     2.20  2.28  2.28    2.41  2.35  2.40
Global   0.91  0.91  0.91   3.37  3.17  3.23     2.13  1.76  1.78    2.70  2.47  2.50

The Q index and the spectral ERGAS give a better spectral rating to the HPF fusion in the QuickBird, IKONOS and Landsat 7 images, followed by the GS fusion. However, in the case of the Landsat 8 and Natmur-08 images, the results are much more even. This indicates that the evaluation in spectral terms cannot be decisive for the Landsat 8 and Natmur-08 images used in this investigation.

Regarding the spatial ERGAS index, the PCA fusion was the best option for the QuickBird, IKONOS, Landsat 7 and Natmur-08 images, followed by GS. Only for Landsat 8 did the GS method produce the best results according to this index. With respect to the comparatively low ratings obtained for the QuickBird and IKONOS images with the spatial ERGAS index in the three fusion methods, it is interesting to note that these two images have the highest spatial resolution ratios in the present research. They also contain a large proportion of urban cover, which underlines the importance of a good understanding of the principles involved in the different image fusion methods, as well as of the characteristics of the data to be integrated in the fusion processes [32].

It is interesting to note that, whereas for the evaluation of the spectral components the HPF fusion tended to present the best results and the PCA fusion the worst, the opposite happens when analysing the spatial components.

An average between the spectral and spatial ERGAS indices is also included in Tab. 2, with the GS fusion showing the best results.

4. CONCLUSIONS

The objective of this study was (i) to implement three image fusion techniques (High Pass Filter, Principal Component Analysis and Gram-Schmidt) in open-source software (R); (ii) to apply these techniques to fuse multispectral and panchromatic images from four satellite platforms (QuickBird, IKONOS, Landsat 7 and Landsat 8) and an airborne platform (project Natmur-08); and (iii) to evaluate the results qualitatively, by means of a visual comparison, and quantitatively, with three quality indices also implemented in R (the universal image quality index, the spectral ERGAS index and the spatial ERGAS index).

R, an open-source data analysis environment, allows large volumes of geospatial data to be managed and both simple tasks and complex processes to be run, while maintaining reliability and enabling the implementation of new algorithms such as those proposed in this research.

A visual comparison of the fused images in relation to the original images shows the usefulness of applying image fusion methodologies such as those implemented in this work.

The qualitative evaluation of the results does not always agree with the quantitative evaluation. Each of these approaches can therefore provide important analytical information, and they should be considered complementary when assessing the quality of an image fusion.

The Q index attributes higher quality to the HPF fusion with the QuickBird, IKONOS and Landsat 7 images. However, with the Landsat 8 and Natmur-08 images, the PCA and GS techniques offer the best results.

According to the mean ERGAS (the average of the spectral and spatial values), the GS fusion offers good results for the IKONOS and Landsat 7 images; for the Landsat 8 and Natmur-08 images, PCA and GS present acceptable results; finally, for the QuickBird image, the HPF method produced the best fused results.

A good understanding of the principles involved in the different image fusion methods - their advantages and limitations - as well as a good knowledge of the characteristics of the data to be fused will help users to take the necessary precautions when selecting the best methodology to obtain reliable results.

ACKNOWLEDGMENTS

This work is partially the result of a postdoctoral contract under the Saavedra Fajardo Programme (20023/SF/16) funded by the CARM (Region of Murcia Authority) through the Séneca Foundation-Agency for Science and Technology. The QuickBird image used in this work was provided by the University of the Armed Forces (Ecuador). The IKONOS image of the city of Cuenca was provided by the University of Azuay (Ecuador).

REFERENCES

[1] Liu, J. and Mason, P., "Essential Image Processing and GIS for Remote Sensing," Wiley-Blackwell (2009).
[2] Chuvieco, E., "Fundamentals of Satellite Remote Sensing. An Environmental Approach," 2nd ed., CRC Press (2016).
[3] Zhang, Y., "Understanding image fusion," Photogrammetric Engineering & Remote Sensing, 70(6), 657-661 (2004).
[4] Zhang, Y. and Mishra, R. K., "A review and comparison of commercially available pan-sharpening techniques for high resolution satellite image fusion," in Geoscience and Remote Sensing Symposium (IGARSS), 182-185 (2012).
[5] Zhang, Y., "Methods for Image Fusion Quality Assessment - A Review, Comparison and Analysis," The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XXXVII, 1101-1109 (2008).
[6] Brunsdon, C. and Comber, L., "An Introduction to R for Spatial Analysis and Mapping," SAGE (2015).
[7] R Development Core Team, "R: A language and environment for statistical computing," R Foundation for Statistical Computing, Vienna, Austria (2009).
[8] DigitalGlobe, "QuickBird. Data Sheet," tech. rep. (2013).
[9] DigitalGlobe, "IKONOS. Data Sheet," tech. rep. (2013).
[10] Chavez, P., Sides, S., and Anderson, J., "Comparison of Three Different Methods to Merge Multiresolution and Multispectral Data: Landsat TM and SPOT Panchromatic," Photogrammetric Engineering & Remote Sensing, 57, 295-303 (1991).
[11] Wald, L., Ranchin, T., and Mangolini, M., "Fusion of Satellite Images of Different Spatial Resolutions: Assessing the Quality of Resulting Images," Photogrammetric Engineering & Remote Sensing, 63(6), 691-699 (1997).
[12] Laben, C. and Brower, B., "Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-Sharpening," United States Patent 6,011,875 (2000).
[13] Ranchin, T. and Wald, L., "Fusion of High Spatial and Spectral Resolution Images: the ARSIS Concept and its Implementation," Photogrammetric Engineering & Remote Sensing, 66(1), 49-61 (2000).
[14] Karathanassi, V., Kolokousis, P., and Ioannidou, S., "A comparison study on fusion methods using evaluation indicators," International Journal of Remote Sensing, 28(10), 2309-2341 (2007). https://doi.org/10.1080/01431160600606890
[15] Gangkofner, U. G., Pradhan, P. S., and Holcomb, D. W., "Optimizing the High-Pass Filter Addition Technique for Image Fusion," Photogrammetric Engineering & Remote Sensing, 74(9), 1107-1118 (2008). https://doi.org/10.14358/PERS.74.9.1107
[16] Yuhendra, H., Kuze, H., and Sri Sumantyo, J., "Performance Analyzing of High Resolution Pan-Sharpening Techniques: Increasing Image Quality for Classification Using Supervised Kernel Support Vector Machine," Selected Topics in Power Systems and Remote Sensing, 260-268 (2010).
[17] Sarp, G., "Spectral and spatial quality analysis of pan-sharpening algorithms: A case study in Istanbul," European Journal of Remote Sensing, 47, 19-28 (2014). https://doi.org/10.5721/EuJRS20144702
[18] Cánovas-García, F. and Alonso-Sarría, F., "Comparación de técnicas de fusión en imágenes de alta resolución espacial," GeoFocus, 14, 144-162 (2014).
[19] Schowengerdt, R., "Reconstruction of Multispatial, Multispectral Image Data Using Spatial Frequency Content," Photogrammetric Engineering & Remote Sensing, 46(10), 1325-1334 (1980).
[20] Chavez, P., Guptill, S., and Bowell, J., "Image Processing Techniques for Thematic Mapper Data," in 50th Annual ASP-ACSM Symposium, American Society of Photogrammetry (1984).
[21] Cliche, C., Bonn, F., and Teillet, P., "Integration of the SPOT Panchromatic Channel into Its Multispectral Mode for Image Sharpness Enhancement," Photogrammetric Engineering & Remote Sensing, 51(3), 311-316 (1985).
[22] Shettigara, V., "A Generalized Component Substitution Technique for Spatial Enhancement of Multispectral Images Using a Higher Resolution Data Set," Photogrammetric Engineering & Remote Sensing, 58(5), 561-567 (1992).
[23] Darvishi Boloorani, A., "Remotely Sensed Data Fusion as a Basis for Environmental Studies: Concepts, Techniques and Applications," Universität zu Göttingen (2008).
[24] Aiazzi, B., Baronti, S., Selva, M., and Alparone, L., "Enhanced Gram-Schmidt Spectral Sharpening Based on Multivariate Regression of MS and Pan Data," 3806-3809 (2006).
[25] Maurer, T., "How to Pan-Sharpen Images Using the Gram-Schmidt Pan-Sharpen Method - a Recipe," ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XL-1/W1, 239-244 (2013). https://doi.org/10.5194/isprsarchives-XL-1-W1-239-2013
[26] European Commission, "Corine Land Cover: Technical Guide," (1997).
[27] Wang, Z. and Bovik, A., "A Universal Image Quality Index," IEEE Signal Processing Letters, 9(3), 81-84 (2002). https://doi.org/10.1109/97.995823
[28] Nussbaum, S. and Menz, G., "Object-Based Image Analysis and Treaty Verification," Springer (2008).
[29] Wald, L., "Quality of high resolution synthesised images: Is there a simple criterion?," in Fusion of Earth Data: Merging Point Measurements, Raster Maps and Remotely Sensed Images, 99-103, SEE/URISCA (2000).
[30] Ozdarici Ok, A. and Akyurek, Z., "Evaluation of Image Fusion Methods on Agricultural Lands," Journal of Earth Science and Engineering, 1, 107-113 (2011).
[31] Lillo-Saavedra, M., Gonzalo, C., Arquero, A., and Martinez, E., "Fusion of multispectral and panchromatic satellite sensor imagery based on tailored filtering in the Fourier domain," International Journal of Remote Sensing, 26(6), 1263-1268 (2005). https://doi.org/10.1080/01431160412331330239
[32] Švab, A. and Oštir, K., "High-resolution Image Fusion: Methods to Preserve Spectral and Spatial Resolution," Photogrammetric Engineering & Remote Sensing, 72(5), 565-572 (2006). https://doi.org/10.14358/PERS.72.5.565
Paúl Pesántez-Cobos, Fulgencio Cánovas-García, and Francisco Alonso-Sarría "Implementing and validating of pan-sharpening algorithms in open-source software", Proc. SPIE 10427, Image and Signal Processing for Remote Sensing XXIII, 104271E (4 October 2017); https://doi.org/10.1117/12.2277543
KEYWORDS: Earth observing sensors; Image fusion; High resolution satellite images; Image enhancement; Image quality; Image analysis; Linear filtering