
# Pyramid-based image empirical mode decomposition for the fusion of multispectral and panchromatic images

Tee-Ann Teo^{1} and Chi-Chung Lau^{2}

*EURASIP Journal on Advances in Signal Processing* **2012**:4

https://doi.org/10.1186/1687-6180-2012-4

© Teo and Lau; licensee Springer. 2012

**Received:** 1 April 2011. **Accepted:** 9 January 2012. **Published:** 9 January 2012.

## Abstract

Image fusion is a fundamental technique for integrating high-resolution panchromatic images and low-resolution multispectral (MS) images. Fused images may enhance image interpretation. Empirical mode decomposition (EMD) is an effective method of decomposing non-stationary signals into a set of intrinsic mode functions (IMFs). Hence, the characteristics of EMD may apply to image fusion techniques. This study proposes a novel image fusion method using a pyramid-based EMD. To improve computational time, the pyramid-based EMD extracts the IMF from the reduced layer. Next, EMD-based image fusion decomposes the panchromatic and MS images into IMFs. The high-frequency IMF of the MS image is subsequently replaced by the high-frequency IMF of the panchromatic image. Finally, the fused image is reconstructed from the mixed IMFs. Two experiments with different sensors were conducted to validate the fused results of the proposed method. The experimental results indicate that the proposed method is effective and promising regarding both visual effects and quantitative analysis.

## Keywords

- image enhancement
- image processing
- multiresolution techniques
- empirical mode decomposition
- image fusion

## 1. Introduction

The development of earth resource satellites mainly focuses on improving spatial and spectral resolutions [1]. As spatial and spectral information are two critical factors enriching the capability of image interpretation, the fusion of high-spatial-resolution and high-spectral-resolution images may increase the usability of satellite images. Most remote sensing applications, such as image interpretation and feature extraction, require both spatial and spectral information; therefore, the demand for fused high-resolution multispectral (MS) images is increasing.

Currently, most optical sensors, such as those of QuickBird, IKONOS, and the SPOT series, are capable of acquiring high spatial resolution panchromatic (Pan) and low spatial resolution MS bands simultaneously. Due to technological constraints and costs, the spatial resolution of the panchromatic image is better than that of the MS image for a given optical sensor. To overcome this limitation, image fusion techniques (also called color fusion, pan-sharpening, or resolution merging) are widely used to obtain a fused image with both high spatial and high spectral information.

The approaches to image fusion may be categorized into three types [2]: projection-substitution, relative spectral contribution, and ARSIS (*Amélioration de la Résolution Spatiale par Injection de Structures*). The Intensity-Hue-Saturation (IHS) transform [3] is one of the best-known fusion algorithms using the projection-substitution method. This method interpolates the MS image to the spatial resolution of the panchromatic image and converts the MS image into intensity, hue, and saturation bands. The intensity band is then replaced with the high-spatial-resolution panchromatic image, and the result is converted back to red, green, and blue bands. However, this method is limited to three-band images.

The projection-substitution category also includes principal component analysis (PCA) [4], independent component analysis (ICA) [5], and other methods. PCA converts an MS image into several components based on eigenvectors and eigenvalues. A high-spatial-resolution panchromatic image replaces the first component of the MS image, which has the largest variance, and the inverse PCA is then performed. The fusion process is similar to the IHS method. Though this method is not constrained by the number of bands, significant color distortion may result.

The relative spectral contribution method utilizes the linear combination of bands to fuse panchromatic and MS images. Brovey transformation [6] is one of the well-known approaches in this category. The fused image is based on a linear combination of panchromatic and MS images.

ARSIS is a multi-scale fusion approach that improves spatial resolution by structural injection. This approach is widely used in image fusion because multi-scale analysis may improve the fusion results. Multi-scale approaches include the Wavelet transform [7], empirical mode decomposition (EMD) [8], parameterized logarithmic image processing [9], and other methods. The Wavelet approach transforms the original images into several high- and low-frequency layers, replaces the high-frequency layers of the MS image with those from the panchromatic image, and applies an inverse Wavelet transform to the mixed layers to construct the fused image. A more detailed comparison of fusion methods is given in [10, 11].

The main difference between the Wavelet and EMD fusion approaches lies in the decomposition. EMD is an empirical method that decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs) [12]; the IMFs are obtained from the signal by an iterative algorithm called the "sifting process." The EMD method is widely used in one-dimensional signal processing as well as in two-dimensional image processing. Wavelet decomposition relies on a predefined Wavelet basis, whereas EMD is a non-parametric, data-driven process that does not require a predetermined basis. The EMD fusion approach is similar to the Wavelet fusion approach in that it replaces the high-frequency components of the MS image with those from the panchromatic image.

EMD can be applied in many image processing applications, such as noise reduction [13, 14], texture analysis [15], image compression [16], image zooming [17], and feature extraction [18, 19]. Because the algorithm of image fusion via EMD is not yet mature, only a small number of studies have reported on image fusion using EMD. Hariharan et al. [20] combined visual and thermal images using the EMD method. First, the two-dimensional image is vectorized into a one-dimensional vector to allow one-dimensional EMD decomposition. A set of weights is then applied to the IMFs. Finally, the weighted IMFs are combined to reconstruct the fused image. From the visual aspect, their experimental results show that the EMD method is better than the Wavelet and PCA methods. Liu et al. [21] used a bidimensional EMD method in image fusion; the results demonstrate that the EMD method may preserve both spatial and spectral information. The authors also indicated that two-dimensional EMD is a highly time-consuming process.

Wang et al. [8] integrated QuickBird panchromatic and MS images using the EMD method. Row-column decomposition is selected to decompose the image in rows and columns separately using one-dimensional EMD. The quantitative evaluation demonstrates that the EMD algorithm may provide more favorable results than either the IHS or Brovey method. Chen et al. [22] combined the Wavelet transform and EMD in the fusion of QuickBird satellite images, applying a similar row-column decomposition in the fusion process. Their experiment also substantiates the promising results of the EMD fusion method.

EMD was originally developed to decompose one-dimensional data. Most EMD-based fusion methods use row-column decomposition schemes rather than two-dimensional decomposition. Because the image is two-dimensional, a two-dimensional EMD is more appropriate for image data processing. However, two-dimensional EMD decomposition has seldom been discussed in image fusion.

The sifting process of two-dimensional EMD is iterative, and involves three main steps: (1) determining the extreme points; (2) interpolating the extreme points for the mean envelope; and (3) subtracting the mean envelope from the signal. Determining the extreme points and performing the interpolation in two-dimensional space is considerably time consuming. Therefore, a new method to improve computation performance is necessary.

The objective of this study is to establish an image fusion method using a pyramid-based EMD. The proposed method reduces the spatial resolution of the original image during the sifting process. First, the proposed method determines and interpolates the extreme points of the reduced image. Then the results are expanded to obtain the mean envelope with identical dimensions to the original image.

The proposed method comprises three main steps: (1) the decomposition of panchromatic and MS images using pyramid-based EMD; (2) image fusion using the mixed IMFs of panchromatic and MS images; and (3) quality assessment of the fused image. The test data include SPOT images of a forest area and QuickBird images of a suburban area. The quality assessment considers two distinct aspects: the visual and quantifiable. Fusion results of the modified IHS, PCA, and wavelet methods are also provided for comparison.

This study establishes a novel image fusion method using a pyramid-based EMD. The proposed method can improve the computational performance of two-dimensional EMD in image fusion, and can also be applied to EMD-based image fusion. The major contribution of this study is the improvement of the computational performance of two-dimensional EMD using image pyramids. The proposed method extracts the mean envelope of the coarse image, and resamples the mean envelope to equal the original size during the sifting process. The benefits of the proposed method are reduced computation time for extreme point extraction and interpolation.

This article is organized as follows. Section 2 presents the proposed pyramid-based EMD fusion method. Section 3 shows the experimental results from using different image fusion methods. This study also compares and discusses one- and two-dimensional EMD in image fusion. Finally, a conclusion is presented in Section 4.

## 2. The proposed scheme

This section introduces the basic ideas and procedures of one-dimensional EMD and row-column EMD. One-dimensional EMD is then extended to two dimensions, followed by the technical details of pyramid-based two-dimensional EMD. The final part describes EMD-based image fusion.

### 2.1. One-dimensional EMD

EMD decomposes a signal into a finite set of IMFs. An IMF is defined as a function in which the number of extreme points and the number of zero crossings are the same or differ by at most one [12]. The IMFs are obtained through an iterative procedure called the sifting process, briefly described below.
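This admissibility condition is easy to test directly. The sketch below counts sign changes of the first difference (extrema) and of the signal itself (zero crossings); the sampling rate and tolerance are illustrative assumptions, not values from the paper.

```python
import numpy as np

def imf_property_holds(x, tol=1):
    """IMF condition from Section 2.1: the numbers of extreme points and
    zero crossings are the same or differ by at most one."""
    x = np.asarray(x, dtype=float)
    d = np.diff(x)
    extrema = int(np.sum(np.sign(d[:-1]) * np.sign(d[1:]) < 0))    # strict local max/min
    crossings = int(np.sum(np.sign(x[:-1]) * np.sign(x[1:]) < 0))  # sign changes of x
    return abs(extrema - crossings) <= tol

t = np.linspace(0.0, 1.0, 200)
print(imf_property_holds(np.sin(2 * np.pi * 5 * t)))        # a pure tone satisfies the condition
print(imf_property_holds(np.sin(2 * np.pi * 5 * t) + 2.0))  # an offset removes all zero crossings
```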

Step 1. Determine the local maxima and minima of the current input signal *h*_{(i, j)}(*t*), where *i* is the number of the IMF and *j* is the number of iteration. In the first iteration, *h*_{(1,1)}(*t*) is the original time series signal *X*(*t*).

Step 2. Compute the upper and lower envelopes *u*_{(i, j)}(*t*) and *l*_{(i, j)}(*t*) by interpolating the local maxima and minima using cubic spline interpolation.

Step 3. Compute the mean envelope *m*_{(i, j)}(*t*) from the upper and lower envelopes, as shown in (1).

*m*_{(i, j)}(*t*) = (*u*_{(i, j)}(*t*) + *l*_{(i, j)}(*t*))/2    (1)

Step 4. Subtract *h*_{(i, j)}(*t*) by the mean envelope to obtain the sifting result *h*_{(i, j+1)}(*t*), as shown in (2).

*h*_{(i, j+1)}(*t*) = *h*_{(i, j)}(*t*) - *m*_{(i, j)}(*t*)    (2)

If *h*_{(i, j+1)}(*t*) satisfies the requirement of the IMF, then *h*_{(i, j+1)}(*t*) is IMF_{i}(*t*); subtract the original *X*(*t*) by this IMF_{i}(*t*) to obtain the residual *r*_{i}(*t*), treat *r*_{i}(*t*) as the input data, and repeat Step 1. If *h*_{(i, j+1)}(*t*) does not satisfy the requirement of the IMF, *h*_{(i, j+1)}(*t*) is treated as the input data and Step 1 is repeated.

Step 5. The process terminates when the residual *r*(*t*) is smaller than a predefined value. At the end, the signal *X*(*t*) is decomposed into several IMFs and a residual *r*_{n}(*t*), as shown in (3).

*X*(*t*) = Σ_{i=1}^{n} IMF_{i}(*t*) + *r*_{n}(*t*)    (3)

Equation 3 shows that *X*(*t*) can be reconstructed from the IMFs and residual without information loss. More details of the basic theory of EMD are discussed in [12].
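The sifting steps above can be sketched in a few lines of Python. This is a minimal one-dimensional illustration using SciPy cubic splines; the endpoint handling, which simply appends the boundary samples as spline knots, is a simplifying assumption rather than the paper's boundary treatment.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_once(h):
    """One sifting iteration (Steps 1-4): spline the local maxima and minima
    into upper/lower envelopes, average them (Equation 1), and subtract the
    mean envelope from the signal (Equation 2)."""
    h = np.asarray(h, dtype=float)
    t = np.arange(len(h))
    d = np.diff(h)
    # Step 1: strict local maxima and minima of the input signal
    maxima = np.where((np.hstack([d, -1.0]) < 0) & (np.hstack([1.0, d]) > 0))[0]
    minima = np.where((np.hstack([d, 1.0]) > 0) & (np.hstack([-1.0, d]) < 0))[0]
    # boundary samples are appended as knots (a simplifying assumption)
    maxima = np.unique(np.concatenate(([0], maxima, [len(h) - 1])))
    minima = np.unique(np.concatenate(([0], minima, [len(h) - 1])))
    # Step 2: upper and lower envelopes u(t) and l(t) by cubic splines
    u = CubicSpline(maxima, h[maxima])(t)
    l = CubicSpline(minima, h[minima])(t)
    # Steps 3-4: mean envelope (1) and sifting result (2)
    m = (u + l) / 2.0
    return h - m, m

t = np.linspace(0.0, 1.0, 512)
x = np.sin(2 * np.pi * 20 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)
h1, m1 = sift_once(x)   # m1 tracks the slow oscillation, h1 the fast one
```

By construction, the sifting result and the mean envelope sum back to the input, mirroring the lossless reconstruction of Equation (3).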

### 2.2. Row-column EMD

EMD was originally developed to handle one-dimensional data. To apply this method to two-dimensional data, a row-column EMD [22] has been proposed based on one-dimensional EMD. The purpose of row-column EMD is to perform EMD on the rows and columns separately, determining and interpolating the extreme points in one-dimensional space. The row-column EMD process is briefly described below.

Step 1. Determine the local maxima and minima of the current input image *h*_{(i, j)}(*p, q*), and perform cubic spline interpolation for the upper and lower envelopes *ur*_{(i, j)}(*p, q*) and *lr*_{(i, j)}(*p, q*) systematically by row. The upper and lower envelopes *uc*_{(i, j)}(*p, q*) and *lc*_{(i, j)}(*p, q*) along the columns are also generated, where *i* is the number of IMFs and *j* is the number of the iteration. In the first iteration, *h*_{(1,1)}(*p, q*) is the original image *X*(*p, q*). Figure 1 illustrates the extreme point extraction using the row-column method.

Step 2. Compute the mean envelope *m*_{(i, j)}(*p, q*) from the upper and lower envelopes along rows and columns, as shown in (4).

*m*_{(i, j)}(*p, q*) = (*ur*_{(i, j)}(*p, q*) + *lr*_{(i, j)}(*p, q*) + *uc*_{(i, j)}(*p, q*) + *lc*_{(i, j)}(*p, q*))/4    (4)

Step 3. Subtract *h*_{(i, j)}(*p, q*) by the mean envelope to obtain the sifting result *h*_{(i, j+1)}(*p, q*), as shown in (5).

*h*_{(i, j+1)}(*p, q*) = *h*_{(i, j)}(*p, q*) - *m*_{(i, j)}(*p, q*)    (5)

If *m*_{(i, j)}(*p, q*) satisfies the requirement of the IMF, then *h*_{(i, j+1)}(*p, q*) is *IMF*_{i}(*p, q*); subtract the original signal by this *IMF*_{i}(*p, q*) to obtain the residual *r*_{i}(*p, q*), which is treated as the next input data, and Step 1 is repeated. If *m*_{(i, j)}(*p, q*) does not satisfy the requirement of the IMF, then *h*_{(i, j+1)}(*p, q*) is treated as the input data and Step 1 is repeated.

Step 4. The process terminates when the residual *r*(*p, q*) is smaller than a predefined value. At the end, the image *X*(*p, q*) is decomposed into several high- to low-frequency IMFs and a residual *r*_{n}(*p, q*), as shown in (6).

*X*(*p, q*) = Σ_{i=1}^{n} *IMF*_{i}(*p, q*) + *r*_{n}(*p, q*)    (6)

Equation 6 also demonstrates that the original image can be reconstructed from the IMFs and residual without losing information.
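Equation (4) can be illustrated directly: the sketch below builds the four envelopes with one-dimensional cubic splines along rows and columns and averages them. Boundary samples are appended as spline knots, a simplifying assumption rather than the paper's boundary treatment.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def envelopes_1d(v):
    """Upper and lower cubic-spline envelopes of a 1-D signal; the boundary
    samples are appended as knots (a simplifying assumption)."""
    v = np.asarray(v, dtype=float)
    t = np.arange(len(v))
    d = np.diff(v)
    up = np.where((np.hstack([d, -1.0]) < 0) & (np.hstack([1.0, d]) > 0))[0]
    lo = np.where((np.hstack([d, 1.0]) > 0) & (np.hstack([-1.0, d]) < 0))[0]
    up = np.unique(np.concatenate(([0], up, [len(v) - 1])))
    lo = np.unique(np.concatenate(([0], lo, [len(v) - 1])))
    return CubicSpline(up, v[up])(t), CubicSpline(lo, v[lo])(t)

def row_column_mean_envelope(h):
    """Mean envelope of Equation (4): average of the row-wise (ur, lr) and
    column-wise (uc, lc) upper and lower envelopes."""
    h = np.asarray(h, dtype=float)
    ur, lr = np.empty_like(h), np.empty_like(h)
    uc, lc = np.empty_like(h), np.empty_like(h)
    for p in range(h.shape[0]):                  # envelopes systematically by row
        ur[p], lr[p] = envelopes_1d(h[p])
    for q in range(h.shape[1]):                  # envelopes by column
        uc[:, q], lc[:, q] = envelopes_1d(h[:, q])
    return (ur + lr + uc + lc) / 4.0

pp, qq = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
img = np.sin(2 * np.pi * 8 * pp / 64) + np.sin(2 * np.pi * 8 * qq / 64)
m = row_column_mean_envelope(img)   # subtracting m gives the sifting result of (5)
```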

### 2.3. Pyramid-based EMD

This study proposed pyramid-based EMD to avoid the striping effect of row-column EMD. Two-dimensional EMD determines and interpolates the extreme points of a two-dimensional space rather than one-dimensional space. The main difference between pyramid-based and row-column EMD is the generation of a mean envelope. The additional image pyramid improves the computation performance of two-dimensional EMD. The process of pyramid-based two-dimensional EMD is described below.

Step 1. Reduce the input image from *h*_{(i, j)}(*p, q*) to *h*_{(i, j)}(*p*_{g}, *q*_{g}) using a Gaussian image pyramid [23], where *i* is the number of the IMF, *j* is the number of the iteration, and *g* is the number of the pyramid layer. In the first iteration, *h*_{(1,1)}(*p*_{g}, *q*_{g}) is the reduced original image *X*(*p*_{g}, *q*_{g}). The reduction scale is related to the smoothness of the input image and the EMD computation time.

Step 2. Determine the local maxima and minima of *h*_{(i, j)}(*p*_{g}, *q*_{g}) using the openness strategy [24]. Morphological filters [16, 25] are frequently used to determine the local maxima and minima for two-dimensional EMD; however, extracting the extreme points in a low-frequency image is difficult. To overcome this problem, this study adopts a surface operator called "openness." Openness is defined as a measure of surface relief in terms of zenith and nadir angles, as shown in Figure 3; it is an angular measure of the relationship between surface relief and horizontal distance. The local maxima and minima points are therefore determined by the slope between the center point and the surrounding points, as shown in Figure 4. Openness is defined by the azimuth direction *D* and the distance *L*. The slope _{D}*θ*_{L} in azimuth *D* is calculated from the height difference Δ*H* and the distance *L*, as shown in (7).

_{D}*θ*_{L} = arctan(Δ*H*/*L*)    (7)

Openness incorporates both positive and negative values related to the value of the slope _{D}*θ*_{L}. Positive openness *φ*_{L} is defined as the average of _{D}*φ*_{L} along eight sampling directions, whereas negative openness *ψ*_{L} is the corresponding average of _{D}*ψ*_{L}, as shown in (8).

*φ*_{L} = (Σ_{D} _{D}*φ*_{L})/8,  *ψ*_{L} = (Σ_{D} _{D}*ψ*_{L})/8    (8)

Positive values describe openness above the surface and indicate the maxima points; negative values describe openness below the surface and indicate the minima points. Figure 4 shows the positive and negative openness of scale *L*. In a high-frequency layer, *L* should be smaller to extract the local extreme points; by contrast, *L* should be larger during low-frequency iterations. Openness is thus well suited to locating local extreme points at different scales. In addition, extreme point selection relates to the surrounding points at different scales rather than only to the neighboring points.

Step 3. Interpolate the extreme points for the upper and lower envelopes *u*_{(i, j)}(*p*_{g}, *q*_{g}) and *l*_{(i, j)}(*p*_{g}, *q*_{g}). Compute the mean envelope *m*_{(i, j)}(*p*_{g}, *q*_{g}) from the upper and lower envelopes, as shown in (9).

*m*_{(i, j)}(*p*_{g}, *q*_{g}) = (*u*_{(i, j)}(*p*_{g}, *q*_{g}) + *l*_{(i, j)}(*p*_{g}, *q*_{g}))/2    (9)

Step 4. Expand the mean envelope to the original image size *m*_{(i, j)}(*p, q*).

Step 5. Subtract *h*_{(i, j)}(*p, q*) by the mean envelope to obtain the sifting result *h*_{(i, j+1)}(*p, q*), as shown in (5). If *m*_{(i, j)}(*p, q*) < ε, then *h*_{(i, j+1)}(*p, q*) is *IMF*_{i}(*p, q*); subtract the original image by this *IMF*_{i}(*p, q*) to obtain the residual *r*_{i}(*p, q*). If *m*_{(i, j)}(*p, q*) > ε, then *h*_{(i, j+1)}(*p, q*) is treated as the input data and Step 1 is repeated. The procedure terminates when *r*_{i}(*p, q*) < ε.

Step 6. At the end, the image *X*(*p, q*) is decomposed into several high- to low-frequency IMFs and a residual *r*_{n}(*p, q*), as shown in (6). Figure 5 is an example of two-dimensional EMD: the original image is decomposed into two IMFs and a residual. The decomposed results are more favorable than those of the row-column EMD shown in Figure 2.
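A compact sketch of one pyramid-based mean-envelope computation is given below. It substitutes a moving-window extremum test and a nearest-extremum fill for the openness operator and the envelope surface fitting, so it illustrates only the reduce/detect/interpolate/expand structure of Steps 1-4, not the authors' exact operators; `levels` and `size` are illustrative parameters.

```python
import numpy as np
from scipy import ndimage

def pyramid_mean_envelope(h, levels=1, size=5):
    """One mean-envelope computation in the spirit of Steps 1-4 of Section 2.3:
    Gaussian-reduce the input, detect extrema on the coarse grid, interpolate
    them into upper/lower surfaces, average, and expand back."""
    h = np.asarray(h, dtype=float)
    coarse = h
    for _ in range(levels):                       # Step 1: Gaussian reduce
        coarse = ndimage.gaussian_filter(coarse, sigma=1.0)[::2, ::2]
    # Step 2: local maxima/minima inside a (size x size) neighbourhood
    maxima = coarse == ndimage.maximum_filter(coarse, size=size)
    minima = coarse == ndimage.minimum_filter(coarse, size=size)

    def fill(mask):
        # Step 3: spread each extremum value to its nearest neighbours, then
        # smooth -- a cheap surrogate for envelope surface interpolation
        idx = ndimage.distance_transform_edt(
            ~mask, return_distances=False, return_indices=True)
        return ndimage.gaussian_filter(coarse[tuple(idx)], sigma=size)

    mean_coarse = (fill(maxima) + fill(minima)) / 2.0
    # Step 4: expand the mean envelope back to the original image size
    factors = np.array(h.shape) / np.array(mean_coarse.shape)
    return ndimage.zoom(mean_coarse, factors, order=1)

rng = np.random.default_rng(0)
img = ndimage.gaussian_filter(rng.standard_normal((64, 64)), 3.0)
m = pyramid_mean_envelope(img, levels=1)   # same shape as img, computed on a 32x32 grid
```

The point of the pyramid is visible here: the extremum search and the interpolation both run on the reduced grid, and only the final expansion touches the full resolution.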

### 2.4. EMD-based image fusion

Because only the high-frequency IMF of the MS image is replaced by that of the panchromatic image, the remaining IMFs do not affect the image fusion results, and the decomposition process can be simplified: this study decomposes each image into only two layers, a high-frequency IMF and a low-frequency remainder, for image fusion. The EMD image fusion process is as follows. For data preprocessing, the panchromatic and MS images are registered in the same coordinate system, and the MS image is resampled to match the size of the panchromatic image. Both images are then decomposed by the proposed EMD into IMFs and a residual. The first IMF of the panchromatic image replaces the first IMF of the MS image. Finally, the fused image is obtained by reconstructing the mixed IMFs of the MS image; the reconstruction combines the mixed IMFs and the residual, as shown in Equation 6.
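The fusion pipeline of this section can be outlined as follows. For brevity, a Gaussian low-pass stands in for the EMD mean envelope in the two-layer decomposition, so this is a structural sketch of the IMF-swap idea, not the pyramid-based EMD itself:

```python
import numpy as np
from scipy import ndimage

def two_layer_decompose(img, sigma=2.0):
    """Split an image into a high-frequency layer and a remainder, mirroring
    the two-IMF decomposition used for fusion. A Gaussian low-pass stands in
    for the EMD mean envelope, so this is an illustrative surrogate only."""
    low = ndimage.gaussian_filter(np.asarray(img, dtype=float), sigma)
    return img - low, low                      # (first "IMF", residual)

def emd_style_fusion(pan, ms_band, sigma=2.0):
    """Replace the first (high-frequency) component of an MS band by that of
    the Pan image, then reconstruct by summation, as in Equation (6)."""
    high_pan, _ = two_layer_decompose(pan, sigma)
    _, low_ms = two_layer_decompose(ms_band, sigma)
    return high_pan + low_ms

# toy example: an upsampled low-resolution band fused with a high-resolution image
rng = np.random.default_rng(1)
pan = rng.random((64, 64))
ms_band = ndimage.zoom(ndimage.zoom(rng.random((64, 64)), 0.25), 4.0)
fused = emd_style_fusion(pan, ms_band)     # keeps MS low frequencies, injects Pan detail
```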

## 3. Experimental results

Related information of the test data:

| | Case I | Case II |
|---|---|---|
| Location | Alishan, Taiwan | Hsinchu, Taiwan |
| Gray level (bits) | 8 | 11 |
| Test area (m × m) | 2900 × 3100 | 716.8 × 716.8 |
| Pan sensor | SPOT-5 | QuickBird |
| Pan spatial resolution (m) | 2.5 (Supermode) | 0.7 |
| MS sensor | SPOT-4 | QuickBird |
| MS spatial resolution (m) | 20 | 2.8 |
| MS bands | G, R, NIR | G, R, NIR |

The quality assessment includes visual and quantitative aspects. Regarding the visual aspect, the fused and the original MS images are visually compared. Both row-column and pyramid-based EMD are applied during image fusion to enable a comparison. In addition, this study employed the commercial software ERDAS Imagine 2010 to fuse the images using different methods, including modified IHS [26], PCA, and Wavelet; these images were then compared with the images fused using the EMD methods.

The experiment required establishing a number of parameters. Because the purpose of EMD here is image fusion, the image was only decomposed into two components: a high-frequency layer and a remainder layer. The stopping criterion is 99% or a mean envelope of less than 2 pixels. The image pyramid scales are reduced by one and two layers. The window of openness is 5-15 pixels in different iterations, and the thresholds of the minima points for positive openness and of the maxima points for negative openness were both less than 75 degrees. The experimental results are discussed in the following sections.

### 3.1. Quality evaluation of the fused image

The quality assessment considers both the visual and quantitative aspects, and refers to both spatial and spectral quality. In other words, the fusion method should improve the spatial resolution while preserving the spectral content. Several indices are selected to evaluate the quality of a fused image. The experiment compares the fused image with the original MS image to ensure spectral fidelity. The three spectral indices are the RMSE [27], ERGAS [27], and the correlation coefficient; the spatial index is the entropy of the image.

#### 3.1.1. Root mean square error (RMSE)

RMSE = √(Bias² + SDD²)

where Bias is the difference between the mean values of the MS and fused images, and SDD is the standard deviation of the difference between the MS and fused images.

#### 3.1.2. Erreur relative globale adimensionnelle de synthèse (ERGAS)

ERGAS = 100 (*h*/*l*) √((1/*N*) Σ_{i=1}^{N} RMSE(*B*_{i})²/*M*_{i}²)

where *h* and *l* are the resolutions of the Pan and MS images, respectively, *N* is the number of spectral bands (*B*_{i}), and *M*_{i} is the mean value of each spectral band.

#### 3.1.3. Correlation coefficient

*C* = Σ_{i,j} (*F*(*i, j*) - *μ*_{F})(*M*(*i, j*) - *μ*_{M}) / √(Σ_{i,j} (*F*(*i, j*) - *μ*_{F})² Σ_{i,j} (*M*(*i, j*) - *μ*_{M})²)

where *C* is the correlation coefficient, *F*(*i, j*) and *M*(*i, j*) are the gray values of the fused and MS images, respectively, *μ*_{F} is the mean of the fused image, *μ*_{M} is the mean of the MS image, and the sums run over the image dimensions *i* = 1, ..., *m* and *j* = 1, ..., *n*.

#### 3.1.4. Entropy

*E* = -Σ_{k} *P*_{k} log₂ *P*_{k}

where *E* is the entropy and *P*_{k} is the probability of gray value *k* in the image.
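The four indices can be computed as below. This is a plain NumPy sketch; the 256-bin entropy assumes 8-bit gray levels, and `h_over_l` is the ratio of Pan to MS pixel sizes (e.g. 2.5/20 for Case I).

```python
import numpy as np

def rmse(ms, fused):
    """RMSE from the bias and the standard deviation of the difference (SDD);
    equivalent to the root of the mean squared difference."""
    diff = np.asarray(ms, dtype=float) - np.asarray(fused, dtype=float)
    bias, sdd = diff.mean(), diff.std()
    return np.sqrt(bias ** 2 + sdd ** 2)

def ergas(ms_bands, fused_bands, h_over_l):
    """ERGAS over paired band lists; h_over_l is the Pan/MS pixel-size ratio."""
    terms = [rmse(m, f) ** 2 / np.mean(m) ** 2 for m, f in zip(ms_bands, fused_bands)]
    return 100.0 * h_over_l * np.sqrt(np.mean(terms))

def correlation(ms, fused):
    """Correlation coefficient between the MS and fused images."""
    return np.corrcoef(np.ravel(ms), np.ravel(fused))[0, 1]

def entropy(img, bins=256):
    """Shannon entropy of the gray-level histogram; `img` is assumed to be a
    non-negative integer array (e.g. 8-bit gray levels)."""
    p = np.bincount(np.ravel(img), minlength=bins) / np.size(img)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))
```

A perfect fusion would give RMSE and ERGAS of zero, correlation of one, and an entropy no lower than that of the original band.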

### 3.2. Case I

Statistical information of SPOT image.

| Item | Band | ERGAS | RMSE | Correlation | Entropy |
|---|---|---|---|---|---|
| Row-column EMD | 1 | 3.322 | 32.389 | 0.990 | 7.345 |
| | 2 | | 16.525 | 0.942 | 5.430 |
| | 3 | | 11.953 | 0.903 | 5.247 |
| Pyramid EMD, reduce 1 | 1 | 0.836 | 4.112 | 0.991 | 7.574 |
| | 2 | | 3.867 | 0.979 | 5.493 |
| | 3 | | 3.851 | 0.962 | 5.231 |
| Pyramid EMD, reduce 2 | 1 | 1.033 | 5.781 | 0.973 | 7.292 |
| | 2 | | 4.933 | 0.960 | 5.973 |
| | 3 | | 4.522 | 0.940 | 5.909 |
| Modified IHS | 1 | 3.705 | 29.754 | 0.805 | 7.552 |
| | 2 | | 15.939 | 0.862 | 5.752 |
| | 3 | | 15.769 | 0.899 | 5.879 |
| PCA | 1 | 12.049 | 88.805 | 0.864 | 5.115 |
| | 2 | | 12.826 | 0.942 | 4.942 |
| | 3 | | 11.511 | 0.913 | 4.369 |
| Wavelet | 1 | 1.071 | 17.169 | 0.937 | 6.978 |
| | 2 | | 2.487 | 0.991 | 5.397 |
| | 3 | | 2.241 | 0.987 | 5.092 |

### 3.3. Case II

Statistical information of QuickBird image

| Item | Band | ERGAS | RMSE | Correlation | Entropy |
|---|---|---|---|---|---|
| Row-column EMD | 1 | 4.645 | 70.855 | 0.925 | 5.877 |
| | 2 | | 70.702 | 0.912 | 5.952 |
| | 3 | | 70.808 | 0.931 | 6.335 |
| Pyramid EMD, reduce 1 | 1 | 1.874 | 28.65 | 0.980 | 6.320 |
| | 2 | | 28.609 | 0.976 | 6.200 |
| | 3 | | 28.699 | 0.980 | 6.660 |
| Pyramid EMD, reduce 2 | 1 | 1.917 | 29.662 | 0.930 | 6.230 |
| | 2 | | 29.243 | 0.930 | 6.210 |
| | 3 | | 29.897 | 0.930 | 6.590 |
| Modified IHS | 1 | 3.552 | 61.455 | 0.912 | 6.584 |
| | 2 | | 45.528 | 0.940 | 6.678 |
| | 3 | | 68.916 | 0.875 | 6.889 |
| PCA | 1 | 21.974 | 191.667 | 0.897 | 5.568 |
| | 2 | | 171.879 | 0.902 | 5.701 |
| | 3 | | 88.590 | 0.973 | 5.800 |
| Wavelet | 1 | 2.849 | 47.515 | 0.944 | 6.257 |
| | 2 | | 43.320 | 0.945 | 6.403 |
| | 3 | | 36.660 | 0.966 | 7.070 |

## 4. Conclusions

This article proposes an EMD-based image fusion method using image pyramids. The proposed method uses image pyramids to improve the computation performance of two-dimensional EMD, and an openness strategy is proposed for extracting the extreme points in two-dimensional EMD. This experimental study uses SPOT and QuickBird images of distinct areas to evaluate the proposed method and compare the results with other fusion approaches. The experimental results demonstrate the improvement of pyramid-based two-dimensional EMD; used in image decomposition, this method may overcome the striping effect of row-column EMD. In addition, the proposed method is sensor-independent and can be applied to the integration of heterogeneous sensors, such as optical and radar images.

## Declarations

### Acknowledgements

This study was supported in part by the Industrial Technology Research Institute of Taiwan and the National Science Council of Taiwan under Project NSC 99-2221-E-009-131. The authors would like to thank the Center for Space and Remote Sensing Research at National Central University in Taiwan for providing the test data sets.


## References

1. Li Z, Chen J, Baltsavias E: *Advances in Photogrammetry, Remote Sensing and Spatial Information Sciences: 2008 ISPRS Congress Book*. Taylor & Francis Group; 2008.
2. Wald L, Ranchin T, Mangolini M: Fusion of satellite images of different spatial resolutions: assessing the quality of resulting images. *Photogram Eng Remote Sens* 1997, 63(6):691-699.
3. Carper WJ, Lillesand TM, Kiefer RW: The use of intensity-hue-saturation transformations for merging SPOT panchromatic and multispectral image data. *Photogram Eng Remote Sens* 1990, 56(4):459-467.
4. Chavez JPS, Sides SC, Anderson JA: Comparison of three different methods to merge multiresolution and multispectral data: Landsat TM and SPOT panchromatic. *Photogram Eng Remote Sens* 1991, 57(3):295-303.
5. Zhang G, Wang L, Zhang H: A fusion algorithm of high spatial and spectral resolution images based on ICA. *Int Arch Photogram Remote Sens Spatial Inf Sci* 2008, XXXVII(B7):1295-1300.
6. Pohl C, van Genderen JL: Multisensor image fusion in remote sensing: concepts, methods and applications. *Int J Remote Sens* 1998, 19:823-854.
7. Jorge N, Xavier O, Octavi F, Albert P, Vicenc P, Roman A: Multiresolution-based image fusion with additive wavelet decomposition. *IEEE Trans Geosci Remote Sens* 1999, 37(3):1204-1211.
8. Wang J, Zhang J, Liu Z: EMD based multi-scale model for high resolution image fusion. *Geospatial Inf Sci* 2008, 11(1):31-37.
9. Nercessian SC, Panetta KA, Agaian SS: Multiresolution decomposition schemes using the parameterized logarithmic image processing model with application to image fusion. *EURASIP J Adv Signal Process* 2011, 2011:17. (Article ID 515084)
10. Thomas C, Ranchin T, Wald L, Chanussot J: Synthesis of multispectral images to high spatial resolution: a critical review of fusion methods based on remote sensing physics. *IEEE Trans Geosci Remote Sens* 2008, 46(5):1301-1312.
11. Alparone L, Wald L, Chanussot J, Thomas C, Gamba P, Bruce LM: Comparison of pansharpening algorithms: outcome of the 2006 GRS-S data-fusion contest. *IEEE Trans Geosci Remote Sens* 2007, 45(10):3012-3021.
12. Huang NE, Shen Z, Long SR, Wu ML, Shih HH, Zheng Q, Yen NC, Tung CC, Liu HH: The empirical mode decomposition and the Hilbert spectrum for nonlinear and nonstationary time series analysis. *Proc R Soc Lond A* 1998, 454:903-995.
13. Han C, Guo H, Wang C, Fan D: A novel method to reduce speckle in SAR images. *Int J Remote Sens* 2002, 23(23):5091-5101.
14. Bernini MB, Federico A, Kaufmann GH: Noise reduction in digital speckle pattern interferometry using bidimensional empirical mode decomposition. *Appl Opt* 2008, 47(14):2592-2598.
15. Nunes JC, Bouaoune Y, Delechelle E, Niang O, Bunel Ph: Image analysis by bidimensional empirical mode decomposition. *Image Vis Comput* 2003, 21:1019-1026.
16. Linderhed A: Image empirical mode decomposition: a new tool for image processing. In *The First International Conference on the Advances of Hilbert-Huang Transform and Its Application*. Jhung-Li, Taiwan; 2006.
17. Tian Y, Huang Y, Li Y: Image zooming method using 2D EMD technique. *Proceedings of the IEEE 6th World Congress on Intelligent Control and Automation* 2006, 2:10036-10040.
18. Ayenu-Prah A, Attoh-Okine N: Evaluating pavement cracks with bidimensional empirical mode decomposition. *EURASIP J Adv Signal Process* 2008, 2008:7. (Article ID 861701)
19. Khan JF, Barner K, Adhami R: Feature point detection utilizing the empirical mode decomposition. *EURASIP J Adv Signal Process* 2008, 2008:13. (Article ID 287061)
20. Hariharan H, Gribok A, Abidi M, Koschan A: Image fusion and enhancement via empirical mode decomposition. *J Pattern Recogn Res* 2006, 1(1):16-32.
21. Liu Z, Song P, Zhang J, Wang J: Bidimensional empirical mode decomposition for the fusion of multispectral and panchromatic images. *Int J Remote Sens* 2007, 28(18):4081-4093.
22. Chen S, Su Y, Zhang R, Tian J: Fusing remote sensing images using à trous wavelet transform and empirical mode decomposition. *Pattern Recogn Lett* 2008, 29:330-342.
23. Richards JA, Jia X: *Remote Sensing Digital Image Analysis: An Introduction*. 3rd edition. Springer; 1999.
24. Yokoyama R, Shirasawa M, Pike R: Visualizing topography by openness: a new application of image processing to digital elevation models. *Photogram Eng Remote Sens* 2002, 68(3):257-265.
25. Nunes JC, Delechelle E: Empirical mode decomposition: applications on signal and image processing. *Adv Adapt Data Anal* 2008, 1(1):125-175.
26. Siddiqui Y: The modified IHS method for fusing satellite imagery. In *Proceedings of the ASPRS Annual Conference*. Reno, Nevada, USA; 2006.
27. Otazu X, González-Audícana M, Fors O, Núñez J: Introduction of sensor spectral response into image fusion methods: application to wavelet-based methods. *IEEE Trans Geosci Remote Sens* 2005, 43(10):2379-2385.

## Copyright

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.