JEI Letters

Texture features based on local Fourier histogram: self-compensation against rotation

Author Affiliations
Ahsan Ahmad Ursani

IETR, Image and Remote Sensing Group, INSA de Rennes, 20, Avenue des Buttes de Coësmes, 35043 Rennes Cedex, France

IICT, Mehran University of Engineering & Technology, Jamshoro 76062 Sindh, Pakistan

Kidiyo Kpalma, Joseph Ronsin

IETR, Image and Remote Sensing Group, INSA de Rennes, 20, Avenue des Buttes de Coësmes, 35043 Rennes Cedex, France

J. Electron. Imaging. 17(3), 030503 (August 05, 2008). doi:10.1117/1.2965439
History: Received January 25, 2008; Revised June 19, 2008; Accepted June 23, 2008; Published August 05, 2008

Open Access

We present a method of introducing rotation invariance in texture features based on a local Fourier histogram (LFH) computed using a 1-D discrete Fourier transform (DFT). To compensate for image rotation, a local image-gradient angle at each image pixel is found from one of the 1-D DFT coefficients. The rotation invariance is established both analytically and empirically. The rotation-compensated features extracted from the same texture image oriented at different angles exhibit very high cross correlation. Therefore, the proposed texture features are expected to yield very high accuracies for a variety of image data and applications. The improved LFH-based features outperform the earlier version of the features, as well as features based on Gabor filters, in texture recognition on 8560 images from the Brodatz album.


Texture features play an important role in several image-processing applications ranging from computer vision and medical image processing to remote sensing and content-based image retrieval. Almost all the texture processing applications require rotation invariance in the texture features, which we achieve here in a very simple and cost-effective manner. Reference 1 categorizes the wide range of texture features proposed to date into two broad categories and compares them: features that use a large bank of filters or wavelets and features that use immediate pixel neighborhood properties. It shows that the latter outperforms the former. Hence, we take on improving a feature set from the latter category. In Ref. 2, texture features are extracted using a 1-D discrete Fourier transform (DFT) of the circular neighborhood around a pixel. It proposes computing a 1-D DFT of the 8-pixel sequence around each image pixel and uses magnitudes of the DFT coefficients to extract texture features. More recent work3 extracts similar texture features from the square neighborhood and calls it a local Fourier histogram (LFH)-based feature set. The LFH-based feature set was shown to perform better than the texture features extracted from a large filter bank of Gabor filters,4 which are computationally more expensive than the LFH-based features. In this work, we augment the LFH-based feature set by using the phases of the DFT coefficients as texture features as well. However, the improvement suggested herein equally applies to the texture features extracted from the circular neighborhood.2 Since the phases are sensitive to image rotation, we also present a method to make them rotation invariant. This does not cause any additional computational cost, but does improve performance.

The following sections explain how the LFH-based features are extracted, how the local image gradient angle is determined from the features themselves, and how the image gradient angle is used to compensate the features against rotation. Results are presented before concluding the paper.

The texture features proposed in Ref. 3 are extracted in the spatial domain by taking a 1-D DFT of the 8-pixel sequence x0 through x7, hereafter called x, around a central pixel as shown in Fig. 1. We use the local image gradient at the central pixel to compensate the extracted features for the effects of image rotation.

Fig. 1: 9-pixel neighborhood in the spatial domain.

When moving a 3×3-pixel window across a texture image, the 1-D DFT of x is computed as

$$X_k=\sum_{n=0}^{7}x_n\exp\!\left(-\frac{\pi i}{4}kn\right),\qquad(1)$$

where 0 ≤ k ≤ 7, X_k represents the k'th Fourier coefficient, and x_n represents the n'th value in x. From the computed DFT, histograms of the absolute values of the first five DFT coefficients, i.e., X_0 through X_4, were used for texture description in Ref. 3.
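As an illustrative sketch (not the authors' code), the magnitude part of the LFH feature set can be computed with NumPy, whose FFT uses the same negative-exponent convention as Eq. (1). The neighbor ordering, bin count, and histogram normalization below are assumptions for illustration only:

```python
import numpy as np

def lfh_magnitude_features(img, bins=16):
    """For every interior pixel, take the 1-D DFT of its 8 circularly
    ordered neighbours and histogram |X_0|..|X_4| over the whole image.
    Neighbour ordering (clockwise from top-left) and bin count are
    assumptions, not taken from the letter."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    # (row, col) offsets of the 8 neighbours x0..x7 around the centre
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    # stack the 8 shifted views: shape (h-2, w-2, 8), one sequence per pixel
    seq = np.stack([img[1 + dr:h - 1 + dr, 1 + dc:w - 1 + dc]
                    for dr, dc in offs], axis=-1)
    X = np.fft.fft(seq, axis=-1)        # X_k = sum_n x_n exp(-i*pi/4*k*n)
    feats = []
    for k in range(5):                  # |X_0| .. |X_4|
        mag = np.abs(X[..., k]).ravel()
        hist, _ = np.histogram(mag, bins=bins)
        feats.append(hist / hist.sum()) # normalised histogram
    return np.concatenate(feats)
```

Because only the magnitudes |X_k| enter, this part of the feature set is already insensitive to circular shifts of the neighbor sequence, i.e., to rotations by multiples of 45 deg.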

The phases of the DFT coefficients X1 through X3 were also proposed as features in Ref. 3 but only for those applications that do not deal with image rotation. The phase features were otherwise excluded because, unlike magnitudes, the phases of the DFT coefficients are sensitive to image rotation. Reference 2 also proposes only magnitudes of the DFT coefficients as texture features. We propose using the histograms of phases of X2 and X3 after appropriately compensating with the local image gradient.

Local Image Gradient

Traditionally, as a good compromise between cost and accuracy, 3×3-pixel edge-detection operators such as the Sobel operator (SO) and the Prewitt operator (PO) are often used to estimate the local image gradient at a given pixel. Below are the general 3×3 edge-detection operators, in which the value of b varies from 1, as in the PO, to 2, as in the SO:

$$S_X=\begin{bmatrix}-1&0&1\\-b&0&b\\-1&0&1\end{bmatrix},\qquad S_Y=\begin{bmatrix}1&b&1\\0&0&0\\-1&-b&-1\end{bmatrix},\qquad(2)$$
where S_X and S_Y are convolved with a texture image to obtain two gradient images, G_X and G_Y, respectively. The local image-gradient angle δ is calculated as

$$\delta=\tan^{-1}\!\left(\frac{G_Y}{G_X}\right).\qquad(3)$$

Convolving the edge-detection operators of Eq. (2) with the 3×3-pixel neighborhood of Fig. 1 gives G_Y and G_X, which are substituted in Eq. (3), giving

$$\tan\delta=\frac{-x_1-bx_2-x_3+x_5+bx_6+x_7}{bx_0+x_1-x_3-bx_4-x_5+x_7}.\qquad(4)$$

However, the local image-gradient angle can also be obtained from the phase of the first coefficient X_1 of the DFT of x. Substituting k = 1 in Eq. (1) gives

$$\tan\angle X_1=\frac{-x_1-\sqrt{2}\,x_2-x_3+x_5+\sqrt{2}\,x_6+x_7}{\sqrt{2}\,x_0+x_1-x_3-\sqrt{2}\,x_4-x_5+x_7}.\qquad(5)$$

Equations (4) and (5) are exactly the same if b = √2, and they are very similar otherwise, because the value √2 falls between the usual values of 1 and 2. For instance, the histograms of the local image-gradient angle obtained from ∠X_1 and from the SO (b = 2) for image D87 of the Brodatz album (BA) have a cross-correlation coefficient (XCC) of 0.97. In addition, if we consider the ∠X_1 image as a noisy version of the SO-derived image, the signal-to-noise ratio (SNR) is 69 dB, verifying that the former is a very close approximation of the latter. All other images of the album were tested, and similar values of the correlation coefficient and SNR were found between the two approximations of the image gradient. Hence, instead of computing the local image-gradient angle using any 2-D edge-detection operator, we use the value ∠X_1 to compensate the phases of the two other DFT coefficients, i.e., X_2 and X_3, against the effects of image rotation. It can now be said that δ = ∠X_1.
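The agreement between Eqs. (4) and (5) at b = √2 is pure algebra on the circular sequence x, so it can be checked numerically on any 8-pixel vector. The following sketch (an illustration, not the authors' code) uses NumPy's FFT for X_1:

```python
import numpy as np

# Check that Eq. (4) with b = sqrt(2) reproduces the phase of X_1 (Eq. 5).
rng = np.random.default_rng(1)
x = rng.random(8)                     # any 8-pixel neighbour sequence
X1 = np.fft.fft(x)[1]                 # first DFT coefficient, Eq. (1)

b = np.sqrt(2)
num = -x[1] - b * x[2] - x[3] + x[5] + b * x[6] + x[7]   # Eq. (4) numerator
den = b * x[0] + x[1] - x[3] - b * x[4] - x[5] + x[7]    # Eq. (4) denominator
# num/den = tan(angle(X1)); comparing via arctan2 avoids dividing by den
assert np.isclose(np.angle(X1), np.arctan2(num, den))
```

Using arctan2 rather than the tangent ratio keeps the comparison well defined even when the denominator of Eq. (4) is close to zero.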

Effects of Image Rotation on Fourier Coefficients

Consider that an image is rotated by an arbitrary angle, with the center of rotation exactly in the middle of the image. The angle of rotation at any other point P_xy on the image would be different from what it is at the center of rotation. Let the angle of rotation be ψ at point P_00 (see Fig. 1), corresponding to a shift in the string x by m places. This shift in x causes nothing but changes in the phases of the resulting DFT coefficients. Equation (6) states the shift property of the DFT:

$$F[(x_{n-m})]_k=F[(x_n)]_k\exp\!\left(-\frac{\pi i}{4}km\right),\qquad(6)$$
where F[(x_n)]_k represents the k'th coefficient of the DFT of (x_n), and F[(x_{n-m})]_k represents the k'th coefficient of the DFT of the string (x_{n-m}), i.e., the same string (x_n) shifted by m places. Equation (6) shows that any displacement in the time or space domain causes, in the Fourier domain, a phase shift given by

$$\Delta\theta_k=-\frac{\pi}{4}km,\qquad(7)$$

where Δθ_k represents the phase shift in X_k. The phase shift in X_1 is given by

$$\Delta\theta_1=-\frac{\pi}{4}m=\psi.\qquad(8)$$

Intuitively, the change in the local image-gradient angle δ is equal to the angle of rotation ψ at point P_00, which causes an equal change in ∠X_1. Comparing Eqs. (7) and (8) gives the phase shift in X_k as

$$\Delta\theta_k=k\,\Delta\theta_1.\qquad(9)$$
Therefore, the phases ∠X_2 and ∠X_3 are compensated against the rotation by subtracting the local image-gradient angle δ, as in Eq. (10). For k ∈ {2, 3},

$$\phi_k=\angle X_k-k\,\angle X_1,\qquad(10)$$

where ϕ_k represents the rotation-compensated phase of X_k, and ∠X_1 replaces δ.
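The self-compensation can be sketched numerically (an illustration, not the authors' code): a rotation by m × 45 deg circularly shifts the neighbor sequence x, and the compensated phases of Eq. (10) are unchanged by any such shift, while the raw phases are not:

```python
import numpy as np

def compensated_phases(x):
    """phi_k = angle(X_k) - k*angle(X_1) for k in {2, 3} (Eq. 10),
    returned as unit complex numbers to sidestep 2*pi wrap-around."""
    X = np.fft.fft(np.asarray(x, dtype=float))
    phi = np.array([np.angle(X[k]) - k * np.angle(X[1]) for k in (2, 3)])
    return np.exp(1j * phi)

rng = np.random.default_rng(2)
x = rng.random(8)
# Circular shifts of x model rotations by multiples of 45 degrees;
# the compensated phases must be identical for every shift m.
for m in range(8):
    assert np.allclose(compensated_phases(np.roll(x, m)),
                       compensated_phases(x))
```

The cancellation follows Eqs. (6)-(9): the shift adds -(π/4)km to ∠X_k and -(π/4)m to ∠X_1, so the combination ∠X_k - k∠X_1 is unaffected.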

Rotation Invariance of the Phase Features

All the images from the BA were rotated by 30, 45, 60, and 90 deg, and histograms of ϕ_2 and ϕ_3 were computed at each orientation. Table 1 shows the XCC as a similarity measure between the histograms corresponding to 0 deg and those corresponding to 30, 45, 60, and 90 deg, averaged over all the images from the BA. As an example, Figs. 2 and 3 show the histograms of ϕ_2 and ϕ_3, respectively, for the image D87 from the BA. All the histograms appear the same and do not exhibit any left or right shift, indicating that the two phases are highly rotation invariant. We also experimented with the features extracted from the circular neighborhood suggested in Ref. 2 and found that they perform worse than those extracted from the square neighborhood.
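The XCC used as the similarity measure here can be sketched as Pearson's correlation between the bin counts of the two histograms; this definition is an assumption for illustration, since the letter does not spell it out:

```python
import numpy as np

def xcc(h1, h2):
    """Cross-correlation coefficient between two histograms:
    Pearson's correlation of the (mean-centred) bin counts."""
    a = np.asarray(h1, dtype=float) - np.mean(h1)
    b = np.asarray(h2, dtype=float) - np.mean(h2)
    return float(np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b)))
```

An XCC close to 1, as reported in Table 1, means the two histograms agree up to an affine scaling of the bin counts.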

Fig. 2: Histograms of ϕ_2 for image D87 at four different orientations, θ = 0, 30, 45, and 60 deg.

Fig. 3: Histograms of ϕ_3 for image D87 at four different orientations, θ = 0, 30, 45, and 60 deg.

Table 1: XCC between the histograms of ϕ_2 and ϕ_3, respectively, corresponding to images oriented at 0 deg and to those at 30, 45, 60, and 90 deg, averaged over all the images from the Brodatz album.
Texture Recognition

Each of the 107 texture images from the BA was oriented at 0, 30, 45, 60, and 90 deg. Then, 16 subimages measuring 128×128 pixels were cropped from each of the 107×5 images, giving a total of 8560 images.4 Recognition was performed on this set using the LFH-based feature set without phase features, with phase features, and with texture features based on 30 Gabor filters.4,5 Reference 6 is a more recent work that proposes exactly the same filters but with a new distance metric that cannot be used for rotation-invariant recognition or retrieval. Table 2 presents the overall and orientation-wise texture recognition results, showing that the LFH-based features with phases perform best in terms of both accuracy and rotation variance (RV).4

Table 2: Recognition rates relative to orientation for the 8560 Brodatz images.

Reference 4 found that the LFH-based texture features exhibit less noise immunity than the features based on Gabor filters. However, our latest results show that the LFH-based features perform even better when extracted from images quantized to only 32 gray levels. Considering this, we expect the proposed features to be more noise resistant than they were without image quantization, as in Ref. 4.

The earlier feature set based on LFH does not use phases of the DFT coefficients as texture features because the phases are sensitive to image orientation. To introduce rotation invariance in the features, we showed that the process of extracting phase features can be guided by the local image gradient. This was achieved by simply subtracting the local image-gradient angle obtained from the 1-D DFT itself, so that the features become self-compensating. This computationally simple and cost-effective method proved useful in making the LFH-based texture features robust against image rotation. The new feature set including the phase features exhibits more rotation invariance and yields higher recognition rates than the one without phase features.

Varma M. and Zisserman A., "Texture classification: are filter banks necessary?," Proc. IEEE Conf. Comput. Vis. Pattern Recogn., 2, 691–698 (2003).
Arof H. and Deravi F., "Circular neighbourhood and 1-D DFT features for texture classification and segmentation," IEE Proc. Vision Image Signal Process. 145, 167–172 (1998).
Zhou F., Feng J.-F., and Shi Q.-Y., "Texture feature based on local Fourier transform," Proc. IEEE Conf. Image Process., 2, 610–613 (2001).
Ursani A. A., Kpalma K., and Ronsin J., "Texture features based on Fourier transform and Gabor filters: an empirical comparison," Int. Conf. Mach. Vis., 67–72 (2007).
Manjunath B. S. and Ma W. Y., "Texture features for browsing and retrieval of image data," IEEE Trans. Pattern Anal. Mach. Intell. 18(8), 837–842 (1996).
Wu P., Manjunath B. S., Newsam S., and Shin H. D., "A texture descriptor for browsing and similarity retrieval," Signal Process. Image Commun. 16, 33–43 (2000).
© 2008 SPIE and IS&T



