
Statistical texture characterization from discrete wavelet representations
G. Van de Wouwer, P. Scheunders, D. Van Dyck
Vision Lab, Department of Physics, University of Antwerp,
Groenenborgerlaan 171, 2020 Antwerpen, Belgium
EDICS: IP 1.6

Abstract

We conjecture that texture can be characterized by the statistics of the wavelet detail coefficients and therefore
introduce two feature sets: 1) the wavelet histogram signatures, which capture all first order statistics using a model-based
approach; 2) the wavelet cooccurrence signatures, which reflect the coefficients' second order statistics.
The introduced feature sets outperform the traditionally used energy signatures. Best performance is achieved by combining
histogram and cooccurrence signatures.

Keywords

texture analysis, feature extraction, multiscale representation, wavelets

Corresponding author: G. Van de Wouwer; email: wouwer@ruca.ua.ac.be


tel. 32-3-218.04.39; fax 32-3-218.03.18

Permission to publish abstract separately is granted.

I. Introduction

Texture analysis plays an important role in many image processing tasks, ranging from remote sensing to
medical imaging, robot vision and query by content in large image databases. Various methods for texture
feature extraction have been proposed during the last decades (e.g. [1]), but the texture analysis problem
remains difficult and subject to intensive research.
A major class of feature extractors relies on the assumption that texture can be defined by the local statistical
properties of pixel gray levels. From the image histogram, first order statistics can be derived and used as
texture features. It was soon argued that these did not suffice for adequate texture description and that second
order statistics were required, as efficiently reflected in features computed from the cooccurrence matrix [2].
The conjecture that second order statistics suffice for texture analysis was later rejected [3] and various other
texture analysis schemes were introduced (e.g. based on Markov random fields [4], fractal models [5] or, more
recently, on Wold decomposition [6]).
A weakness shared by all these texture analysis schemes is that the image is analyzed at a single scale,
a limitation which can be lifted by employing multiscale representations. Studies of the human visual system
support this approach: researchers have found that the visual cortex can be modeled as a set of
independent channels, each with a particular orientation and spatial frequency tuning [7].
Several multichannel texture analysis systems have been developed [8] [9]. In particular, Gabor filters were
employed to perform texture segmentation [10] [11] [12]. In the last decade, wavelet theory has emerged
as a mathematical framework which provides a more formal, solid and unified basis for multiscale
image analysis [13] [14]. Typically, the wavelet transform maps an image onto a low resolution image and a
series of detail images. The low resolution image is obtained by iteratively blurring the image; the detail
images contain the information lost during this operation. The energy or mean deviation of the detail images
are the most commonly used features for texture classification and segmentation problems [15] [16] [17] [18]
[19].
In this paper, we combine the statistical and multiscale views on texture. We conjecture that texture
can be completely characterized by the statistical properties of its multiscale representation. To optimally
describe these statistics, we introduce two feature sets: the wavelet histogram and cooccurrence signatures.
First order statistical information is derived from the detail image histogram. As observed by Mallat [13], the
detail histograms of natural textured images can be modeled by a family of exponential functions. Introducing
the parameters of this model as texture features completely describes the wavelet coefficients' first order
statistics. Further improvement in texture description is obtained from the coefficients' second order statistics,
which can be described using the detail image cooccurrence matrices. The most complete description is
obtained by combining both first and second order statistical information.
II. The Wavelet Representation

In practice the 2-D discrete wavelet transform is computed by applying a separable filter bank to the image
[13]:

$$L_n(b_i, b_j) = [H_x * [H_y * L_{n-1}]_{\downarrow 2,1}]_{\downarrow 1,2}\,(b_i, b_j) \qquad (1)$$

$$D_n^1(b_i, b_j) = [H_x * [G_y * L_{n-1}]_{\downarrow 2,1}]_{\downarrow 1,2}\,(b_i, b_j) \qquad (2)$$

$$D_n^2(b_i, b_j) = [G_x * [H_y * L_{n-1}]_{\downarrow 2,1}]_{\downarrow 1,2}\,(b_i, b_j) \qquad (3)$$

$$D_n^3(b_i, b_j) = [G_x * [G_y * L_{n-1}]_{\downarrow 2,1}]_{\downarrow 1,2}\,(b_i, b_j) \qquad (4)$$

($*$ denotes the convolution operator, $\downarrow 2,1$ ($\downarrow 1,2$) subsampling along the rows (columns), and $L_0 = I(\vec{x})$ is the
original image). $H$ and $G$ are a low-pass and a band-pass filter, respectively. $L_n$ is obtained by low-pass filtering
and is therefore referred to as the low resolution image at scale $n$. The detail images $D_n^i$ are obtained by
band-pass filtering in a specific direction and contain directional detail information at scale $n$. The original
image $I$ is thus represented by a set of subimages at several scales, $\{L_d, D_n^i\}_{i=1,2,3;\, n=1..d}$, which is a multiscale
representation of depth $d$ of the image $I$.
In this work, a wavelet frame representation has been used, which is an overcomplete representation computed
by omitting the subsampling of the detail images at each step. This approach was advocated in [18] and leads
to a more stable representation.
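The following sketch illustrates how such an overcomplete decomposition can be obtained in practice. It is a minimal example, assuming the PyWavelets library, whose stationary wavelet transform omits the subsampling step exactly as described above; the wavelet name 'bior2.2' is our assumption for the biorthogonal spline wavelet of order 2 used later in Section V.

```python
import numpy as np
import pywt


def wavelet_frame_details(image, wavelet="bior2.2", depth=4):
    """Detail images D_n^i (i = 1..3 orientations, n = 1..depth scales)
    of an overcomplete (undecimated) wavelet representation."""
    # swt2 requires each image dimension to be a multiple of 2**depth;
    # this holds for the 64x64 regions used in the experiments below.
    coeffs = pywt.swt2(image.astype(float), wavelet, level=depth)
    details = {}
    for n, (_, (cH, cV, cD)) in enumerate(coeffs, start=1):
        # NOTE: the level ordering returned by swt2 differs between
        # PyWavelets versions; since the signatures below are computed
        # per detail image, only the labeling depends on it.
        details[(n, 1)] = cH  # horizontal detail image at scale n
        details[(n, 2)] = cV  # vertical detail image
        details[(n, 3)] = cD  # diagonal detail image
    return details
```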

The histogram of the wavelet detail coefficients will be denoted by $h_n^i(u)$; thus $h_n^i(u)\,du$ is the probability that
a wavelet coefficient $D_n^i(\vec{b})$ has a value between $u$ and $u + du$. We shall refer to $h_n^i(u)$ as the wavelet detail
histogram (the index $ni$ will mostly be dropped for simplicity).
III. Wavelet Signatures

A. Energy signatures
The (normalized) energy of a subimage $D_n^i$ containing $N$ coefficients is defined as

$$E_n^i = \frac{1}{N} \sum_{j,k} \left( D_n^i(b_j, b_k) \right)^2 \qquad (5)$$
The wavelet energy signatures $\{E_n^i\}_{n=1..d;\, i=1,2,3}$ reflect the distribution of energy along the frequency axis over
scale and orientation and have proven to be very powerful for texture characterization. Since most relevant
texture information has been removed by the iterative low-pass filtering, the energy of the low resolution image
$L_d$ is generally not considered a texture feature.
An alternative measure which is sometimes used as a texture feature is the mean deviation:

$$MD_n^i = \frac{1}{N} \sum_{j,k} \left| D_n^i(b_j, b_k) \right| \qquad (6)$$

Let us call these the MD signatures. Note that since both (5) and (6) are measures of the dispersion of the
wavelet coefficients, they are strongly correlated.
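As a small illustration, both signatures can be computed directly from the detail images of the decomposition sketched in Section II; this is a straightforward reading of (5) and (6), not the authors' original C routines.

```python
import numpy as np


def energy_md_signatures(details):
    """Energy (5) and mean deviation (6) for each detail image."""
    E, MD = {}, {}
    for key, D in details.items():
        N = D.size
        E[key] = np.sum(D ** 2) / N      # Eq. (5): normalized energy
        MD[key] = np.sum(np.abs(D)) / N  # Eq. (6): mean deviation
    return E, MD
```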
B. Histogram signatures
Since $G$ in (2)-(4) is a high-pass filter, the mean of the wavelet detail coefficients equals zero. Consequently,
the energy is exactly their variance. Employing energy as a texture feature is thus equivalent to characterizing
the detail histogram by a Gaussian. Employing the histogram as a basis for feature extraction makes sense
since this yields translation invariant features.
From the preceding discussion, it is clear that texture characterization can be refined by improving the model
for the histogram, resulting in a more adequate description of the wavelet coefficients' first order statistics.
This could, e.g., be done by computing higher order moments from it. However, if an empirical parametric
model of the histogram were available, then all first order statistical information contained in the detail
histogram could be captured in the parameters of this model.
Mallat [13] has found experimentally that the detail histograms of natural textured images can be modeled
by a family of exponentials:

$$h(u) = K e^{-(|u|/\alpha)^\beta} \qquad (7)$$

$\beta$ is inversely proportional to the decreasing rate of the peak ($\beta = 2$ gives a Gaussian), while $\alpha$ models the
width of the histogram peak (variance). $K$ is a normalization constant ensuring that $\int h(u)\,du = 1$. This
model has already been used for image coding [20] [21] but, to our knowledge, has not yet been applied to
texture characterization.
The model parameters $\alpha$, $\beta$ and $K$ can be computed from

$$m_1 = \int |u|\, h(u)\, du \ ; \quad m_2 = \int |u|^2\, h(u)\, du \qquad (8)$$

Inserting (7) and using the normalization condition one obtains:

$$K = \frac{\beta}{2\alpha\, \Gamma(1/\beta)} \quad \text{where} \quad \Gamma(x) = \int_0^\infty e^{-t}\, t^{x-1}\, dt \qquad (9)$$

$$\alpha = m_1\, \frac{\Gamma(1/\beta)}{\Gamma(2/\beta)} \qquad (10)$$

$$\beta = F^{-1}\!\left(\frac{m_1^2}{m_2}\right) \quad \text{where} \quad F(x) = \frac{\Gamma(2/x)^2}{\Gamma(3/x)\, \Gamma(1/x)} \qquad (11)$$

Note that the energy (5) and mean deviation (6) are exactly the estimates of $m_2$ and $m_1$ required to compute
$\alpha$ and $\beta$. The (highly nonlinear) transformation (10)-(11) maps the correlated features $E$ and $MD$ onto the
wavelet histogram signatures $\alpha$ and $\beta$, which are easily interpreted as specific, independent characteristics of
the detail histogram. Moreover, if the histogram model is valid they contain all first order information present
in the detail histogram. The validity of this model will be experimentally verified in Section IV.
The formulae for the computation of the wavelet signatures as presented here can easily be employed for
other texture analysis tasks. For instance, for segmentation the wavelet signatures are computed over a
(small) local window centered on each pixel of the image, resulting in one feature vector per pixel. The image
is then subdivided into a number of regions by assigning each pixel to a particular region based on its feature
vector.
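A possible implementation of the transformation (8)-(11) is sketched below: $m_1$ and $m_2$ are estimated as the mean deviation and energy of a detail image, $\beta$ is obtained by numerically inverting $F$ with a bracketing root finder, and $\alpha$ follows from (10). SciPy is assumed; the search bracket [0.05, 100] is our choice, wide enough to cover the $\beta$ values reported in Fig. 2.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma


def _F(x):
    # F(x) = Gamma(2/x)^2 / (Gamma(3/x) * Gamma(1/x)), Eq. (11)
    return gamma(2.0 / x) ** 2 / (gamma(3.0 / x) * gamma(1.0 / x))


def histogram_signatures(D):
    """Histogram signatures (alpha, beta) of one detail image."""
    m1 = np.mean(np.abs(D))  # mean deviation, estimate of m1
    m2 = np.mean(D ** 2)     # energy, estimate of m2
    # Eq. (11): beta = F^{-1}(m1^2 / m2); F is monotone increasing,
    # so a bracketing root finder suffices (bracket is an assumption).
    beta = brentq(lambda x: _F(x) - m1 ** 2 / m2, 0.05, 100.0)
    # Eq. (10): alpha from m1 and beta
    alpha = m1 * gamma(1.0 / beta) / gamma(2.0 / beta)
    return alpha, beta
```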
C. Cooccurrence signatures
When features based on first order statistics do not suffice, second order statistics can improve texture
discrimination. Since all first order statistical information of the detail images is captured in the histogram
signatures, the obvious extension is the computation of cooccurrence matrix features from the detail images
to describe their second order statistics.
Since each wavelet coefficient $D_n^i(\vec{b}) \in \mathbb{R}$ and the cooccurrence matrix is defined for an image with a countable
number of gray levels, the detail histogram is discretized by choosing $M$ values $\{u_j, \Delta u_j\}_{j=1,\ldots,M}$. For each $\vec{b}$
we set

$$\tilde{D}_n^i(\vec{b}) = j \quad \text{if} \quad D_n^i(\vec{b}) \in [u_j - \Delta u_j/2,\; u_j + \Delta u_j/2] \qquad (12)$$

The element $(j,k)$ of the cooccurrence matrix $C_n^i$ is defined as the joint probability that a wavelet coefficient
$\tilde{D}_n^i = j$ cooccurs with a coefficient $\tilde{D}_n^i = k$ at a distance $\delta$ in direction $\theta$ [2]. Usually small values of $\delta$
are used since most relevant correlation between pixels exists over small distances. Formulas for 8 common
cooccurrence features are listed in Table I; these features extracted from the detail images will be referred to
as the wavelet cooccurrence signatures. Similar features were proposed by Thyagarajan et al. [22]. However,
they used a subsampled wavelet transform, leading to very small detail images at the lower scales, for which the
cooccurrence matrix features are not robust and may be misleading. Also, they compute the cooccurrence
features of the low-resolution image at each scale, which may lead to redundancy, since the low-resolution
images at two different scales contain overlapping (low-frequency) information.
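A sketch of the computation of one such cooccurrence matrix is given below: the detail image is quantized to $M$ levels as in (12) (equal-width bins over the coefficient range are our simplification), pairs at distance $\delta = 1$ are counted for the four directions $\theta = 0, 45, 90, 135$ degrees, and the counts are averaged and normalized to joint probabilities, as done later in Section V.

```python
import numpy as np


def wavelet_cooccurrence(D, M=16):
    """Cooccurrence matrix of a quantized detail image, averaged over
    theta = 0, 45, 90, 135 degrees at delta = 1."""
    # Eq. (12): quantize coefficients to integer levels 0..M-1
    edges = np.linspace(D.min(), D.max(), M + 1)
    Q = np.clip(np.digitize(D, edges) - 1, 0, M - 1)
    C = np.zeros((M, M))
    rows, cols = Q.shape
    for dy, dx in [(0, 1), (-1, 1), (-1, 0), (-1, -1)]:  # four directions
        y0, y1 = max(0, -dy), min(rows, rows - dy)
        x0, x1 = max(0, -dx), min(cols, cols - dx)
        a = Q[y0:y1, x0:x1]                      # reference pixels
        b = Q[y0 + dy:y1 + dy, x0 + dx:x1 + dx]  # neighbors at (dy, dx)
        np.add.at(C, (a.ravel(), b.ravel()), 1.0)
    return C / C.sum()  # joint probabilities C(j, k)
```

Any entry of Table I can then be evaluated from this matrix; for instance, with `i, j = np.indices(C.shape)` the inertia is `np.sum((i - j) ** 2 * C)`.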
IV. Validity of the wavelet detail histogram model

The conjecture that all first order statistical information is contained in the wavelet histogram signatures
$\alpha$ and $\beta$ is based on the assumption that each detail histogram can indeed be modeled by (7). This was verified
for each detail image from each texture image in a database consisting of 30 real world 512x512 images from
different natural scenes [23] (Fig. 1). The detail histograms and the model fits were first inspected visually;
typical results are shown in Fig. 2. To examine the goodness of fit quantitatively, we remark that the proposed
model for $h(u)$ 1) is unimodal with one maximum at $u = 0$, and 2) is symmetric around $u = 0$.
The first property was examined by smoothing the observed histograms using a moving average and determining
the number and position of the local maxima (verification of the unimodality). It was found that
98.3% of all investigated histograms were indeed unimodal (a counterexample is presented in Fig. 2c). The
position of the maxima was found to lie between -1 and 1.
The symmetry property was examined by evaluating the following asymmetry parameter:

$$asm = \frac{1}{M} \sum_{i=1}^{M} \left( h(u_{max} - u_i) - h(u_{max} + u_i) \right)^2 \qquad (13)$$
A statistical analysis was performed on the $asm$ values, which showed the presence of outliers. The median
$M(asm)$ and mean deviation $MD(asm)$ of the $asm$ values were computed. We found that 98% of the values
were smaller than $M(asm) + 3 \cdot MD(asm) \approx 2 \times 10^{-4}$. Fig. 2b) shows an $asm$ value higher than this, but
despite this asymmetry, the estimated values for $\alpha$ and $\beta$ correctly describe the form of the histogram.
We may conclude that the symmetry property is fulfilled satisfactorily. Some poor model fits occurred, mainly
due to deviations from unimodality of the observed histograms. Poor fits always occurred at the lower scales
and did not much influence classification performance, since most relevant texture information is found in
the high and midband frequencies. Although, undoubtedly, counterexamples to its validity could be found
or constructed, the model was found to adequately fit image data from a wide range of naturally textured
images.
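For reference, the asymmetry parameter (13) can be evaluated as follows. This is a minimal sketch that takes an already binned (and smoothed) histogram and compares the two sides of its peak; taking $M$ as the number of bins available on both sides is our simplification.

```python
import numpy as np


def asymmetry(h):
    """Asymmetry parameter asm of Eq. (13) for a binned histogram h."""
    peak = int(np.argmax(h))          # position of u_max
    M = min(peak, len(h) - 1 - peak)  # bins available on each side
    i = np.arange(1, M + 1)
    return np.mean((h[peak - i] - h[peak + i]) ** 2)
```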
V. Classification experiments

A database of 1920 image regions of 30 texture classes was constructed by dividing each 512x512 image into
64 non-overlapping 64x64 regions. Although classification performances of about 90% have been reported on
the entire Brodatz database [24] [17], we have chosen to use fewer texture classes. The reason for choosing
this particular collection of images is that the classification performance using the energy signatures alone is
below 90%, thus creating a sufficiently hard classification problem to clearly observe the change in performance
using the novel features. Furthermore, we have employed sufficient samples per texture class to be able to
compare performances of individual texture classes.
Each image region was transformed to an overcomplete wavelet representation of depth 4 using a biorthogonal
spline wavelet of order 2 [25]. Four different feature sets were generated containing:
1. 12 wavelet energy signatures. (0.10 sec. / vector)
2. 24 wavelet histogram signatures. (0.11 sec. / vector)
3. 96 wavelet cooccurrence signatures. To keep the feature set size manageable, only one $\delta$ was chosen,
namely $\delta = 1$, since most relevant pixel correlations exist over small distances. Cooccurrence matrices for
$\theta = 0, 45, 90, 135$ degrees were computed and averaged. (0.56 sec. / vector)
4. all features from 2 and 3.
The number between brackets is the processor time required to compute one feature vector by a C program
on a HP 712/400 unix workstation. Although our routines were not particularly optimized for speed, these timings show
that the computational complexity of the energy and histogram signatures is comparable. The computational
complexity of the cooccurrence signatures is much higher, due to the necessity of quantizing each detail image
and computing a cooccurrence matrix from it.
A. Methodology
To evaluate the features' discriminative power, a k-nearest neighbor (knn) classifier is employed. Classification
of a feature vector $\vec{x}$ is performed by searching the $k$ closest training vectors according to some metric $d(\vec{x}, \vec{y})$.
The vector $\vec{x}$ is assigned to the class to which the majority of these $k$ nearest neighbors belong.
Classification performance estimation is carried out by a leave-one-out method. This method sequentially
picks each data sample and classifies it (by the knn rule) using the remaining samples. Each available sample
is thus employed once as a test sample. The error rate is estimated by counting the total number of incorrectly
classified samples. This method has the benefit of using a large training set and, at the same time, an independent
test set.
If the distance measure $d$ is chosen to be Euclidean, then some features (the ones with the largest variances)
tend to dominate this measure. To avoid this, the median and mean deviation of each feature over all classes
were computed, and the features were normalized by subtracting the median and dividing by the mean
deviation.
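A compact sketch of this evaluation protocol, assuming scikit-learn, is given below; the value of $k$ is not fixed at this point in the text, so k = 3 is an arbitrary placeholder.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def loo_knn_error(X, y, k=3):
    """Leave-one-out knn error rate on robustly normalized features.
    X: (samples, features) array, y: class labels."""
    med = np.median(X, axis=0)
    md = np.mean(np.abs(X - med), axis=0)  # mean deviation per feature
    Xn = (X - med) / md                    # median / mean-deviation scaling
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                          Xn, y, cv=LeaveOneOut())
    return 1.0 - acc.mean()                # estimated error rate
```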
Well known in the pattern recognition literature is the curse of dimensionality phenomenon, which dictates
that classification performance does not necessarily increase with an increasing number of features (given a fixed
amount of data samples). Given a feature extraction scheme and a finite number of training instances, there
thus exists an optimal number of features for a particular task, and it is crucial to adopt a
feature selection (or extraction) scheme to find a (sub-)optimal set of features. In this work we adopt the
Floating Forward Feature Selection scheme (FFFS) [26], which has recently been found to outperform other
selection schemes [27]. This algorithm is initialized by taking the best feature ("best" is defined here as giving
the best recognition performance). The selection then continues by iteratively adding (or deleting) a feature
in each step to obtain a subset of all available features which gives the highest classification performance.
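A simplified sketch of this floating search is shown below; $J$ is any subset-scoring function (e.g. one minus the leave-one-out error sketched above), and the backward "floating" step is reduced to its essence: drop a previously selected feature whenever this strictly improves the score. See [26] for the exact algorithm.

```python
def floating_forward_selection(J, n_features, target_size):
    """Greedy forward selection with a simplified floating backward step."""
    selected = []
    while len(selected) < target_size:
        # forward step: add the single best remaining feature
        remaining = [f for f in range(n_features) if f not in selected]
        best = max(remaining, key=lambda f: J(selected + [f]))
        selected.append(best)
        # floating step: conditionally remove earlier features
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for f in selected[:-1]:  # never drop the feature just added
                trial = [g for g in selected if g != f]
                if J(trial) > J(selected):
                    selected = trial
                    improved = True
                    break
    return selected
```

For example, `J = lambda idx: 1.0 - loo_knn_error(X[:, idx], y)` reproduces the wrapper setup used in this section.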
Note that we have chosen the knn classifier for two reasons. First, it is a flexible and efficient scheme
which often outperforms other classifiers (see e.g. [28], in which the performances of several classifiers on
several datasets are compared). Second, knn requires no exhaustive training and is therefore easy to use in
conjunction with the FFFS, which requires error estimation at each step.

B. Results
A quick comparison can be made by observing Fig. 3, which displays classification performance versus feature
set dimensionality. Maximum recognition performance was obtained with 10 features for the energy signatures.
For the remaining feature sets, recognition performance saturated at a dimensionality of about 15, at which point
error rates are reported in Table II. A detailed investigation shows that the error rates for Fabric0, Fabric7,
Fabric17, Fabric18, Sand0, Tile7 and Wood2 were 0% for all feature sets; the error rates for Brick1, Fabric9,
Food5, Stone4 and Water6 were about 0-3% and remained constant for all feature sets. In Table II, the
remaining texture classes, for which error rates differed between feature sets, are reported.
Using energy signatures, a mean error rate of 17.7% is observed. Histogram signatures, however, offer a clear
advantage over the energies: the mean error rate drops by 10%, mainly due to improvements for each texture
class. For some classes, an improvement of 20-30% was observed. Only Brick5 has a slightly higher error rate,
and for four textures (Fabric4, Fabric11, Tile1 and Wood1) the error rate remains unchanged. The extra (first order)
information contained in the histogram signatures thus clearly improves texture characterization.
Comparing first order to second order statistics based features (i.e. cols. 2 and 3), a general decrease in error
is observed (from 7.6% to 4.0%). The following comments can be made:
1. The number of cooccurrence signatures was four times the number of histogram signatures and eight times the
number of energy signatures, resulting in the need for more computer time during feature extraction and
selection.
2. Whereas the histogram signatures describe all first order statistics of a detail histogram using two features, the
second order statistics description is not complete, since a choice for the parameters $\delta$ and $\theta$ of the cooccurrence
matrix has to be made.
3. For some classes the error rate rises (Bark8, Fabric11, Food2 and Grass1) while it decreases for others.
This observation motivates the conclusion that some classes are best distinguished using first order statistics,
while for others the discriminatory power of the second order statistics is larger. Which features to use is
thus data-dependent, but it is not yet clear which ones to choose for which images. A possible approach to
determine the best features for a particular problem is to compute both first and second order based features
and employ the feature selection method.
The last comment motivates the use of a combination of histogram and cooccurrence signatures, which indeed
proved to yield the best classification performance (col. 4, mean error rate 2.5%).
Next, some other classifiers were employed to classify the images. We have used: 1) a Gaussian Bayes Classifier
(GBC) [29] and 2) a Learning Vector Quantization (LVQ) neural network [30]. Half of the available samples
were used for training and the others for testing. For the LVQ network, a codebook of 120 prototypes was
used and a training time of 8000. The error rates on the test sets are listed below:

          1     2     3     4
  knn    18%   11%    9%    7%
  LVQ    25%   16%   14%   10%
  GBC    19%   12%   10%    6%
One first remarks that the classification scores are overall lower than for the knn/leave-one-out scheme. This is
due to the hold-out method, which is known to have a positive bias for error estimation [31]. The general trend
however remains the same: 1) histogram signatures outperform energy signatures, 2) wavelet cooccurrence
signatures perform slightly better than the histogram signatures, and 3) best results are obtained using a
combination of histogram and cooccurrence signatures. Comparing classifiers, one notices that knn and GBC
have similar performances, but the LVQ network is generally worse.
VI. Conclusion

Two novel feature sets were introduced to characterize texture by the statistical properties of wavelet detail
coefficients. A model was used to describe the first order statistics of the wavelet detail histograms. We found
that the model fits the detail histograms of a wide range of natural textured images. The parameters of this
model then serve as texture features: the wavelet histogram signatures, which were shown to outperform the
traditionally used energy signatures in a texture classification experiment.
Since no further information could be extracted from the detail histograms, further improvement in texture
characterization was obtained from the coefficients' second order statistics. To this end, features were computed
from the cooccurrence matrices of the detail images. The cooccurrence signatures yielded a lower classification
error rate in comparison with the histogram signatures. However, we found that some textures are best
characterized using first order statistics, while for others second order statistics were better. The (average)
best result was obtained by combining both feature sets. The introduced feature sets are very promising for
many image processing tasks such as texture recognition, segmentation and the indexing of image databases.
References

[1] T.R. Reed and J.M.H. du Buf, "A review of recent texture segmentation and feature extraction techniques," CVGIP: Image Understanding, vol. 57, no. 3, pp. 359-372, 1993.
[2] R.M. Haralick, K. Shanmugam, and I. Dinstein, "Textural features for image classification," IEEE Trans. Systems Man Cybern., vol. 3, no. 3, pp. 610-621, 1973.
[3] A. Gagalowicz, "A new method for texture field synthesis: Some applications to the study of human vision," IEEE Trans. Patt. Anal. Mach. Intell., vol. 3, pp. 520-533, 1982.
[4] G.C. Cross and A.K. Jain, "Markov random field texture models," IEEE Trans. Patt. Anal. Mach. Intell., vol. 5, pp. 25-39, 1983.
[5] A.P. Pentland, "Fractal-based description of natural scenes," IEEE Trans. Patt. Anal. Mach. Intell., vol. 6, no. 6, pp. 661-674, 1984.
[6] F. Liu and R.W. Picard, "Periodicity, directionality, and randomness: Wold features for image modeling and retrieval," IEEE Trans. Patt. Anal. Mach. Intell., vol. 18, no. 7, pp. 722-733, 1996.
[7] J. Beck, A. Sutter, and R. Ivry, "Spatial frequency channels and perceptual grouping in texture segregation," Comp. Vis. Graph. Im. Process., vol. 37, pp. 299-325, 1987.
[8] M. Unser and M. Eden, "Multiresolution feature extraction and selection for texture segmentation," IEEE Trans. Patt. Anal. Mach. Intell., vol. 11, pp. 717-728, 1989.
[9] A.K. Jain, "Learning texture discrimination masks," IEEE Trans. Patt. Anal. Mach. Intell., vol. 18, no. 2, pp. 195-205, 1996.
[10] I. Fogel and D. Sagi, "Gabor filters as texture discriminators," Biol. Cybern., vol. 61, pp. 103-113, 1989.
[11] B.S. Manjunath and W.Y. Ma, "Texture features for browsing and retrieval of image data," IEEE Trans. Patt. Anal. Mach. Intell., vol. 18, no. 8, pp. 837-842, 1996.
[12] C.C. Chen and D.C. Chen, "Multiresolutional Gabor filter in texture analysis," Patt. Rec. Lett., vol. 17, no. 10, pp. 1069-1076, 1996.
[13] S. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation," IEEE Trans. Patt. Anal. Mach. Intell., vol. 11, no. 7, pp. 674-693, 1989.
[14] I. Daubechies, Ten Lectures on Wavelets, Capital City Press, Montpelier, Vermont, 1992.
[15] A. Kundu and J.-L. Chen, "Texture classification using QMF bank-based subband decomposition," CVGIP: Graphical Models and Image Processing, vol. 54, no. 5, pp. 369-384, 1992.
[16] T. Chang and C.-C.J. Kuo, "Texture analysis and classification with tree-structured wavelet transform," IEEE Trans. Im. Process., vol. 2, no. 4, pp. 429-441, 1993.
[17] J.R. Smith and S.-F. Chang, "Transform features for texture classification and discrimination in large image databases," in Proc. Int. Conf. Im. Proc. IEEE, 1994, vol. III, pp. 407-411.
[18] A. Laine and J. Fan, "Frame representations for texture segmentation," IEEE Trans. Im. Process., vol. 5, no. 5, pp. 771-780, 1996.
[19] G. Van de Wouwer, P. Scheunders, S. Livens, and D. Van Dyck, "Wavelet correlation signatures for color texture characterization," Patt. Rec., 1998, to appear.
[20] R.W. Buccigrossi and E.P. Simoncelli, "Image compression via joint statistical characterization in the wavelet domain," Tech. Rep. 414, GRASP Lab., Univ. Pennsylvania, May 1997, download: ftp.cis.upenn.edu/pub/eero/buccigrossi97.ps.gz.
[21] Y. Wang, H. Li, J. Xuan, S.-C.B. Lo, and S.K. Mun, "Modeling of wavelet coefficients in medical image compression," in Proc. Int. Conf. Im. Proc. IEEE, 1997, vol. I, pp. 644-647.
[22] K.S. Thyagarajan, T. Nguyen, and C.E. Persons, "A maximum likelihood approach to texture classification using wavelet transform," in Proc. Int. Conf. Im. Proc. IEEE, 1994, vol. II, pp. 640-644.
[23] VisTex, "Color image database," http://www-white.media.mit.edu/vismod/imagery/VisionTexture, 1995, MIT Media Lab.
[24] R.W. Picard, T. Kabir, and F. Liu, "Real-time recognition with the entire Brodatz texture database," in Proc. IEEE Conf. Comp. Vis. Patt. Rec., 1993, pp. 638-639.
[25] M. Unser, A. Aldroubi, and M. Eden, "A family of polynomial spline wavelet transforms," Signal Processing, vol. 30, pp. 141-162, 1993.
[26] P. Pudil, J. Novovicova, and J. Kittler, "Floating search methods in feature selection," Patt. Rec. Lett., vol. 15, pp. 1119-1125, 1994.
[27] A. Jain and D. Zongker, "Feature selection: evaluation, application, and small sample performance," IEEE Trans. Patt. Anal. Mach. Intell., vol. 19, no. 2, pp. 153-158, 1997.
[28] D. Michie, D.J. Spiegelhalter, and C.C. Taylor, Eds., Machine Learning, Neural and Statistical Classification, Ellis Horwood, Hertfordshire, 1994.
[29] P.A. Devijver and J. Kittler, Pattern Recognition: A Statistical Approach, Prentice-Hall, Englewood Cliffs, New Jersey, 1982.
[30] T. Kohonen, J. Kangas, J. Laaksonen, and K. Torkkola, "LVQ_PAK and SOM_PAK v2.1," 1992, available via anonymous ftp from cochlea.hut.fi.
[31] K. Fukunaga, Introduction to Statistical Pattern Recognition, Academic Press, London, 2nd edition, 1990.

List of Tables

I  Computation of 8 common features from a cooccurrence matrix C(i,j).
II Error rates (%) from a classification experiment of 30 textures, using the selected best features. 10 features were used in column 1 and 15 in the others. Only those classes for which error rates vary between experiments are reported. col.1: energy, col.2: histogram, col.3: cooccurrence, col.4: histogram and cooccurrence signatures.

List of Figures

1  Selection of images from the VisTex-database: from top to bottom and left to right: Bark0, Bark4, Bark6, Bark8, Bark9, Brick1, Brick4, Brick5, Fabric0, Fabric4, Fabric7, Fabric9, Fabric11, Fabric13, Fabric16, Fabric17, Fabric18, Food0, Food2, Food5, Food8, Grass1, Sand0, Stone4, Tile1, Tile3, Tile7, Water6, Wood1, Wood2.
2  Examples of the goodness of fit of the model. The dotted line is the observed histogram, the solid line is the fitted model. a) $\alpha = 20.8$, $\beta = 2.00$, $asm = 1.42 \times 10^{-6}$; b) $\alpha = 1.27$, $\beta = 0.566$, $asm = 2.89 \times 10^{-4}$; c) $\alpha = 44.3$, $\beta = 50.0$, $asm = 6.86 \times 10^{-7}$. c) represents less than 2% of the investigated cases.
3  Classification performance vs. feature set dimensionality: 1) energy signatures, 2) histogram signatures, 3) cooccurrence signatures, 4) histogram and cooccurrence signatures.

TABLE I

Computation of 8 common features from a cooccurrence matrix C(i,j).

inertia:                  $C_1 = \sum_{i,j=0}^{n} (i-j)^2\, C(i,j)$
total energy:             $C_2 = \sum_{i,j=0}^{n} C(i,j)^2$
entropy:                  $C_3 = -\sum_{i,j=0}^{n} C(i,j) \log C(i,j)$
local homogeneity:        $C_4 = \sum_{i,j=0}^{n} \frac{1}{1+(i-j)^2}\, C(i,j)$
maximum probability:      $C_5 = \max_{i,j} C(i,j)$
cluster shade:            $C_6 = \sum_{i,j=0}^{n} (i - M_x + j - M_y)^3\, C(i,j)$
cluster prominence:       $C_7 = \sum_{i,j=0}^{n} (i - M_x + j - M_y)^4\, C(i,j)$
information measure
of correlation:           $C_8 = \frac{C_3 - H_{xy}}{\max\{H_x, H_y\}}$

where $M_x = \sum_{i,j=0}^{n} i\, C(i,j)$ and $M_y = \sum_{i,j=0}^{n} j\, C(i,j)$,
$S_x(i) = \sum_{j=0}^{n} C(i,j)$ and $S_y(j) = \sum_{i=0}^{n} C(i,j)$,
$H_{xy} = -\sum_{i,j=0}^{n} C(i,j) \log(S_x(i)\, S_y(j))$,
$H_x = -\sum_{i=0}^{n} S_x(i) \log S_x(i)$ and $H_y = -\sum_{j=0}^{n} S_y(j) \log S_y(j)$.


TABLE II

Error rates (%) from a classification experiment of 30 textures, using the selected best features.
10 features were used in column 1 and 15 in the others. Only those classes for which error rates
vary between experiments are reported. col.1: energy, col.2: histogram, col.3: cooccurrence,
col.4: histogram and cooccurrence signatures.

              1      2      3      4
  Bark0     21.9   12.5    0.0    0.0
  Bark4     15.6    7.8    3.1    1.6
  Bark6     20.3   12.5    1.6    3.1
  Bark8     20.3    1.6   14.1    3.1
  Bark9     40.6   14.1   12.5   12.5
  Brick4    14.1    4.7    3.1    3.1
  Brick5    10.9   12.5    1.6    0.0
  Fabric4    4.7    4.7    3.1    3.1
  Fabric11   3.1    3.1    6.2    1.6
  Fabric13   6.2    0.0    0.0    0.0
  Fabric16  17.2    1.6    0.0    0.0
  Food0     29.7    6.2    3.1    3.1
  Food2     42.2    4.7   10.9    6.2
  Food8      9.4    3.1    0.0    0.0
  Grass1     4.7    1.6    6.2    1.6
  Tile1      6.2    6.2    0.0    0.0
  Tile3     26.6   15.6    3.1    1.6
  Wood1     25.0   25.0    3.1    4.7
  mean      17.7    7.6    4.0    2.5


Fig. 1. Selection of images from the VisTex-database: from top to bottom and left to right: Bark0, Bark4, Bark6,
Bark8, Bark9, Brick1, Brick4, Brick5, Fabric0, Fabric4, Fabric7, Fabric9, Fabric11, Fabric13, Fabric16, Fabric17,
Fabric18, Food0, Food2, Food5, Food8, Grass1, Sand0, Stone4, Tile1, Tile3, Tile7, Water6, Wood1, Wood2.


Fig. 2. Examples of the goodness of fit of the model. The dotted line is the observed histogram, the solid line is the fitted
model. a) $\alpha = 20.8$, $\beta = 2.00$, $asm = 1.42 \times 10^{-6}$; b) $\alpha = 1.27$, $\beta = 0.566$, $asm = 2.89 \times 10^{-4}$; c) $\alpha = 44.3$, $\beta = 50.0$,
$asm = 6.86 \times 10^{-7}$. c) represents less than 2% of the investigated cases.


Fig. 3. Classification performance vs. feature set dimensionality: 1) energy signatures, 2) histogram signatures, 3)
cooccurrence signatures, 4) histogram and cooccurrence signatures.
