p(x_i) = \sum_{k=1}^{K} \pi_k \, p_k(x_i)   (1)
p_k(x_i) = (2\pi)^{-L/2} \, |\Sigma_k|^{-1/2} \exp\left( -\frac{1}{2} (x_i - \mu_k)^T \Sigma_k^{-1} (x_i - \mu_k) \right)   (2)
where \pi_k \ge 0 and \sum_{k=1}^{K} \pi_k = 1.   (3)
p(X) = \prod_{i=1}^{N_1 N_2} \sum_{k=1}^{K} \pi_k \, p_k(x_i)   (4)
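For numerical stability the likelihood (4) is usually evaluated in logarithmic form. A small NumPy sketch (the helper name and vectorized layout are our own assumptions):

```python
import numpy as np

def log_likelihood(X, pis, mus, sigmas):
    """log p(X) = sum_i log sum_k pi_k p_k(x_i), the logarithm of Eq. (4).

    X: (N, L) pixel vectors; pis: (K,); mus: (K, L); sigmas: (K, L, L).
    """
    N, L = X.shape
    mix = np.zeros(N)
    for pi_k, mu, sigma in zip(pis, mus, sigmas):
        diff = X - mu
        quad = np.einsum('nl,lm,nm->n', diff, np.linalg.inv(sigma), diff)
        mix += pi_k * (2 * np.pi) ** (-L / 2) * np.linalg.det(sigma) ** (-0.5) * np.exp(-0.5 * quad)
    return np.log(mix).sum()
```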
The EM algorithm estimates the parameters \pi_k, \mu_k, and \Sigma_k such that the likelihood (4) is maximized. It proceeds as follows:

1) Initialize the algorithm with \pi_k^{(0)}, \mu_k^{(0)}, and \Sigma_k^{(0)}, 1 \le k \le K.
2) E-step: compute the posterior probabilities

z_{ik}^{(m)} = \frac{\pi_k^{(m)} \, p_k(x_i)}{\sum_{k=1}^{K} \pi_k^{(m)} \, p_k(x_i)}   (5)
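The E-step of Eq. (5) can be vectorized over all pixels; a sketch assuming the pixels are stacked as an (N, L) array (our layout, not the paper's):

```python
import numpy as np

def e_step(X, pis, mus, sigmas):
    """Posterior probabilities z_ik of Eq. (5); returns an (N, K) array whose rows sum to 1."""
    N, L = X.shape
    K = len(pis)
    z = np.empty((N, K))
    for k in range(K):
        diff = X - mus[k]
        quad = np.einsum('nl,lm,nm->n', diff, np.linalg.inv(sigmas[k]), diff)
        z[:, k] = pis[k] * (2 * np.pi) ** (-L / 2) * np.linalg.det(sigmas[k]) ** (-0.5) * np.exp(-0.5 * quad)
    return z / z.sum(axis=1, keepdims=True)
```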
3) M-step: update the parameters for 1 \le k \le K:

\pi_k^{(m+1)} = \frac{1}{N_1 N_2} \sum_{i=1}^{N_1 N_2} z_{ik}^{(m)}   (6)

\mu_k^{(m+1)} = \frac{1}{N_1 N_2 \, \pi_k^{(m+1)}} \sum_{i=1}^{N_1 N_2} z_{ik}^{(m)} \, x_i   (7)

\Sigma_k^{(m+1)} = \frac{1}{N_1 N_2 \, \pi_k^{(m+1)}} \sum_{i=1}^{N_1 N_2} z_{ik}^{(m)} \, (x_i - \mu_k^{(m+1)})(x_i - \mu_k^{(m+1)})^T   (8)
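The M-step updates (6)-(8) are weighted averages over all pixels; a NumPy sketch assuming pixels stacked as an (N, L) array and posteriors as (N, K) (our layout):

```python
import numpy as np

def m_step(X, z):
    """Parameter updates of Eqs. (6)-(8) given posteriors z (N, K) and pixels X (N, L)."""
    N, L = X.shape
    K = z.shape[1]
    pis = z.sum(axis=0) / N                      # Eq. (6)
    mus = (z.T @ X) / (N * pis[:, None])         # Eq. (7)
    sigmas = np.empty((K, L, L))
    for k in range(K):
        diff = X - mus[k]
        sigmas[k] = (z[:, k, None] * diff).T @ diff / (N * pis[k])  # Eq. (8)
    return pis, mus, sigmas
```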
4) If the difference between the parameters in the m-th and (m+1)-th iterations is less than a prescribed threshold, the algorithm is terminated. Otherwise, set m = m + 1 and go to Step 2.
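Putting steps 1)-4) together, a minimal self-contained sketch of the EM loop (the percentile-based initialization, tolerance, and iteration cap are illustrative choices of ours, not the paper's):

```python
import numpy as np

def em_gmm(X, K, tol=1e-4, max_iter=200):
    """Fit a K-component Gaussian mixture to (N, L) data X via Eqs. (5)-(8)."""
    N, L = X.shape
    # Step 1: initialize pi, mu, Sigma (percentile spread is an arbitrary choice)
    pis = np.full(K, 1.0 / K)
    mus = np.percentile(X, np.linspace(10, 90, K), axis=0)
    sigmas = np.array([np.cov(X.T).reshape(L, L) for _ in range(K)])
    for _ in range(max_iter):
        # Step 2 (E-step): posteriors z_ik, Eq. (5)
        z = np.empty((N, K))
        for k in range(K):
            diff = X - mus[k]
            quad = np.einsum('nl,lm,nm->n', diff, np.linalg.inv(sigmas[k]), diff)
            z[:, k] = pis[k] * (2 * np.pi) ** (-L / 2) * np.linalg.det(sigmas[k]) ** (-0.5) * np.exp(-0.5 * quad)
        z /= z.sum(axis=1, keepdims=True)
        # Step 3 (M-step): Eqs. (6)-(8)
        new_pis = z.sum(axis=0) / N
        new_mus = (z.T @ X) / (N * new_pis[:, None])
        new_sigmas = np.empty_like(sigmas)
        for k in range(K):
            d = X - new_mus[k]
            new_sigmas[k] = (z[:, k, None] * d).T @ d / (N * new_pis[k])
        # Step 4: stop once the parameter change falls below the threshold
        done = np.abs(new_mus - mus).max() < tol
        pis, mus, sigmas = new_pis, new_mus, new_sigmas
        if done:
            break
    return pis, mus, sigmas
```

On well-separated data the loop typically converges in a few dozen iterations; in practice a small regularizer is often added to each covariance to avoid singular updates.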
0-7803-7930-6/$17.00 (C) 2003 IEEE
[Figure: classification result panels, (a) vegetation, (c) water bodies]

IV. CONCLUSIONS
The application of the EM and adaptive EM algorithms to remote sensing images is investigated. In both algorithms, the number of resulting classes can be selected automatically. The adaptive EM algorithm captures and models local statistics of an image scene, so small classes are detected and classified more accurately, at the cost of additional computation time; with the EM algorithm based on global statistics, such small classes are likely to be merged into large classes.
[Figure: classification result panels, (a) vegetation 1, (c) vegetation 2]