Remote Sensing Classifications
Mahesh Pal
Department of Civil Engineering
N. I. T. Kurukshetra
Haryana, India
Data used
Results
Bagging
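Bagging (bootstrap aggregating) trains each classifier in the ensemble on a bootstrap sample of the training data, drawn with replacement, and combines the classifiers by unweighted majority vote. A minimal sketch, assuming a hypothetical `learn(data)` interface for the base classifier (the study's actual base classifiers were a decision tree and a neural network):

```python
import random
from collections import Counter

def bagging(train, learn, n_rounds=10, seed=0):
    """Bootstrap aggregating: each round trains a classifier on a
    bootstrap sample of the training data (drawn with replacement).
    `learn(data)` -> classifier h, with h(x) -> label, is a
    hypothetical interface used only for this sketch."""
    rng = random.Random(seed)
    return [learn([rng.choice(train) for _ in train])  # bootstrap sample
            for _ in range(n_rounds)]

def vote(ensemble, x):
    """Classify x by unweighted majority vote over the bagged classifiers."""
    return Counter(h(x) for h in ensemble).most_common(1)[0][0]
```

The study ran 10 such rounds, matching the `n_rounds=10` default here.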
Boosting
Boosting
A method that uses the same training data with different weights
to produce a series of classifiers.
Boosting assigns a weight to each observation; the higher the
weight, the more that observation influences the classifier.
At each trial, the vector of weights is adjusted and the weight
of misclassified observations is increased.
The final classifier aggregates the classifiers generated after
each iteration by voting.
Each classifier's vote is a function of its accuracy.
For this study, AdaBoost.M1 (Freund and Schapire, 1996) is
used.
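The weight-update and weighted-voting steps above can be sketched as follows; the `learn(data, weights)` base-learner interface is an assumption for illustration, not the study's implementation:

```python
import math

def adaboost_m1(train, learn, n_rounds=10):
    """AdaBoost.M1 (Freund & Schapire, 1996): reweight the training
    observations each round so that misclassified ones gain weight.
    `train` is a list of (x, y) pairs; `learn(data, weights)` must
    return a classifier h with h(x) -> label (hypothetical interface)."""
    n = len(train)
    w = [1.0 / n] * n                       # uniform initial weights
    ensemble = []                           # (classifier, vote weight) pairs
    for _ in range(n_rounds):
        h = learn(train, w)
        err = sum(wi for wi, (x, y) in zip(w, train) if h(x) != y)
        if err == 0.0 or err >= 0.5:        # M1 stopping conditions
            if err == 0.0:
                ensemble.append((h, 1.0))   # perfect round: keep with unit vote
            break
        beta = err / (1.0 - err)
        # down-weight correctly classified observations, then renormalise
        w = [wi * (beta if h(x) == y else 1.0)
             for wi, (x, y) in zip(w, train)]
        s = sum(w)
        w = [wi / s for wi in w]
        ensemble.append((h, math.log(1.0 / beta)))  # vote grows with accuracy
    return ensemble

def predict(ensemble, x):
    """Final classifier: weighted vote of all round classifiers."""
    votes = {}
    for h, alpha in ensemble:
        label = h(x)
        votes[label] = votes.get(label, 0.0) + alpha
    return max(votes, key=votes.get)
```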
Data used
ETM+ data acquired near the town of Littleport, UK, in 2000.
The classification problem involves seven land cover types
(wheat, potato, sugar beet, onion, peas, lettuce and beans).
A reference image was created using a field survey.
A total of 4737 pixels were randomly selected.
These pixels were divided into two parts: 2700 for training
and 2037 for testing.
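The 2700/2037 split above amounts to a random partition of the sampled pixels; a minimal sketch (the function name and fixed seed are illustrative assumptions, not from the study):

```python
import random

def split_pixels(pixels, n_train=2700, seed=0):
    """Randomly partition the sampled pixels into training and testing
    sets (2700 / 2037 in the study). `pixels` is any list of samples."""
    rng = random.Random(seed)     # fixed seed only for repeatability
    shuffled = pixels[:]          # copy, so the input list is untouched
    rng.shuffle(shuffled)
    return shuffled[:n_train], shuffled[n_train:]
```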
Classification methods
A univariate decision tree classifier with gain ratio as the
attribute selection measure and error-based pruning was used.
A backpropagation neural network with one hidden layer
having twenty-six nodes was used.
Studies with a validation data set suggest that 2200 iterations
with a learning rate of 0.25 provide good results.
Ten iterations of bagging and boosting were carried out.
Total accuracy and the kappa coefficient were calculated using
confusion matrices.
Training time of both classifiers with bagging and boosting
was also recorded.
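Total accuracy and the kappa coefficient can both be read directly off a confusion matrix; a minimal sketch using the standard formulas (the function name is illustrative, not code from the study):

```python
def accuracy_and_kappa(cm):
    """Overall accuracy and kappa coefficient from a square confusion
    matrix, where cm[i][j] counts pixels of reference class i that
    were labelled as class j."""
    k = len(cm)
    n = sum(sum(row) for row in cm)                   # total pixels
    po = sum(cm[i][i] for i in range(k)) / n          # observed agreement
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm)   # chance agreement
             for i in range(k)) / (n * n)
    return po, (po - pe) / (1 - pe)
```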
Results
Classification accuracies

Data set: ETM+

                           Decision tree            Neural network
                           Accuracy (%)   Kappa     Accuracy (%)   Kappa
Without bagging/boosting   83.8           …         …              …
Boosting                   …              …         …              0.86
Bagging                    …              …         …              0.89
Training time

Classifier        Without boosting/bagging   Boosting          Bagging
Neural network    975.4 seconds              10846.1 seconds   10200.9 seconds
Decision tree     0.53 seconds               17.02 seconds     12.4 seconds
Conclusions
A small percentage increase in accuracy is difficult to
achieve when the overall classification accuracy
exceeds 80%, so:
DT performs very well with both boosting and bagging,
improving accuracy by about 4%.
An improvement of about 2.5% in classification accuracy
with NN is good, as these classifiers are more stable than DT.
Training time with NN is quite large in comparison with the
DT classifier.
NN performs better with bagging than with boosting.