An object-based classification of mangroves using a hybrid decision tree-support vector machine approach

Research output: Contribution to journal › Article › peer-review



Mangroves provide valuable ecosystem goods and services such as carbon sequestration, habitat for terrestrial and marine fauna, and coastal hazard mitigation. The use of satellite remote sensing to map mangroves has become widespread, as it can provide accurate, efficient, and repeatable assessments. Traditional remote sensing approaches have failed to accurately map fringe mangroves and true mangrove species due to relatively coarse spatial resolution and/or spectral confusion with landward vegetation. This study demonstrates the use of the new WorldView-2 sensor, object-based image analysis (OBIA), and support vector machine (SVM) classification to overcome both of these limitations. An exploratory spectral separability analysis showed that individual mangrove species could not be spectrally separated, but a distinction between true and associate mangrove species could be made. An OBIA approach was used that combined a decision-tree classification with the machine-learning SVM classification. Results showed an overall accuracy greater than 94% (kappa = 0.863) for classifying true mangrove species and other dense coastal vegetation at the object level. There remain serious challenges to accurately mapping fringe mangroves using remote sensing data due to spectral similarity of mangrove and associate species, lack of clear zonation between species, and mixed pixel effects, especially when vegetation is sparse or degraded.
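The hybrid workflow described in the abstract can be sketched in code: a simple decision-tree-style rule first separates vegetated from non-vegetated objects, and an SVM then discriminates true mangroves from associate vegetation among the remaining objects. This is a minimal illustrative sketch, not the study's implementation; the features (per-object mean NDVI and NIR reflectance), thresholds, and synthetic class distributions are all assumptions.

```python
# Minimal sketch of a hybrid rule-based / SVM object classification.
# All feature names, thresholds, and class distributions are illustrative
# assumptions; real inputs would be per-object statistics from an OBIA
# segmentation of WorldView-2 imagery.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic per-object features: [mean NDVI, mean NIR reflectance]
n = 200
true_mangrove = rng.normal([0.75, 0.40], 0.05, size=(n, 2))
associate_veg = rng.normal([0.60, 0.28], 0.05, size=(n, 2))
bare_or_water = rng.normal([0.10, 0.08], 0.05, size=(n, 2))

X = np.vstack([true_mangrove, associate_veg, bare_or_water])
y = np.array([0] * n + [1] * n + [2] * n)  # 0/1 = vegetation, 2 = non-veg

# Step 1: decision-tree-style rule strips out non-vegetation objects
# (hypothetical NDVI threshold).
veg_mask = X[:, 0] > 0.4

# Step 2: SVM separates true mangroves from associate vegetation
# among the objects that passed the rule.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X[veg_mask], y[veg_mask])

labels = np.full(len(X), 2)          # default label: non-vegetation
labels[veg_mask] = clf.predict(X[veg_mask])
print((labels == y).mean())          # fraction of correctly labelled objects
```

In practice the rule-based stage would encode expert knowledge (e.g. water and bare-ground masks) so that the SVM only has to resolve the spectrally confusable vegetation classes, which is the part of the problem the abstract identifies as hardest.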

Original language: English
Pages (from-to): 2440-2460
Number of pages: 21
Journal: Remote Sensing
Issue number: 11
State: Published - Nov 2011


  • Classification
  • Decision tree
  • Galapagos Islands
  • Mangroves
  • OBIA
  • Support vector machine


