Please use this identifier to cite or link to this item: http://repo.lib.jfn.ac.lk/ujrr/handle/123456789/2155
Full metadata record
DC Field | Value | Language
dc.contributor.author | Madurapperuma, B.D. | -
dc.contributor.author | Kahara, S.N. | -
dc.contributor.author | Hernandez, K.B. | -
dc.contributor.author | Corro, L.M. | -
dc.date.accessioned | 2021-03-26T06:38:41Z | -
dc.date.accessioned | 2022-07-07T05:06:57Z | -
dc.date.available | 2021-03-26T06:38:41Z | -
dc.date.available | 2022-07-07T05:06:57Z | -
dc.date.issued | 2020 | -
dc.identifier.uri | http://repo.lib.jfn.ac.lk/ujrr/handle/123456789/2155 | -
dc.description.abstract | The purpose of this study is to use Unmanned Aerial Systems (UAS) imagery for mapping wetland vegetation with object-based classification methods and to compare its performance against cropland data layers. UAS imagery (~0.1-m resolution) and National Agriculture Imagery Program (NAIP; ~0.6-m resolution) data were used to extract wetland vegetation through object-based classification in ArcGIS Pro. Spectral indices, such as the green chromatic coordinate (GCC) and the normalized difference vegetation index (NDVI), coupled with unsupervised classification, were used for vegetation classification (a minimal computational sketch of these indices follows the metadata record below). UAS imagery performed slightly better than NAIP, yielding 49% vegetation cover in 2019, compared with 45% in 2018 and 35% in 2016 from NAIP classification. According to the cropland data classification, the open-water land-cover class also covered a large portion of the study area. In conclusion, object-based classification of high-resolution imagery has good potential to be integrated with ground surveys to implement best management practices for restoring wetlands. | en_US
dc.language.iso | en | en_US
dc.publisher | University of Jaffna | en_US
dc.subject | UAS | en_US
dc.subject | Wetlands | en_US
dc.subject | Object-based classification | en_US
dc.subject | Spectral indices | en_US
dc.subject | Cropland data | en_US
dc.title | High-resolution data for capturing wetland vegetation using object-based classification methods | en_US
dc.type | Article | en_US
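
The abstract leans on two spectral indices, the green chromatic coordinate (GCC = G / (R + G + B)) and the normalized difference vegetation index (NDVI = (NIR - Red) / (NIR + Red)). The following is a minimal Python sketch of how those indices are commonly computed per pixel, with a crude median-threshold split standing in for the unsupervised classification step. The band arrays, the threshold rule, and the printed vegetation fraction are all illustrative assumptions; this does not reproduce the ArcGIS Pro object-based workflow the study actually used.

import numpy as np

def gcc(red, green, blue):
    """Green chromatic coordinate: green's share of total visible
    brightness, GCC = G / (R + G + B). Zero where the sum is zero."""
    total = red + green + blue
    return np.divide(green, total,
                     out=np.zeros_like(total, dtype=float),
                     where=total > 0)

def ndvi(nir, red):
    """Normalized difference vegetation index,
    NDVI = (NIR - Red) / (NIR + Red). Zero where the sum is zero."""
    total = nir + red
    return np.divide(nir - red, total,
                     out=np.zeros_like(total, dtype=float),
                     where=total > 0)

# Hypothetical stand-ins for UAS band rasters (8-bit values); a real
# workflow would read these bands from the orthomosaic instead.
rng = np.random.default_rng(0)
red, green, blue = (rng.integers(0, 256, (100, 100)).astype(float)
                    for _ in range(3))

g = gcc(red, green, blue)

# Crude unsupervised split: pixels above the scene-median GCC are
# labeled vegetation. This threshold is an assumption for illustration,
# not the object-based segmentation described in the abstract.
veg_mask = g > np.median(g)
print(f"vegetation fraction: {veg_mask.mean():.0%}")

With real rasters one would typically segment the image into objects first and classify the objects rather than individual pixels; the per-pixel threshold here is only meant to make the index arithmetic concrete.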
Appears in Collections: FARS 2020

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.