Please use this identifier to cite or link to this item: http://repo.lib.jfn.ac.lk/ujrr/handle/123456789/809
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kesavan, Y. | - |
dc.contributor.author | Ramanan, A. | - |
dc.date.accessioned | 2016-01-02T11:01:34Z | - |
dc.date.accessioned | 2022-06-28T04:51:43Z | - |
dc.date.available | 2016-01-02T11:01:34Z | - |
dc.date.available | 2022-06-28T04:51:43Z | - |
dc.date.issued | 2014-12-22 | - |
dc.identifier.other | 10.1109/ICIAFS.2014.7069599 | - |
dc.identifier.uri | http://repo.lib.jfn.ac.lk/ujrr/handle/123456789/809 | - |
dc.description.abstract | A superpixel is an image patch that is better aligned with intensity edges than a rectangular patch. Superpixels are perceptually consistent units that carry more information than pixels and adhere well to image boundaries. Superpixels are now widely used for segmentation in computer vision and biomedical applications. There are many approaches to generating superpixels, such as the SLIC, QuickShift, Turbopixels and Normalized Cuts algorithms, each with its own advantages and drawbacks that may make it better suited to a particular application. In this paper we propose a one-pass clustering (OPC) technique to efficiently generate superpixels in the combined five-dimensional feature space of CIELAB colour and the XY image plane. The Berkeley segmentation dataset (BSDS500) is used to quantitatively compare the performance of OPC with the SLIC superpixel method, as measured by boundary recall and under-segmentation error. OPC superpixels achieve performance comparable to previously reported results, allowing a trade-off between less regularly spaced superpixels with accurate boundaries and better efficiency. | en_US |
dc.language.iso | en | en_US |
dc.publisher | IEEE | en_US |
dc.subject | Superpixels, One-pass clustering, Segmentation | en_US |
dc.title | One-Pass Clustering Superpixels | en_US |
dc.type | Article | en_US |
Appears in Collections: Computer Science
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
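
The record above only summarises the method, so the following Python sketch is an illustration of the general idea described in the abstract: each pixel is labelled with its nearest seed in the combined five-dimensional CIELAB + XY feature space in a single assignment pass, rather than through SLIC-style iterative refinement. The grid seeding, the compactness weight `m`, the spatial scaling, and the function name `one_pass_superpixels` are illustrative assumptions, not the authors' published OPC algorithm.

```python
# Hypothetical sketch only: the published OPC algorithm is not reproduced in this
# record. Grid seeding, the compactness weight m, and all names are assumptions.
import numpy as np
from skimage import color, io


def one_pass_superpixels(rgb, n_segments=400, m=10.0):
    """Label each pixel with its nearest seed in [L, a, b, m*y/S, m*x/S] space,
    using a single assignment pass instead of SLIC's iterative refinement."""
    lab = color.rgb2lab(rgb)
    h, w = lab.shape[:2]
    S = max(1, int(np.sqrt(h * w / n_segments)))   # approximate seed spacing

    # Seeds on a regular grid, described by 5D features (colour + scaled position).
    sy, sx = np.mgrid[S // 2:h:S, S // 2:w:S]
    sy, sx = sy.ravel(), sx.ravel()
    seeds = np.column_stack([lab[sy, sx], sy * (m / S), sx * (m / S)])

    # 5D features for every pixel.
    yy, xx = np.mgrid[0:h, 0:w]
    feats = np.column_stack([lab.reshape(-1, 3),
                             yy.ravel() * (m / S), xx.ravel() * (m / S)])

    # Single pass: keep, for each pixel, the closest seed seen so far.
    best = np.full(h * w, np.inf)
    labels = np.zeros(h * w, dtype=np.int32)
    for k, s in enumerate(seeds):
        d = ((feats - s) ** 2).sum(axis=1)
        closer = d < best
        best[closer] = d[closer]
        labels[closer] = k
    return labels.reshape(h, w)


if __name__ == "__main__":
    img = io.imread("example.jpg")        # placeholder path; any RGB image
    labels = one_pass_superpixels(img, n_segments=300)
    print("generated", labels.max() + 1, "superpixels")
```

To reproduce the comparison described in the abstract, the resulting label map would then be scored against BSDS500 ground-truth segmentations using boundary recall and under-segmentation error, alongside labels produced by a SLIC implementation such as `skimage.segmentation.slic`.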