Please use this identifier to cite or link to this item: http://repo.lib.jfn.ac.lk/ujrr/handle/123456789/1413
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Anantharajah, K.
dc.contributor.author: Denman, S.
dc.contributor.author: Sridharan, S.
dc.contributor.author: Fookes, C.
dc.contributor.author: Tjondronegoro, D.
dc.date.accessioned: 2021-02-15T05:15:41Z
dc.date.accessioned: 2022-06-27T09:57:58Z
dc.date.available: 2021-02-15T05:15:41Z
dc.date.available: 2022-06-27T09:57:58Z
dc.date.issued: 2012
dc.identifier.citation: Anantharajah, K., Denman, S., Sridharan, S., Fookes, C., & Tjondronegoro, D. (2012, December). Quality based frame selection for video face recognition. In 2012 6th International Conference on Signal Processing and Communication Systems (pp. 1-5). IEEE. [en_US]
dc.identifier.uri: http://repo.lib.jfn.ac.lk/ujrr/handle/123456789/1413
dc.description.abstract: Quality based frame selection is a crucial task in video face recognition, both to improve the recognition rate and to reduce the computational cost. In this paper we present a framework that uses a variety of cues (face symmetry, sharpness, contrast, closeness of mouth, brightness and openness of the eyes) to select the highest quality facial images available in a video sequence for recognition. Normalized feature scores are fused using a neural network, and frames with high quality scores are used in a Local Gabor Binary Pattern Histogram Sequence based face recognition system. Experiments on the Honda/UCSD database show that the proposed method selects the best quality face images in the video sequence, resulting in improved recognition performance. [en_US]
dc.language.iso: en [en_US]
dc.publisher: 6th International Conference on Signal Processing and Communication Systems [en_US]
dc.title: Quality based frame selection for video face recognition [en_US]
dc.type: Article [en_US]
Appears in Collections:Computer Engineering

Files in This Item:
File: Quality based frame selection for VFR_CR -2.pdf (86.15 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
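The abstract above describes fusing normalized per-frame quality cues with a neural network and keeping the highest-scoring frames for recognition. The sketch below illustrates that selection step only, under simplifying assumptions: the cue scores are assumed to be pre-computed per frame, and the fusion network is a tiny one-hidden-layer MLP with random, untrained weights (the paper trains this network on labelled data; the weight values here are purely illustrative).

```python
import numpy as np

def normalize_scores(scores):
    """Min-max normalize each cue column to [0, 1], as in score-level fusion."""
    scores = np.asarray(scores, dtype=float)
    lo, hi = scores.min(axis=0), scores.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard against constant columns
    return (scores - lo) / span

def fuse_quality(normed, w1, b1, w2, b2):
    """One-hidden-layer MLP forward pass -> scalar quality score per frame."""
    h = np.tanh(normed @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))  # sigmoid output in (0, 1)

def select_frames(cue_scores, k, seed=0):
    """Return indices of the k frames with the highest fused quality score.

    cue_scores: (n_frames, n_cues) array, e.g. columns for symmetry,
    sharpness, contrast, mouth closeness, brightness, eye openness.
    NOTE: weights are random placeholders, not the trained network.
    """
    rng = np.random.default_rng(seed)
    normed = normalize_scores(cue_scores)
    n_cues = normed.shape[1]
    w1 = rng.standard_normal((n_cues, 8))
    b1 = np.zeros(8)
    w2 = rng.standard_normal(8)
    b2 = 0.0
    quality = fuse_quality(normed, w1, b1, w2, b2)
    return np.argsort(quality)[::-1][:k]
```

In a full pipeline, the selected frame indices would feed the Local Gabor Binary Pattern Histogram Sequence matcher; that stage is outside the scope of this sketch.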