Please use this identifier to cite or link to this item:
http://repo.lib.jfn.ac.lk/ujrr/handle/123456789/1291
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Kokul, T. | |
dc.date.accessioned | 2019-11-25T07:06:49Z | |
dc.date.accessioned | 2022-06-27T04:11:21Z | - |
dc.date.available | 2019-11-25T07:06:49Z | |
dc.date.available | 2022-06-27T04:11:21Z | - |
dc.date.issued | 2015-10-06 | |
dc.identifier.issn | 955-8787-07-6 | |
dc.identifier.uri | http://repo.lib.jfn.ac.lk/ujrr/handle/123456789/1291 | - |
dc.description.abstract | The support vector machine (SVM) is an efficient classification technique that is widely used in machine learning applications because its generalization performance often surpasses that of other classifiers. Several SVM tools are available, but they differ in implementation and efficiency. This study evaluates three popular SVM tools, LIBSVM, SVMlight and the MATLAB-packaged SVM, and four wrapper-based feature selection techniques, sequential forward selection (SFS), sequential backward selection (SBS), sequential forward floating selection (SFFS) and sequential backward floating selection (SBFS), on classification tasks. The evaluation was performed on five benchmark numerical datasets: Segment, Vehicle and Satimage from the UCI machine learning repository, which are multiclass problems, and Madelon and Gisette from the NIPS 2003 feature selection challenge, which are binary classification problems. Each dataset was scaled to [-1, 1] and classified using one-against-all (OVA) SVMs, comparing linear and RBF kernels. Differences in performance among the SVM tools were tested for statistical significance using an ANOVA test. The results show that LIBSVM and SVMlight outperform the MATLAB-packaged SVM in classification, and LIBSVM performs nearly as well as SVMlight. Moreover, LIBSVM trains faster when the data are in dense format; its kernel evaluation on sparse vectors is slower, so its total training time on sparse data is at least twice that on dense data. SVMlight handles this better and trains faster when using the sparse format. In addition, the feature selection techniques were evaluated using LIBSVM. The results show that SFFS yields more compact feature sets while maintaining classification performance comparable to the other feature selection techniques. Based on the experimental results, LIBSVM can be considered the better tool, and SFFS gives the most compact feature sets, which can then be classified using LIBSVM. | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | Postgraduate Institute of Science, University of Peradeniya | en_US |
dc.subject | Support vector machine | en_US |
dc.subject | Feature selection | en_US |
dc.title | A Performance Evaluation of various SVM Tools and Different Feature Selection Strategies | en_US |
dc.type | Article | en_US |
Appears in Collections: | Physical Science |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
A Performance Evaluation of various SVM Tools and Different Feature Selection Strategies.pdf | | 348.1 kB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.