Exploiting Sampling and Meta-learning for Parameter Setting for Support Vector Machines

Authors

KUBA Petr, BRAZDIL Pavel, SOARES Carlos, WOZNICA Adam

Year of publication 2002
Type Article in Proceedings
Conference Proc. of the Workshop on Learning and Data Mining associated with Iberamia 2002, VIII Iberoamerican Conference on Artificial Intelligence
MU Faculty or unit

Faculty of Informatics

Field Informatics
Keywords SVM; Meta-learning; parameter setting
Description It is well known that good parameter settings affect the performance of many machine learning algorithms; Support Vector Machines (SVM) and neural networks are particularly sensitive to them. In this paper we concentrate on SVM and discuss two ways of setting its parameters. The first approach uses small data samples, while the second exploits meta-learning and results of past experiments. Both methods have been thoroughly evaluated. We show that both approaches yield quite good results with significant savings in experimentation time.
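
The sampling idea described above can be illustrated with a minimal sketch, given here under stated assumptions: it uses scikit-learn with an RBF-kernel SVM, and the dataset, parameter grid, and subsample size are illustrative choices rather than the procedure or values from the paper. Hyperparameters are selected by grid search on a small subsample of the training data, and the final model is then trained on the full training set with the selected values.

# Minimal sketch of the small-sample idea (illustrative, not the paper's exact setup):
# select SVM hyperparameters on a small subsample, then train on all training data.
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Draw a small stratified subsample of the training data for the parameter search.
X_small, _, y_small, _ = train_test_split(
    X_train, y_train, train_size=0.2, stratify=y_train, random_state=0)

# Grid search over C and gamma on the subsample only, which keeps the search cheap.
search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1]},
    cv=3,
)
search.fit(X_small, y_small)

# Retrain with the selected parameters on the full training set and evaluate.
final_model = SVC(kernel="rbf", **search.best_params_).fit(X_train, y_train)
print(search.best_params_, final_model.score(X_test, y_test))

The meta-learning approach mentioned in the description instead draws on results of past experiments to suggest promising parameter values; that component is not sketched here.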
