University of Wollongong
Learning kernel parameters by using class separability measure

conference contribution
posted on 2024-11-13, 15:36 authored by Lei Wang, Kap Luk Chan
Learning kernel parameters is important for kernel-based methods because these parameters have a significant impact on the generalization ability of such methods. Besides cross-validation and leave-one-out estimation, minimizing an upper bound on the generalization error, such as the radius-margin bound, has also been proposed as a more efficient way to learn the optimal kernel parameters. In this paper, a class separability criterion is proposed for learning kernel parameters: the optimal kernel parameters are taken to be those that maximize the class separability in the induced feature space. With this criterion, learning the kernel parameters of an SVM avoids solving the quadratic programming problem. The relationship between this criterion and the radius-margin bound is also explored. Both theoretical analysis and experimental results show that the class separability criterion is effective for learning kernel parameters for SVMs.
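The idea can be sketched numerically. The following is a minimal illustration, not the authors' exact formulation: it measures class separability in the kernel-induced feature space as the squared distance between the two class means divided by the within-class scatter, with every quantity computed from the kernel matrix via the kernel trick, and then selects the RBF kernel width that maximizes this ratio over a grid. Note that no quadratic program is solved at any point.

```python
import numpy as np

def rbf_kernel(X, Y, sigma):
    """RBF (Gaussian) kernel matrix between row-sample matrices X and Y."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    d2 = np.maximum(d2, 0)  # guard against tiny negative values from round-off
    return np.exp(-d2 / (2 * sigma**2))

def class_separability(X, y, sigma):
    """A simple separability measure in feature space: squared distance between
    class means divided by total within-class scatter, via the kernel trick."""
    A, B = X[y == 0], X[y == 1]
    Kaa = rbf_kernel(A, A, sigma)
    Kbb = rbf_kernel(B, B, sigma)
    Kab = rbf_kernel(A, B, sigma)
    # ||mu_A - mu_B||^2 in feature space
    between = Kaa.mean() + Kbb.mean() - 2 * Kab.mean()
    # average squared distance of each sample to its own class mean
    within = (np.diag(Kaa).mean() - Kaa.mean()) + (np.diag(Kbb).mean() - Kbb.mean())
    return between / (within + 1e-12)

# toy two-class data (hypothetical example, not from the paper)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.repeat([0, 1], 50)

# grid search: pick the kernel width that maximizes class separability
sigmas = [0.1, 0.5, 1.0, 2.0, 5.0, 10.0]
best = max(sigmas, key=lambda s: class_separability(X, y, s))
```

Because the criterion is a closed-form function of the kernel matrix, evaluating a candidate parameter costs only one kernel computation, which is the efficiency argument the abstract makes against repeatedly training an SVM during cross-validation.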

History

Citation

Wang, L., & Chan, K. L. (2002). Learning kernel parameters by using class separability measure. The sixth kernel machines workshop, in conjunction with Neural Information Processing Systems (NIPS) (pp. 1-8).

Pagination

1-8

Language

English

RIS ID

54424
