University of Wollongong

A kernel-induced space selection approach to model selection of KLDA

journal contribution
posted on 2024-11-13, 23:52 authored by Lei Wang, Kap Luk Chan, Ping Xue, Luping Zhou
Model selection in kernel linear discriminant analysis (KLDA) refers to the selection of appropriate parameters for the kernel function and the regularizer. Following the principle of maximum information preservation, this paper formulates the model selection problem as one of selecting an optimal kernel-induced space in which different classes are maximally separated from each other. A scatter-matrix-based criterion is developed to measure the "goodness" of a kernel-induced space, and the kernel parameters are tuned by maximizing this criterion. The criterion is computationally efficient and differentiable with respect to the kernel parameters. Compared with leave-one-out (LOO) or k-fold cross-validation (CV), the proposed approach achieves faster model selection, especially when the number of training samples is large or when many kernel parameters need to be tuned. To tune the regularization parameter in KLDA, the criterion is used together with the method proposed by Saadi et al. (2004). Experiments on benchmark data sets verify the effectiveness of this model selection approach.
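As an illustration of the idea in the abstract (not the paper's exact criterion), a scatter-matrix-based class-separability measure can be evaluated entirely in the kernel-induced space through kernel evaluations, since the traces of the between-class and total scatter matrices depend on the data only through the kernel matrix. The sketch below assumes a Gaussian (RBF) kernel and uses the ratio trace(S_b)/trace(S_w) as a stand-in criterion; the function names and the choice of ratio are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_space_separability(X, y, gamma):
    """Illustrative criterion: trace(S_b) / trace(S_w), computed implicitly
    in the kernel-induced feature space via kernel evaluations only.

    Uses the identities (phi = implicit feature map, K = kernel matrix):
      trace(S_t) = sum_i K_ii - (1/n) * sum_ij K_ij
      trace(S_b) = sum_c (1/n_c) * sum_{i,j in c} K_ij - (1/n) * sum_ij K_ij
      trace(S_w) = trace(S_t) - trace(S_b)
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    tr_total = np.trace(K) - K.sum() / n
    tr_between = -K.sum() / n
    for c in np.unique(y):
        idx = (y == c)                                   # samples of class c
        tr_between += K[np.ix_(idx, idx)].sum() / idx.sum()
    tr_within = tr_total - tr_between                    # S_t = S_b + S_w
    return tr_between / tr_within

# Model selection by maximizing the criterion over a grid of kernel widths
# (the paper tunes parameters by gradient ascent; a grid keeps the sketch short).
def select_gamma(X, y, gammas):
    scores = [kernel_space_separability(X, y, g) for g in gammas]
    return gammas[int(np.argmax(scores))]
```

Because the criterion needs only one kernel-matrix evaluation per candidate, this avoids the repeated KLDA trainings required by LOO or k-fold CV, which is the speed advantage the abstract refers to.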

History

Citation

Wang, L., Chan, K. L., Xue, P. & Zhou, L. (2008). A kernel-induced space selection approach to model selection of KLDA. IEEE Transactions on Neural Networks, 19 (12), 2116-2131.

Journal title

IEEE Transactions on Neural Networks

Volume

19

Issue

12

Pagination

2116-2131

Language

English

RIS ID

54292
