University of Wollongong

Variational inference for infinite mixtures of sparse Gaussian processes through KL-correction

conference contribution
posted on 2024-11-14, 09:13 authored by Thi Nhat Anh Nguyen, Abdesselam Bouzerdoum, Son Lam Phung
We propose a new approximation method for Gaussian process (GP) regression based on the mixture-of-experts structure and variational inference. Our model is essentially an infinite mixture model in which each component is composed of a Gaussian distribution over the input space and a Gaussian process expert over the output space. Each expert is a sparse GP model augmented with its own set of inducing points. Variational inference is made feasible by assuming that the training outputs are independent given the inducing points. In previous work on variational mixtures of GP experts, the inducing points are selected through a greedy selection algorithm, which is computationally expensive. In our method, both the inducing points and the hyperparameters of the experts are learned by maximizing an improved lower bound on the marginal likelihood. Experiments on benchmark datasets show the advantages of the proposed method.
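The per-expert sparse GP construction described in the abstract can be illustrated with a minimal numpy sketch of a Titsias-style collapsed variational lower bound for a single sparse GP with inducing inputs. This is a hedged illustration of the general technique, not the paper's full mixture model or its KL-corrected bound; the RBF kernel, the noise level, and all function names below are assumptions made for the example.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between row-wise inputs A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_lower_bound(X, y, Z, noise=0.1):
    """Collapsed variational lower bound on log p(y) for one sparse GP expert
    with inducing inputs Z (Titsias-style bound; illustrative only)."""
    n = X.shape[0]
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(len(Z))   # jitter for numerical stability
    Knm = rbf(X, Z)
    Qnn = Knm @ np.linalg.solve(Kmm, Knm.T)   # Nystrom approximation of Knn
    cov = Qnn + noise * np.eye(n)
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    log_gauss = -0.5 * (logdet + quad + n * np.log(2.0 * np.pi))
    # Trace correction penalises the gap between Knn and its approximation Qnn;
    # diag of the RBF kernel is just its variance (1.0 here).
    trace_corr = -0.5 / noise * (n * 1.0 - np.trace(Qnn))
    return log_gauss + trace_corr
```

Maximizing this quantity with respect to the inducing inputs `Z` and kernel hyperparameters (e.g. by gradient ascent) is the kind of optimization the paper performs jointly across all experts, replacing the greedy inducing-point selection used in earlier variational mixtures of GP experts. The bound never exceeds the exact log marginal likelihood of the full GP.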

History

Citation

T. Nguyen, A. Bouzerdoum & S. L. Phung, "Variational inference for infinite mixtures of sparse Gaussian processes through KL-correction," in 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016, pp. 2579-2583.

Parent title

ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings

Volume

2016-May

Pagination

2579-2583

Language

English

RIS ID

107986
