Bayesian Gaussian process classification with the EM-EP algorithm

HC Kim, Z Ghahramani. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006. ieeexplore.ieee.org
Gaussian process classifiers (GPCs) are Bayesian probabilistic kernel classifiers. In GPCs, the probability of belonging to a certain class at an input location is monotonically related to the value of some latent function at that location. Starting from a Gaussian process prior over this latent function, data are used to infer both the posterior over the latent function and the values of hyperparameters that determine various aspects of the function. Recently, the expectation propagation (EP) approach has been proposed to infer the posterior over the latent function. Based on this work, we present an approximate EM algorithm, the EM-EP algorithm, to learn both the latent function and the hyperparameters. This algorithm is found to converge in practice and provides an efficient Bayesian framework for learning hyperparameters of the kernel. A multiclass extension of the EM-EP algorithm for GPCs is also derived. In the experimental results, the EM-EP algorithms are as good as or better than other methods for GPCs or Support Vector Machines (SVMs) with cross-validation.
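The sketch below illustrates the general setup the abstract describes: a GP prior over a latent function, an approximate posterior for classification, and kernel hyperparameters learned by maximizing an approximate marginal likelihood. It is not the paper's EM-EP implementation; scikit-learn's GaussianProcessClassifier uses a Laplace approximation rather than EP, and the toy data, kernel choice, and parameter values are illustrative assumptions only.

```python
# Minimal sketch (not the paper's EM-EP algorithm): binary GP classification
# with an RBF kernel. scikit-learn's GaussianProcessClassifier approximates
# the latent-function posterior with a Laplace approximation (not EP) and
# fits kernel hyperparameters by maximizing the approximate log marginal
# likelihood, which is analogous in spirit to alternating posterior inference
# and hyperparameter updates. The toy data below is purely illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Toy two-class data: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-1.0, 0.7, size=(50, 2)),
               rng.normal(+1.0, 0.7, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# GP prior over the latent function: signal variance times an RBF kernel.
kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)

# Kernel hyperparameters (length-scale, signal variance) are optimized
# internally against the approximate log marginal likelihood.
gpc = GaussianProcessClassifier(kernel=kernel, n_restarts_optimizer=2,
                                random_state=0).fit(X, y)

print("Learned kernel:", gpc.kernel_)
print("Approx. log marginal likelihood:",
      gpc.log_marginal_likelihood(gpc.kernel_.theta))

# Predictive class probabilities at new inputs come from pushing the
# latent posterior through the link function.
X_new = np.array([[-1.0, -1.0], [1.0, 1.0], [0.0, 0.0]])
print("P(class=1):", gpc.predict_proba(X_new)[:, 1])
```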