Parameter inference with estimated covariance matrices

E. Sellentin, A. F. Heavens
Monthly Notices of the Royal Astronomical Society: Letters, 2015, academic.oup.com
Abstract
When inferring parameters from a Gaussian-distributed data set by computing a likelihood, a covariance matrix is needed that describes the data errors and their correlations. If the covariance matrix is not known a priori, it may be estimated and thereby becomes a random object with some intrinsic uncertainty itself. We show how to infer parameters in the presence of such an estimated covariance matrix, by marginalizing over the true covariance matrix, conditioned on its estimated value. This leads to a likelihood function that is no longer Gaussian, but rather an adapted version of a multivariate t-distribution, which has the same numerical complexity as the multivariate Gaussian. As expected, marginalization over the true covariance matrix improves inference when compared with Hartlap et al.'s method, which uses an unbiased estimate of the inverse covariance matrix but still assumes that the likelihood is Gaussian.
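The abstract does not reproduce the formulas, so the following is a minimal sketch of the two likelihoods it contrasts: the modified multivariate t-distribution that results from marginalizing over the true covariance matrix, and a Gaussian likelihood with the Hartlap et al. debiasing factor applied to the inverse covariance estimate. Here p is the length of the data vector, N is the number of independent realizations used to build the covariance estimate, and the function names and toy data are illustrative assumptions, not the paper's code.

```python
import numpy as np
from scipy.special import gammaln

def logL_sellentin_heavens(x, mu, S_hat, N):
    """Log-likelihood marginalized over the true covariance matrix,
    given an estimate S_hat built from N independent realizations.
    Sketch of the modified multivariate t-distribution:
      L = c_p |S_hat|^{-1/2} [1 + chi2/(N-1)]^{-N/2},
    with c_p = Gamma(N/2) / [ (pi (N-1))^{p/2} Gamma((N-p)/2) ]."""
    p = x.size
    r = x - mu
    chi2 = r @ np.linalg.solve(S_hat, r)          # (x-mu)^T S_hat^{-1} (x-mu)
    _, logdet = np.linalg.slogdet(S_hat)
    log_norm = (gammaln(N / 2.0)
                - gammaln((N - p) / 2.0)
                - 0.5 * p * np.log(np.pi * (N - 1)))
    return log_norm - 0.5 * logdet - 0.5 * N * np.log1p(chi2 / (N - 1))

def logL_gaussian_hartlap(x, mu, S_hat, N):
    """Gaussian log-likelihood using the Hartlap et al. (2007) factor
    alpha = (N - p - 2)/(N - 1), which unbiases E[S_hat^{-1}].
    Equivalent to a Gaussian with effective covariance S_hat / alpha."""
    p = x.size
    alpha = (N - p - 2.0) / (N - 1.0)
    r = x - mu
    chi2 = alpha * (r @ np.linalg.solve(S_hat, r))
    _, logdet = np.linalg.slogdet(S_hat)
    logdet_eff = logdet - p * np.log(alpha)        # log |S_hat / alpha|
    return -0.5 * (p * np.log(2.0 * np.pi) + logdet_eff + chi2)

# Toy comparison: p = 4 data points, covariance estimated from N = 20 sims.
rng = np.random.default_rng(0)
p, N = 4, 20
true_cov = 2.0 * np.eye(p)
sims = rng.multivariate_normal(np.zeros(p), true_cov, size=N)
S_hat = np.cov(sims, rowvar=False)                 # (N-1)-normalized estimate
x = rng.multivariate_normal(np.zeros(p), true_cov)
print(logL_sellentin_heavens(x, np.zeros(p), S_hat, N))
print(logL_gaussian_hartlap(x, np.zeros(p), S_hat, N))
```

As the abstract notes, the marginalized likelihood has the same numerical cost as the Gaussian: both reduce to one linear solve and one log-determinant per evaluation, differing only in how chi-squared enters the final expression.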