Information matrix computation from conditional information via normal approximation
Author
Abstract
This paper provides a method for computing the asymptotic covariance matrix from a likelihood function whose maximum likelihood estimate is known. Philosophically, the basic idea is that the likelihood function should be well approximated by a normal density whenever asymptotic results about the maximum likelihood estimate are applied for statistical inference. Technically, the method rests on two facts: the information for a one-dimensional parameter can be computed accurately when the log-likelihood is approximately quadratic over the range corresponding to a small confidence interval; and the covariance matrix of a normal distribution can be recovered from its one-dimensional conditional distributions, provided their sample spaces span the sample space of the joint distribution. We illustrate the method by applying it to a linear mixed-effects model.
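The abstract does not spell out the authors' algorithm, but the standard route it builds on — taking the asymptotic covariance as the inverse of the observed information at the MLE — can be sketched numerically. A minimal sketch, assuming a generic negative log-likelihood; the function names, step size, and the central finite-difference scheme are illustrative assumptions, not the paper's method:

```python
import numpy as np

def observed_information(neg_loglik, theta_hat, h=1e-4):
    """Central finite-difference Hessian of the negative log-likelihood
    at the MLE; its inverse estimates the asymptotic covariance."""
    theta_hat = np.asarray(theta_hat, dtype=float)
    p = theta_hat.size
    H = np.zeros((p, p))
    for i in range(p):
        for j in range(p):
            ei = np.zeros(p); ei[i] = h
            ej = np.zeros(p); ej[j] = h
            # four-point central difference for the (i, j) second derivative
            H[i, j] = (neg_loglik(theta_hat + ei + ej)
                       - neg_loglik(theta_hat + ei - ej)
                       - neg_loglik(theta_hat - ei + ej)
                       + neg_loglik(theta_hat - ei - ej)) / (4.0 * h * h)
    return H

# Sanity check on an exactly quadratic (i.e. normal) log-likelihood with a
# known covariance Sigma: the recovered covariance should match Sigma.
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
P = np.linalg.inv(Sigma)                 # precision (information) matrix
neg_loglik = lambda t: 0.5 * t @ P @ t   # -log N(0, Sigma), up to a constant
cov_est = np.linalg.inv(observed_information(neg_loglik, np.zeros(2)))
```

The connection to the paper's second fact: for a normal density the conditional variance of coordinate i given the rest is the reciprocal of the (i, i) entry of the precision matrix, which is why one-dimensional conditional information, taken along directions that span the parameter space, suffices to rebuild the full matrix.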
Similar works
Approximation of certain multivariate integrals
A Taylor series approximation to multivariate integrals taken with respect to a multivariate probability distribution is proposed and applied to the computation of multivariate normal probabilities and conditional expectations. The approximation does not require that the multivariate distribution have a structured covariance matrix and, in its simplest form, can be written as the product of uni...
Designs for generalized linear models with random block effects via information matrix approximations
The selection of optimal designs for generalized linear mixed models is complicated by the fact that the Fisher information matrix, on which most optimality criteria depend, is computationally expensive to evaluate. Our focus is on the design of experiments for likelihood estimation of parameters in the conditional model. We provide two novel approximations that substantially reduce the computa...
Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
Sufficient dimension reduction (SDR) is aimed at obtaining the low-rank projection matrix in the input space such that information about output data is maximally preserved. Among various approaches to SDR, a promising method is based on the eigendecomposition of the outer product of the gradient of the conditional density of output given input. In this letter, we propose a novel estimator of th...
On Conditional Applications of Matrix Variate Normal Distribution
In this paper, by conditioning on the matrix variate normal distribution (MVND) the construction of the matrix t-type family is considered, thus providing a new perspective of this family. Some important statistical characteristics are given. The presented t-type family is an extension to the work of Dickey [8]. A Bayes estimator for the column covariance matrix Σ of MVND is derived under ...
Bayesian Non-negative Matrix Factorization
We present a Bayesian treatment of non-negative matrix factorization (NMF), based on a normal likelihood and exponential priors, and derive an efficient Gibbs sampler to approximate the posterior density of the NMF factors. On a chemical brain imaging data set, we show that this improves interpretability by providing uncertainty estimates. We discuss how the Gibbs sampler can be used for model ...