Bayesian Logistic Regression Model Choice via Laplace-Metropolis Algorithm

Authors

  • Farzad Eskandari
  • M. Reza Meshkani
Abstract:

Following a Bayesian statistical inference paradigm, we provide an alternative methodology for analyzing a multivariate logistic regression model. We use a multivariate normal prior in the Bayesian analysis and present a unique Bayes estimator, associated with this prior, which is admissible. The Bayes estimators of the coefficients of the model are obtained via MCMC methods. The proposed procedure is illustrated by analyzing a data set which has previously been analyzed by various authors. It is shown that our model is more precise and computationally less taxing.
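The approach the abstract describes can be sketched as follows: draw posterior samples of the logistic coefficients with a Metropolis sampler, then estimate the log marginal likelihood for model choice with the Laplace-Metropolis plug-in (posterior mean and sample covariance of the draws substituted into the Laplace formula, as in Lewis and Raftery, 1997). This is a minimal illustration, not the authors' exact specification: the isotropic N(0, tau^2 I) prior, the random-walk proposal, and all function names are assumptions made for the sketch.

```python
import numpy as np

def log_posterior(beta, X, y, tau2=10.0):
    """Log posterior: Bernoulli-logit likelihood plus a N(0, tau2 I) prior
    (an illustrative choice; the paper uses a general multivariate normal prior)."""
    eta = X @ beta
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))  # numerically stable log-likelihood
    logprior = -0.5 * beta @ beta / tau2 - 0.5 * len(beta) * np.log(2 * np.pi * tau2)
    return loglik + logprior

def metropolis(X, y, n_iter=5000, step=0.1, seed=0):
    """Random-walk Metropolis sampler for the logistic regression coefficients."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    beta = np.zeros(d)
    lp = log_posterior(beta, X, y)
    draws = np.empty((n_iter, d))
    for t in range(n_iter):
        prop = beta + step * rng.standard_normal(d)   # Gaussian random-walk proposal
        lp_prop = log_posterior(prop, X, y)
        if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
            beta, lp = prop, lp_prop
        draws[t] = beta
    return draws

def laplace_metropolis_logml(draws, X, y):
    """Laplace-Metropolis estimate of the log marginal likelihood:
    log m(y) ~ (d/2) log 2*pi + (1/2) log|S| + log p(y|b*) + log p(b*),
    with b* the posterior mean and S the sample covariance of the MCMC draws."""
    d = draws.shape[1]
    beta_star = draws.mean(axis=0)
    cov = np.cov(draws, rowvar=False).reshape(d, d)
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * d * np.log(2 * np.pi) + 0.5 * logdet + log_posterior(beta_star, X, y)
```

For model choice, one would compute `laplace_metropolis_logml` under each candidate model (e.g. with and without a covariate) and compare the resulting marginal likelihoods, which approximates a Bayes factor.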


similar resources

Bayesian multivariate logistic regression.

Bayesian analyses of multivariate binary or categorical outcomes typically rely on probit or mixed effects logistic regression models that do not have a marginal logistic structure for the individual outcomes. In addition, difficulties arise when simple noninformative priors are chosen for the covariance parameters. Motivated by these problems, we propose a new type of multivariate logistic dis...


Approximate Bayesian logistic regression via penalized likelihood by data augmentation

We present a command, penlogit, for approximate Bayesian logistic regression using penalized likelihood estimation via data augmentation. This command automatically adds specific prior-data records to a dataset. These records are computed so that they generate a penalty function for the log-likelihood of a logistic model, which equals (up to an additive constant) a set of independent log prior ...


Sparse Multinomial Logistic Regression via Bayesian L1 Regularisation

Multinomial logistic regression provides the standard penalised maximum-likelihood solution to multi-class pattern recognition problems. More recently, the development of sparse multinomial logistic regression models has found application in text processing and microarray classification, where explicit identification of the most informative features is of value. In this paper, we propose a spars...


Exploiting Monotonicity via Logistic Regression in Bayesian Network Learning

An important challenge in machine learning is to find ways of learning quickly from very small amounts of training data. The only way to learn from small data samples is to constrain the learning process by exploiting background knowledge. In this report, we present a theoretical analysis on the use of constrained logistic regression for estimating conditional probability distribution in Bayesi...


Sparse Bayesian kernel logistic regression

In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on MacKay's evidence approximation. The model is re-parameterised such that an isotropic Gaussian prior over parameters in the kernel induced feature space is replaced by an isotropic Gaussian prior over the transformed parameters, facilitating a Bayesian analysis using stan...


Bayesian computation for logistic regression

A method for the simulation of samples from the exact posterior distributions of the parameters in logistic regression is proposed. It is based on the principle of data augmentation and a latent variable is introduced, similar to the approach of Albert and Chib (J. Am. Stat. Assoc. 88 (1993) 669), who applied it to the probit model. In general, the full conditional distributions are intractable...




Journal title

volume 5, issue none

pages 9–24

publication date 2006-11


