High-dimensional Ising model selection with Bayesian information criteria
Authors
Abstract
Similar references
Composite Likelihood Bayesian Information Criteria for Model Selection in High-Dimensional Data
For high-dimensional data sets with complicated dependency structures, the full likelihood approach often leads to intractable computational complexity. This imposes difficulty on model selection, given that most traditionally used information criteria require evaluation of the full likelihood. We propose a composite likelihood version of the Bayes information criterion (BIC) and establish its ...
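As background (an illustrative sketch, not drawn from the truncated abstract above), composite likelihood variants of BIC generally replace the full log-likelihood with a composite log-likelihood cℓ and correct the dimension penalty through the sensitivity matrix H and variability matrix J; the notation below is generic rather than the paper's own:

\[
\mathrm{BIC}_{CL} = -2\, c\ell(\hat{\theta}_{CL}) + d^{*} \log n,
\qquad
d^{*} = \operatorname{tr}\!\left\{ H(\hat{\theta}_{CL})^{-1} J(\hat{\theta}_{CL}) \right\},
\]

where \(H(\theta) = -\mathbb{E}[\nabla^{2} c\ell(\theta)]\) and \(J(\theta) = \operatorname{Var}[\nabla c\ell(\theta)]\). When the composite likelihood coincides with the full likelihood, \(H = J\), so \(d^{*}\) reduces to the ordinary parameter count and the criterion reduces to the standard BIC.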
Bayesian Model Selection in High-Dimensional Settings.
Standard assumptions incorporated into Bayesian model selection procedures result in procedures that are not competitive with commonly used penalized likelihood methods. We propose modifications of these methods by imposing nonlocal prior densities on model parameters. We show that the resulting model selection procedures are consistent in linear model settings when the number of possible covar...
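For context, a commonly cited example of a nonlocal prior (illustrative here, not quoted from the abstract) is the first-order product moment (pMOM) density for a scalar coefficient θ, which, unlike a local Gaussian prior, vanishes at the null value θ = 0:

\[
\pi(\theta \mid \tau, \sigma^{2}) = \frac{\theta^{2}}{\tau \sigma^{2}} \cdot \frac{1}{\sqrt{2\pi \tau \sigma^{2}}} \exp\!\left( -\frac{\theta^{2}}{2 \tau \sigma^{2}} \right).
\]

Assigning vanishing density near θ = 0 is what lets such priors discriminate more sharply against models that include negligible coefficients.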
High-Dimensional Ising Model Selection Using ℓ1-Regularized Logistic Regression
We consider the problem of estimating the graph associated with a binary Ising Markov random field. We describe a method based on ℓ1-regularized logistic regression, in which the neighborhood of any given node is estimated by performing logistic regression subject to an ℓ1-constraint. The method is analyzed under high-dimensional scaling in which both the number of nodes p and maximum neighborhoo...
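A minimal sketch of this neighborhood-selection idea, assuming scikit-learn and a binary data matrix X with entries in {-1, +1}; the function name, the AND symmetrization rule, and the regularization value are illustrative choices, not the paper's:

import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_ising_graph(X, C=1.0, tol=1e-6):
    """Estimate the edge set of a binary Ising model by nodewise
    l1-regularized logistic regression (neighborhood selection).

    X : (n, p) array with entries in {-1, +1}.
    C : inverse regularization strength passed to scikit-learn.
    Returns a symmetric boolean (p, p) adjacency matrix.
    """
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for s in range(p):
        y = X[:, s]                       # node s is the response
        Z = np.delete(X, s, axis=1)       # all other nodes are predictors
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
        clf.fit(Z, y)
        coef = clf.coef_.ravel()
        nbrs = np.delete(np.arange(p), s)[np.abs(coef) > tol]
        adj[s, nbrs] = True
    # symmetrize with the AND rule: keep an edge only if both endpoints select it
    return adj & adj.T

# Illustrative usage on random binary data:
# rng = np.random.default_rng(0)
# X = rng.choice([-1, 1], size=(200, 10))
# A = estimate_ising_graph(X, C=0.1)

The AND rule used above is one of the two standard ways to reconcile the two directed neighborhood estimates for an edge; an OR rule (keep the edge if either endpoint selects it) is the usual alternative.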
Deviance Information Criteria for Model Selection in Approximate Bayesian Computation
Approximate Bayesian computation (ABC) is a class of algorithmic methods in Bayesian inference using statistical summaries and computer simulations. ABC has become popular in evolutionary genetics and in other branches of biology. However, model selection under ABC algorithms has been a subject of intense debate in recent years. Here we propose novel approaches to model selection based o...
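For reference, the deviance information criterion itself is standardly defined as below; how it is adapted to the simulation-based ABC setting is specific to that paper and not reproduced here:

\[
D(\theta) = -2 \log p(y \mid \theta),
\qquad
p_{D} = \overline{D(\theta)} - D(\bar{\theta}),
\qquad
\mathrm{DIC} = D(\bar{\theta}) + 2\, p_{D},
\]

where \(\bar{\theta}\) is the posterior mean and \(\overline{D(\theta)}\) is the posterior expectation of the deviance.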
Bayesian feature selection for high-dimensional linear regression via the Ising approximation with applications to genomics
MOTIVATION: Feature selection, identifying a subset of variables that are relevant for predicting a response, is an important and challenging component of many methods in statistics and machine learning. Feature selection is especially difficult and computationally intensive when the number of variables approaches or exceeds the number of samples, as is often the case for many genomic datasets. ...
Journal
Journal title: Electronic Journal of Statistics
Year: 2015
ISSN: 1935-7524
DOI: 10.1214/15-ejs1012