Consistent Bayesian information criterion based on a mixture prior for possibly high‐dimensional multivariate linear regression models
Authors
Abstract
For the problem of selecting variables in a multivariate linear regression model, we derive new Bayesian information criteria based on a prior that mixes a smooth distribution and a delta distribution. Each of them can be interpreted as a fusion of the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Inheriting their asymptotic properties, our criteria achieve consistent variable selection in both large-sample and high-dimensional frameworks. In numerical simulations, our methods choose the true set of variables with high probability in most cases.
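To make the AIC/BIC fusion concrete, the sketch below computes the classical Gaussian AIC and BIC for a multivariate linear regression fitted by least squares. This is the standard textbook form of the two criteria that the paper's new criteria interpolate between, not the authors' proposed criteria themselves; the function name and the penalty-counting convention (coefficients plus free covariance entries) are illustrative assumptions.

```python
import numpy as np

def aic_bic(Y, X):
    """Gaussian AIC and BIC for the multivariate linear model Y = X B + E.

    Y : (n, p) response matrix; X : (n, k) design matrix.
    Uses the profile log-likelihood with the ML error covariance
    Sigma_hat = E'E / n; additive constants in the likelihood are dropped.
    """
    n, p = Y.shape
    k = X.shape[1]
    B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least-squares coefficients
    resid = Y - X @ B_hat
    sigma_hat = resid.T @ resid / n                 # ML estimate of error covariance
    _, logdet = np.linalg.slogdet(sigma_hat)
    n_params = k * p + p * (p + 1) / 2              # coefficients + covariance entries
    neg2loglik = n * logdet                         # -2 log-likelihood, up to a constant
    aic = neg2loglik + 2 * n_params
    bic = neg2loglik + np.log(n) * n_params
    return aic, bic
```

Variable selection then amounts to evaluating the criterion over candidate column subsets of X and keeping the subset with the smallest value; BIC's log(n) penalty is what drives the selection consistency discussed in the abstract.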
Similar articles
General Hyperplane Prior Distributions Based on Geometric Invariances for Bayesian Multivariate Linear Regression
Based on geometric invariance properties, we derive an explicit prior distribution for the parameters of multivariate linear regression problems in the absence of further prior information. The problem is formulated as a rotationally-invariant distribution of L-dimensional hyperplanes in N dimensions, and the associated system of partial differential equations is solved. The derived prior distri...
Prior Information Based Bayesian Infinite Mixture Model
Unsupervised learning methods have been tremendously successful in extracting knowledge from genomics data generated by high throughput experimental assays. However, analysis of each dataset in isolation without incorporating potentially informative prior knowledge is limiting the utility of such procedures. Here we present a novel probabilistic model and computational algorithm for semi-superv...
Multivariate linear regression with non-normal errors: a solution based on mixture models
In some situations, the distribution of the error terms of a multivariate linear regression model may depart from normality. This problem has been addressed, for example, by specifying a different parametric distribution family for the error terms, such as multivariate skewed and/or heavy-tailed distributions. A new solution is proposed, which is obtained by modelling the error term distributio...
Extending the Akaike Information Criterion to Mixture Regression Models
We examine the problem of jointly selecting the number of components and variables in finite mixture regression models. We find that the Akaike information criterion is unsatisfactory for this purpose because it overestimates the number of components, which in turn results in incorrect variables being retained in the model. Therefore, we derive a new information criterion, the mixture regressio...
A Bayesian information criterion for singular models
We consider approximate Bayesian model choice for model selection problems that involve models whose Fisher-information matrices may fail to be invertible along other competing submodels. Such singular models do not obey the regularity conditions underlying the derivation of Schwarz’s Bayesian information criterion (BIC) and the penalty structure in BIC generally does not reflect the frequentis...
Journal
Journal title: Scandinavian Journal of Statistics
Year: 2022
ISSN: 0303-6898, 1467-9469
DOI: https://doi.org/10.1111/sjos.12617