Block HSIC Lasso: model-free biomarker detection for ultra-high dimensional data
Authors
Abstract
Similar resources
High-dimensional data and the Lasso
How would you try to solve a linear system of equations with more unknowns than equations? Of course, there are infinitely many solutions, and yet this is the sort of problem statisticians face with many modern datasets, arising in genetics, imaging, finance and many other fields. What's worse, our equations are often corrupted by noisy measurements! In this article we will introduce a stat...
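A minimal sketch of the idea in the snippet above, assuming scikit-learn is available: the ℓ1 penalty of the Lasso picks a sparse solution out of the infinitely many that fit a noisy, underdetermined linear system. The synthetic data, sparsity pattern, and penalty level alpha=0.1 are illustrative choices, not values from the article.

```python
# Illustrative sketch only: the Lasso on a system with more unknowns than equations.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 50, 200                                    # 50 noisy "equations", 200 unknowns
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]                  # only 3 unknowns are truly nonzero
y = X @ beta_true + 0.1 * rng.standard_normal(n)  # noisy measurements

# The l1 penalty selects a sparse solution among the infinitely many fits.
beta_hat = Lasso(alpha=0.1).fit(X, y).coef_
print("estimated nonzero coefficients:", np.flatnonzero(beta_hat))
```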
Non-asymptotic Oracle Inequalities for the Lasso and Group Lasso in high dimensional logistic model
We consider the problem of estimating a function f0 in a logistic regression model. We propose to estimate this function f0 by a sparse approximation built as a linear combination of elements of a given dictionary of p functions. This sparse approximation is selected by the Lasso or Group Lasso procedure. In this context, we state non-asymptotic oracle inequalities for the Lasso and Group Lasso under...
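As a rough illustration of the Group Lasso selection step described above, here is a sketch (not the authors' code) of group-Lasso-penalized logistic regression fitted by proximal gradient descent; the group structure, penalty level lam, step-size rule, and toy data are all assumptions made for this example.

```python
# Illustrative sketch only: group-Lasso-penalized logistic regression
# fitted by proximal gradient descent.
import numpy as np

def fit_group_lasso_logistic(X, y, groups, lam=0.1, n_iter=500):
    """y takes values in {-1, +1}; `groups` is a list of index arrays, one per group."""
    n, p = X.shape
    beta = np.zeros(p)
    # Step size 1/L, where L = ||X||_2^2 / (4n) is a Lipschitz constant
    # of the gradient of the averaged logistic loss.
    step = 4 * n / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        margins = y * (X @ beta)
        grad = -(X.T @ (y / (1 + np.exp(margins)))) / n   # logistic-loss gradient
        z = beta - step * grad                            # gradient step
        for g in groups:                                  # prox step: group soft-thresholding
            norm_g = np.linalg.norm(z[g])
            shrink = max(0.0, 1 - step * lam * np.sqrt(len(g)) / max(norm_g, 1e-12))
            beta[g] = shrink * z[g]
    return beta

# Toy data: 3 groups of 5 features; only the first group is informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 15))
y = np.sign(X[:, :5] @ np.ones(5) + 0.5 * rng.standard_normal(200))
groups = [np.arange(0, 5), np.arange(5, 10), np.arange(10, 15)]
print(np.round(fit_group_lasso_logistic(X, y, groups), 2))  # groups 2-3 should shrink to zero
```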
Methods for regression analysis in high-dimensional data
With the evolution of science, knowledge and technology, new and precise methods for measuring, collecting and recording information have been developed, which have resulted in the emergence and growth of high-dimensional data. A high-dimensional data set, i.e., a data set in which the number of explanatory variables is much larger than the number of observations, cannot be easily analyzed by ...
Thresholded Lasso for High Dimensional Variable Selection
Given n noisy samples with p dimensions, where n ≪ p, we show that the multi-step thresholding procedure based on the Lasso, which we call the Thresholded Lasso, can accurately estimate a sparse vector β ∈ R^p in a linear model Y = Xβ + ε, where the n×p design matrix X is normalized to have column ℓ2-norm √n, and ε ∼ N(0, σ²I_n). We show that under the restricted eigenvalue (RE) condition (Bickel-Ritov-T...
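To make the multi-step thresholding idea concrete, here is an illustrative two-step sketch (initial Lasso fit, hard-thresholding of small coefficients, then an ordinary least-squares refit on the retained support); the penalty alpha, threshold tau, and synthetic data are assumed values, not the paper's settings.

```python
# Illustrative sketch only: a two-step "thresholded Lasso" on Y = X @ beta + eps with n << p.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p, s = 100, 500, 5                          # n samples, p features, s true nonzeros
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0
y = X @ beta + 0.5 * rng.standard_normal(n)

# Step 1: initial Lasso estimate.
beta_lasso = Lasso(alpha=0.1).fit(X, y).coef_

# Step 2: threshold small coefficients, then refit OLS on the kept support.
tau = 0.1
support = np.flatnonzero(np.abs(beta_lasso) > tau)
beta_hat = np.zeros(p)
beta_hat[support] = LinearRegression().fit(X[:, support], y).coef_

print("selected features:", support)           # ideally indices 0..4
```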
Localized Lasso for High-Dimensional Regression
We introduce the localized Lasso, which is suited for learning models that are both interpretable and have high predictive power in problems with high dimensionality d and small sample size n. More specifically, we consider a function defined by local sparse models, one at each data point. We introduce sample-wise network regularization to borrow strength across the models, and sample-wise ex...
Journal
Journal title: Bioinformatics
Year: 2019
ISSN: 1367-4803, 1460-2059
DOI: 10.1093/bioinformatics/btz333