Search results for: lasso method

Number of results: 374,083

2017
Karim Lounici Alexandre B. Tsybakov Massimiliano Pontil Sara van de Geer

We consider the problem of estimating a sparse linear regression vector β∗ under a Gaussian noise model, for the purpose of both prediction and model selection. We assume that prior knowledge is available on the sparsity pattern, namely the set of variables is partitioned into prescribed groups, only a few of which are relevant in the estimation process. This group sparsity assumption suggests us...
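
For intuition, here is a minimal proximal-gradient sketch of a group-sparsity penalty in Python. It is a generic illustration of the group lasso idea, not the authors' estimator; the partition `groups`, the penalty level `lam`, and the iteration budget are all assumptions.

    import numpy as np

    def group_soft_threshold(v, t):
        # Block soft-thresholding: shrink the whole group toward zero,
        # setting it exactly to zero when its norm falls below t.
        norm = np.linalg.norm(v)
        return np.zeros_like(v) if norm <= t else (1.0 - t / norm) * v

    def group_lasso(X, y, groups, lam, n_iter=500):
        # Proximal gradient for (1/2)||y - X b||^2 + lam * sum_g ||b_g||_2,
        # where `groups` is a list of index arrays partitioning the columns.
        beta = np.zeros(X.shape[1])
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = squared spectral norm
        for _ in range(n_iter):
            z = beta - step * (X.T @ (X @ beta - y))  # gradient step
            for g in groups:
                beta[g] = group_soft_threshold(z[g], step * lam)
        return beta

Because the penalty is applied blockwise, an entire group is either kept or zeroed out, which is exactly the prior knowledge the abstract describes.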

Journal: Annals of Statistics 2014
Richard Lockhart Jonathan Taylor Ryan J Tibshirani Robert Tibshirani

In the sparse linear regression setting, we consider testing the significance of the predictor variable that enters the current lasso model, in the sequence of models visited along the lasso solution path. We propose a simple test statistic based on lasso fitted values, called the covariance test statistic, and show that when the true model is linear, this statistic has an Exp(1) asymptotic distribution...
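
A toy simulation of the first-step covariance test under the global null, using scikit-learn's lars_path to obtain the lasso knots; the known noise variance sigma2 and the standardized design are assumptions of this sketch, and the statistic should be approximately Exp(1).

    import numpy as np
    from sklearn.linear_model import lars_path

    rng = np.random.default_rng(0)
    n, p, sigma2 = 100, 10, 1.0
    X = rng.standard_normal((n, p))
    X /= np.linalg.norm(X, axis=0)   # standardized predictors
    y = rng.standard_normal(n)       # global null: no true signal

    # Knots of the lasso path and the coefficients at each knot.
    alphas, active, coefs = lars_path(X, y, method="lasso")

    # Covariance test for the first variable to enter the path:
    # T1 = <y, X beta_hat(lambda_2)> / sigma2.
    T1 = y @ (X @ coefs[:, 1]) / sigma2
    print(T1)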

Journal: CoRR 2015
Xin Jiang Patricia Reynaud-Bouret Vincent Rivoirard Laure Sansonnet Rebecca Willett

Sparse linear inverse problems appear in a variety of settings, but often the noise contaminating observations cannot accurately be described as bounded by or arising from a Gaussian distribution. Poisson observations in particular are a characteristic feature of several real-world applications. Previous work on sparse Poisson inverse problems encountered several limiting technical hurdles. This...
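
The paper's analysis is more delicate, but a generic l1-penalized Poisson log-linear model conveys the flavor of the estimation problem. This sketch uses plain proximal gradient with an assumed fixed step size; a line search would be safer in practice.

    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def sparse_poisson(X, y, lam, step=1e-3, n_iter=5000):
        # Proximal gradient for the l1-penalized Poisson log-linear model:
        #   minimize  sum_i [exp(x_i @ b) - y_i * (x_i @ b)] + lam * ||b||_1
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            mu = np.exp(X @ beta)    # Poisson means under the log link
            grad = X.T @ (mu - y)    # gradient of the negative log-likelihood
            beta = soft_threshold(beta - step * grad, step * lam)
        return beta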

2007
Jerome Friedman Trevor Hastie Holger Höfling Robert Tibshirani

We consider “one-at-a-time” coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the L1-penalized regression (lasso) in the literature, but it seems to have been largely ignored. Indeed, it seems that coordinate-wise algorithms are not often used in convex optimization. We show that this algorithm is very competitive with...
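
A minimal sketch of the coordinate-wise update the abstract refers to: each coefficient is minimized exactly via soft-thresholding while the others are held fixed, with a running residual to keep each update cheap. Function names and the iteration count are illustrative.

    import numpy as np

    def soft_threshold(z, t):
        # S(z, t) = sign(z) * max(|z| - t, 0)
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def lasso_cd(X, y, lam, n_iter=200):
        # Cyclic coordinate descent for (1/2)||y - X b||^2 + lam * ||b||_1.
        n, p = X.shape
        beta = np.zeros(p)
        r = y.copy()                   # residual y - X @ beta
        col_sq = (X ** 2).sum(axis=0)  # ||x_j||^2 for each column
        for _ in range(n_iter):
            for j in range(p):
                r += X[:, j] * beta[j]  # remove j's current contribution
                beta[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
                r -= X[:, j] * beta[j]  # add the updated contribution back
        return beta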

Journal: CoRR 2014
Seunghak Lee Eric P. Xing

Recently, screening rules have been developed to solve large-scale lasso and group lasso problems; their goal is to reduce the problem size by efficiently discarding coefficients that are guaranteed to be zero, using simple rules that test each variable or group independently of the others. However, screening for the overlapping group lasso remains an open challenge because the overlaps between groups make it infeasible to test each group independently...
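
Screening in its simplest, non-overlapping form can be sketched with the basic strong rule of Tibshirani et al. (2012); this is a generic illustration of the screening idea, not the overlapping-group method proposed in the paper.

    import numpy as np

    def strong_rule_keep(X, y, lam):
        # Basic strong rule: keep predictor j only if |x_j' y| >= 2*lam - lam_max,
        # where lam_max = max_j |x_j' y| is the smallest lam giving an all-zero fit.
        # The rule is heuristic, not safe: the fit on the kept set should be
        # checked against the KKT conditions and any violators added back.
        c = np.abs(X.T @ y)
        lam_max = c.max()
        return np.where(c >= 2 * lam - lam_max)[0]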

2005
Trevor Park George Casella

The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the priors on the regression parameters are independent double-exponential (Laplace) distributions. This posterior can also be accessed through a Gibbs sampler using conjugate normal priors for the regression parameters, with independent exponential hyperpriors on their variances. T...
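
A compact sketch of the Gibbs sampler described above, using the standard scale-mixture full conditionals; the fixed penalty lam and the hyperparameter choices are illustrative assumptions.

    import numpy as np

    def bayesian_lasso_gibbs(X, y, lam=1.0, n_samples=2000, seed=0):
        # Gibbs sampler for the Bayesian lasso: Laplace priors on beta written
        # as scale mixtures of normals with exponential hyperpriors on tau^2.
        rng = np.random.default_rng(seed)
        n, p = X.shape
        XtX, Xty = X.T @ X, X.T @ y
        beta, sigma2, inv_tau2 = np.zeros(p), 1.0, np.ones(p)
        draws = np.empty((n_samples, p))
        for s in range(n_samples):
            # beta | rest ~ N(A^{-1} X'y, sigma2 A^{-1}),  A = X'X + D_tau^{-1}
            A_inv = np.linalg.inv(XtX + np.diag(inv_tau2))
            beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
            # sigma2 | rest ~ Inverse-Gamma((n - 1 + p)/2, scale)
            resid = y - X @ beta
            scale = (resid @ resid + beta @ (inv_tau2 * beta)) / 2.0
            sigma2 = scale / rng.gamma((n - 1 + p) / 2.0)
            # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
            inv_tau2 = rng.wald(np.sqrt(lam ** 2 * sigma2 / beta ** 2), lam ** 2)
            draws[s] = beta
        return draws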

Journal: CoRR 2016
Yun Wang Peter J. Ramadge

Recently, dictionary screening has been proposed as an effective way to improve the computational efficiency of solving the lasso problem, which is one of the most commonly used methods for learning sparse representations. To address today's ever-increasing dataset sizes, effective screening relies on a tight region bound on the solution to the dual lasso. Typical region bounds are in the form of...
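
The role of a region bound can be sketched as a simple sphere test. Here the ball (center, radius) containing the dual optimum is assumed given rather than computed as in the paper, and the dual constraints are assumed scaled to |x_j' theta| <= 1.

    import numpy as np

    def sphere_test_discard(X, center, radius):
        # If the dual optimum theta* lies in the ball B(center, radius), then
        # max over the ball of |x_j' theta| = |x_j' center| + radius * ||x_j||.
        # When this stays strictly below 1, the j-th dual constraint is inactive
        # and the corresponding primal coefficient must be zero, so column x_j
        # can be safely discarded before solving the lasso.
        scores = np.abs(X.T @ center) + radius * np.linalg.norm(X, axis=0)
        return np.where(scores < 1.0)[0]

The tighter the region bound, the smaller the radius and the more columns the test removes, which is why the abstract emphasizes tightness.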

2017
Yasuhiro Fujiwara Naoki Marumo Mathieu Blondel Koh Takeuchi Hideaki Kim Tomoharu Iwata Naonori Ueda

The graphical lasso is the most popular approach to estimating the inverse covariance matrix of high-dimensional data. It iteratively estimates each row and column of the matrix in round-robin style until convergence. However, the graphical lasso becomes computationally infeasible for large datasets due to its high computational cost. This paper proposes Sting, a fast approach to the graphical lasso. In order ...
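
A minimal usage sketch of the standard graphical lasso with scikit-learn (not the Sting method proposed here); the random data and the penalty alpha=0.1 are illustrative.

    import numpy as np
    from sklearn.covariance import GraphicalLasso

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 20))   # n samples, p variables

    # l1-penalized maximum likelihood estimate of the precision matrix.
    model = GraphicalLasso(alpha=0.1).fit(X)
    precision = model.precision_         # sparse inverse covariance estimate
    print(np.count_nonzero(np.abs(precision) > 1e-8))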

Journal: Applied and Computational Harmonic Analysis 2023

This note extends an attribute of the LASSO procedure to a whole class of related procedures, including the square-root LASSO, the square LAD-LASSO, and an instance of the generalized LASSO. Namely, under the assumption that the input matrix satisfies an ℓp-restricted isometry property (which in some sense is weaker than the standard ℓ2 assumption), it is shown that if the vector comes from the exact measurement of a sparse vector, then the minimizer is an...
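
For concreteness, a sketch of the square-root LASSO from the class above, solved with cvxpy; the data, the penalty level lam, and its sqrt(log p / n) scaling are illustrative assumptions, not taken from the note.

    import cvxpy as cp
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 50, 100
    beta_true = np.zeros(p)
    beta_true[:5] = 1.0
    X = rng.standard_normal((n, p))
    y = X @ beta_true + 0.1 * rng.standard_normal(n)

    # Square-root LASSO: the data-fit term is the *un-squared* l2 norm,
    # so a good penalty level does not depend on the noise variance.
    beta = cp.Variable(p)
    lam = 1.5 * np.sqrt(np.log(p) / n)   # illustrative scaling
    cp.Problem(cp.Minimize(cp.norm(y - X @ beta, 2)
                           + lam * cp.norm1(beta))).solve()
    print(int(np.sum(np.abs(beta.value) > 1e-6)))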

2008
Alessandro Rinaldo

We consider estimating an unknown signal, which is both blocky and sparse, corrupted by additive noise. We study three interrelated least squares procedures and their asymptotic properties. The first procedure is the fused lasso, put forward by Friedman et al. (2007), which we modify into a different estimator, called the fused adaptive lasso, with better properties. The other two estimators we...
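
A minimal cvxpy sketch of the fused lasso on a blocky, sparse signal; the penalty levels lam1, lam2 and the toy signal are assumptions.

    import cvxpy as cp
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100
    signal = np.zeros(n)
    signal[40:60] = 2.0                   # blocky, sparse true signal
    y = signal + 0.3 * rng.standard_normal(n)

    # Fused lasso signal approximation: l1 penalties on the coefficients
    # (sparsity) and on their successive differences (blockiness).
    beta = cp.Variable(n)
    lam1, lam2 = 0.5, 2.0
    objective = cp.Minimize(0.5 * cp.sum_squares(y - beta)
                            + lam1 * cp.norm1(beta)
                            + lam2 * cp.norm1(beta[1:] - beta[:-1]))
    cp.Problem(objective).solve()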
