Search results for: lasso
Number of results: 4548
Recently, to solve large-scale lasso and group lasso problems, screening rules have been developed; their goal is to reduce the problem size by efficiently discarding coefficients that are zero at the solution, using simple tests applied to each feature independently of the others. However, screening for the overlapping group lasso remains an open challenge, because the overlaps between groups make it infeasible to test each group independen...
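As a sketch of the basic, non-overlapping idea these rules build on, the classic SAFE test for the plain lasso discards a feature whenever its correlation with the response falls below a bound derived from the dual problem. The data here are synthetic (not from the paper), and the sklearn `alpha` scaling is an assumption about its objective, `(1/(2n))||y - Xb||² + alpha||b||₁`:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 50, 200
X = rng.standard_normal((n, p))
y = X[:, :5] @ rng.standard_normal(5) + 0.1 * rng.standard_normal(n)

# Lasso objective here: (1/2)||y - X b||^2 + lam * ||b||_1
corr = np.abs(X.T @ y)
lam_max = corr.max()            # smallest lam whose solution is all-zero
lam = 0.9 * lam_max

# Basic SAFE test (El Ghaoui et al.): discard feature j whenever
#   |x_j' y| < lam - ||x_j|| * ||y|| * (lam_max - lam) / lam_max
col_norms = np.linalg.norm(X, axis=0)
thresh = lam - col_norms * np.linalg.norm(y) * (lam_max - lam) / lam_max
discard = corr < thresh

# Verify against the full solve; sklearn's alpha corresponds to lam / n.
fit = Lasso(alpha=lam / n, fit_intercept=False, max_iter=100000).fit(X, y)
```

Every feature flagged by `discard` is guaranteed to have a zero coefficient at the optimum, which is what makes the rule "safe"; the overlapping-group case studied in the abstract is exactly where such per-feature tests break down.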
The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the priors on the regression parameters are independent double-exponential (Laplace) distributions. This posterior can also be accessed through a Gibbs sampler using conjugate normal priors for the regression parameters, with independent exponential hyperpriors on their variances. T...
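The Gibbs sampler described above (Park and Casella's Bayesian lasso) cycles through the conditional distributions of the coefficients, the latent variances, and the noise variance. A minimal sketch on synthetic data, with the penalty parameter `lam` held fixed rather than given the hyperprior the full method uses:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 40, 8
X = rng.standard_normal((n, p))
true = np.zeros(p); true[:3] = [2.0, -1.5, 1.0]
y = X @ true + 0.5 * rng.standard_normal(n)

lam = 1.0
sigma2, tau2 = 1.0, np.ones(p)
XtX, Xty = X.T @ X, X.T @ y
draws = []

for it in range(1500):
    # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),  A = X'X + diag(1/tau2)
    A = XtX + np.diag(1.0 / tau2)
    L = np.linalg.cholesky(A)
    mu = np.linalg.solve(A, Xty)
    beta = mu + np.sqrt(sigma2) * np.linalg.solve(L.T, rng.standard_normal(p))

    # 1/tau_j^2 | rest ~ InverseGaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
    mean = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
    tau2 = 1.0 / rng.wald(mean, lam**2)

    # sigma2 | rest ~ InvGamma((n-1+p)/2, (||y-Xb||^2 + b' D^{-1} b) / 2)
    resid = y - X @ beta
    rate = 0.5 * (resid @ resid + np.sum(beta**2 / tau2))
    sigma2 = 1.0 / rng.gamma((n - 1 + p) / 2.0, 1.0 / rate)

    if it >= 500:                 # discard burn-in
        draws.append(beta)

beta_hat = np.mean(draws, axis=0)  # posterior mean estimate
```

The posterior mean `beta_hat` plays the role of a point estimate; the posterior mode under the Laplace prior is what coincides with the Lasso solution.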
Recently, dictionary screening has been proposed as an effective way to improve the computational efficiency of solving the lasso problem, which is one of the most commonly used methods for learning sparse representations. To handle today's ever-growing datasets, effective screening relies on a tight region bound on the solution of the dual lasso. Typical region bounds are in the form of...
The graphical lasso is the most popular approach to estimating the inverse covariance matrix of high-dimensional data. It iteratively estimates each row and column of the matrix in a round-robin style until convergence. However, the graphical lasso becomes infeasible for large datasets due to its high computational cost. This paper proposes Sting, a fast approach to the graphical lasso. In order ...
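For reference, the standard graphical lasso that Sting accelerates is available in scikit-learn; a minimal sketch on synthetic data drawn from a sparse (tridiagonal) precision matrix, not the paper's method or benchmarks:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 5
# True precision matrix: tridiagonal, so only neighboring variables interact.
prec = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
cov = np.linalg.inv(prec)
X = rng.multivariate_normal(np.zeros(p), cov, size=2000)

# L1-penalized maximum-likelihood estimate of the precision matrix.
model = GraphicalLasso(alpha=0.05).fit(X)
est = model.precision_
```

With enough samples, the estimated precision matrix recovers the banded sparsity pattern: entries far from the diagonal are shrunk toward (or exactly to) zero.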
This note extends an attribute of the LASSO procedure to a whole class of related procedures, including the square-root LASSO, the square-root LAD-LASSO, and instances of the generalized LASSO. Namely, under the assumption that the input matrix satisfies the ℓp-restricted isometry property (which in some sense is weaker than the standard ℓ2 assumption), it is shown that if the input vector comes from the exact measurement of a sparse vector, then the minimizer an...
We consider estimating an unknown signal, which is both blocky and sparse, corrupted by additive noise. We study three interrelated least squares procedures and their asymptotic properties. The first procedure is the fused lasso, put forward by Friedman et al. (2007), which we modify into a different estimator, called the fused adaptive lasso, with better properties. The other two estimators we...
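The fused lasso penalizes both the coefficients and their successive differences, which favors signals that are simultaneously sparse and blocky. With only the fusion (difference) penalty active, the problem can be rewritten as an ordinary lasso on a step basis, since the coefficient of each step column is exactly the jump at that position. A minimal sketch on a synthetic blocky signal, not the estimators from the paper:

```python
import numpy as np
from sklearn.linear_model import Lasso

# A blocky signal observed in noise: sparse in successive differences.
rng = np.random.default_rng(0)
n = 100
signal = np.zeros(n); signal[40:70] = 2.0
y = signal + 0.3 * rng.standard_normal(n)

# Step basis: column j is the indicator 1{i >= j}, so its coefficient is the
# jump at position j; an L1 penalty on jumps is exactly the fusion penalty.
D = np.tril(np.ones((n, n)))[:, 1:]        # drop column 0: the intercept absorbs the level
fit = Lasso(alpha=0.1, fit_intercept=True, max_iter=100000).fit(D, y)
beta_hat = fit.intercept_ + D @ fit.coef_  # reconstructed piecewise-constant fit
```

The fit selects only a handful of jumps, recovering the two change points; the adaptive variants discussed in the abstract reweight these penalties to reduce the shrinkage bias on large jumps.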
We extend the ℓ2-consistency result of (Meinshausen and Yu 2008) from the Lasso to the group Lasso. Our main theorem shows that the group Lasso achieves estimation consistency under a mild condition and an asymptotic upper bound on the number of selected variables can be obtained. As a result, we can apply the nonnegative garrote procedure to the group Lasso result to obtain an estimator which ...
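The group Lasso replaces the ℓ1 penalty with a sum of Euclidean norms over predefined groups, so whole groups enter or leave the model together. Its proximal operator is blockwise soft-thresholding, which makes a simple proximal-gradient (ISTA) solver easy to write. A minimal sketch on synthetic data with hypothetical group structure, not the paper's estimator:

```python
import numpy as np

def group_soft_threshold(z, groups, thresh):
    """Prox of thresh * sum_g ||z_g||_2: shrink each group's norm, zeroing small groups."""
    out = np.zeros_like(z)
    for g in groups:
        norm = np.linalg.norm(z[g])
        if norm > thresh:
            out[g] = (1.0 - thresh / norm) * z[g]
    return out

# Proximal gradient for (1/2)||y - Xb||^2 + lam * sum_g ||b_g||_2
rng = np.random.default_rng(0)
n, p = 100, 12
groups = [list(range(0, 4)), list(range(4, 8)), list(range(8, 12))]
X = rng.standard_normal((n, p))
true = np.zeros(p); true[0:4] = [1.0, -1.0, 0.5, 2.0]   # only group 0 active
y = X @ true + 0.1 * rng.standard_normal(n)

lam = 5.0
step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L with L the Lipschitz constant of the gradient
beta = np.zeros(p)
for _ in range(2000):
    grad = X.T @ (X @ beta - y)
    beta = group_soft_threshold(beta - step * grad, groups, step * lam)
```

The inactive groups are driven exactly to zero as a block, which is the groupwise selection behavior the consistency result above is about.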
The LASSO (Tibshirani, J R Stat Soc Ser B 58(1):267–288, 1996, [30]) and the adaptive LASSO (Zou, J Am Stat Assoc 101:1418–1429, 2006, [37]) are popular in regression analysis for their advantage of simultaneous variable selection and parameter estimation, and also have been applied to autoregressive time series models. We propose the doubly adaptive LASSO (daLASSO), or PLAC-weighted adaptive L...
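The adaptive LASSO that the daLASSO builds on reweights the ℓ1 penalty by a pilot estimate, penalizing likely-zero coefficients more heavily; in practice it reduces to an ordinary lasso after rescaling the columns by the weights. A minimal two-step sketch on synthetic regression data (OLS pilot, γ = 1), not the time-series weighting the abstract proposes:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
true = np.zeros(p); true[:3] = [2.0, -1.5, 1.0]
y = X @ true + 0.5 * rng.standard_normal(n)

# Step 1: pilot estimate; adaptive weights w_j = 1 / |b_j|^gamma.
pilot = LinearRegression(fit_intercept=False).fit(X, y).coef_
gamma = 1.0
w = 1.0 / np.abs(pilot) ** gamma

# Step 2: weighted lasso via column rescaling X_j -> X_j / w_j,
# then map the solution back to the original scale.
Xs = X / w
fit = Lasso(alpha=0.05, fit_intercept=False).fit(Xs, y)
beta = fit.coef_ / w
```

Because noise features get large weights (their pilot estimates are near zero), they are penalized almost to exclusion, which is the mechanism behind the adaptive LASSO's oracle-type selection property.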
This paper presents an upper bound for the estimation error of the constrained lasso, under the high-dimensional (n < p) setting. In contrast to existing results, the error bound in this paper is sharp, is valid when the parameter to be estimated is not exactly sparse (e.g., when the parameter is weakly sparse), and shows explicitly the effect of over-estimating the ℓ1-norm of the parameter to ...
The literature is replete with variable selection techniques for the classical linear regression model. It is only relatively recently that authors have begun to explore variable selection in fully nonparametric and additive regression models. One such variable selection technique is a generalization of the LASSO called the group LASSO. In this work, we demonstrate a connection between the grou...