Search results for: lasso

Number of results: 4548

2016
Brian R. Gaines, Hua Zhou

We compare alternative computing strategies for solving the constrained lasso problem. As its name suggests, the constrained lasso extends the widely-used lasso to handle linear constraints, which allow the user to incorporate prior information into the model. In addition to quadratic programming, we employ the alternating direction method of multipliers (ADMM) and also derive an efficient solu...
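The constrained-lasso objective can be written down directly in a general-purpose convex solver. Below is a minimal sketch, assuming cvxpy is installed; the toy data, the sum-to-zero constraint matrix A, and the penalty level lam are illustrative choices, not the authors' QP or ADMM implementations.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.standard_normal((n, p))
y = X @ np.array([1.0, -1.0] + [0.0] * (p - 2)) + 0.1 * rng.standard_normal(n)

lam = 0.5                       # penalty level (illustrative)
A = np.ones((1, p))             # example linear constraint: coefficients sum to zero

beta = cp.Variable(p)
objective = cp.Minimize(0.5 * cp.sum_squares(y - X @ beta) + lam * cp.norm1(beta))
constraints = [A @ beta == 0]   # inequality constraints can be added the same way
problem = cp.Problem(objective, constraints)
problem.solve()                 # cvxpy dispatches to a QP/conic solver internally

print(np.round(beta.value, 3))
```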

2013
Woncheol Jang, Johan Lim, Ji Meng Loh, Nicole Lazar

Identifying homogeneous subgroups of variables can be challenging in high-dimensional data analysis with highly correlated predictors. The generalized fused lasso has been proposed to simultaneously select correlated variables and identify them as predictive clusters. In this article, we study several properties of the generalized fused lasso. First, we present a geometric interpretation of the gen...
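On a chain graph, the fused part of the penalty acts on successive coefficient differences; the generalized version takes differences over the edges of an arbitrary graph. A minimal cvxpy sketch under the chain-graph assumption (the data, difference matrix D, and penalties lam1, lam2 are illustrative):

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, p = 60, 8
X = rng.standard_normal((n, p))
beta_true = np.array([0, 0, 2, 2, 2, 0, -1, -1], dtype=float)  # piecewise-constant truth
y = X @ beta_true + 0.2 * rng.standard_normal(n)

# First-difference matrix D so that (D @ beta)[j] = beta[j+1] - beta[j]
D = np.diff(np.eye(p), axis=0)

lam1, lam2 = 0.1, 0.5           # sparsity and fusion penalties (illustrative)
beta = cp.Variable(p)
obj = cp.Minimize(0.5 * cp.sum_squares(y - X @ beta)
                  + lam1 * cp.norm1(beta)
                  + lam2 * cp.norm1(D @ beta))
cp.Problem(obj).solve()
print(np.round(beta.value, 2))  # fused estimates form near-constant blocks
```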

2010
Jian Kang, Jian Guo

In this paper, we propose a self-adaptive lasso method for variable selection in regression problems. Unlike the popular lasso method, the proposed method introduces a specific tuning parameter for each regression coefficient. We model the self-adaptive lasso in a Bayesian framework and develop an efficient Gibbs sampling algorithm to automatically select these tuning parameters and estimate t...
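The core idea, one tuning parameter per coefficient, can be illustrated with a plain weighted lasso. Note this deterministic sketch fixes the per-coefficient penalties lam by hand, whereas the paper infers them inside a Bayesian model via Gibbs sampling; the data and weights below are illustrative.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(2)
n, p = 80, 6
X = rng.standard_normal((n, p))
y = X @ np.array([3.0, 0, 0, 1.5, 0, 0]) + 0.3 * rng.standard_normal(n)

# One tuning parameter per coefficient (fixed here for illustration; the
# self-adaptive lasso would infer them within its Bayesian model).
lam = np.array([0.1, 1.0, 1.0, 0.1, 1.0, 1.0])

beta = cp.Variable(p)
obj = cp.Minimize(0.5 * cp.sum_squares(y - X @ beta)
                  + cp.sum(cp.multiply(lam, cp.abs(beta))))
cp.Problem(obj).solve()
print(np.round(beta.value, 3))
```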

2011
Sara van de Geer, Peter Bühlmann, Shuheng Zhou

We revisit the adaptive Lasso as well as the thresholded Lasso with refitting, in a high-dimensional linear model, and study prediction error, ℓq-error (q ∈ {1, 2}), and the number of false positive selections. Our theoretical results for the two methods are, at a rather fine scale, comparable. The differences only show up in terms of the (minimal) restricted and sparse eigenvalues, favor...
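Of the two procedures, the thresholded Lasso with refitting has a simple operational form: fit the Lasso, drop coefficients below a threshold, then refit least squares on the surviving variables (the adaptive Lasso instead reweights the penalty using an initial estimate). A minimal scikit-learn sketch; the penalty alpha, the threshold, and the data are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(3)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Stage 1: initial Lasso fit
lasso = Lasso(alpha=0.1).fit(X, y)

# Stage 2: threshold small coefficients, then refit OLS on the kept support
support = np.abs(lasso.coef_) > 0.05
refit = LinearRegression().fit(X[:, support], y)

print("selected:", np.flatnonzero(support))
print("refitted coefficients:", np.round(refit.coef_, 2))
```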

2018
Alexander Jung, Nguyen Tran, Alexandru Mara

The “least absolute shrinkage and selection operator” (Lasso) method has recently been adapted for network-structured datasets. In particular, this network Lasso method makes it possible to learn graph signals from a small number of noisy signal samples by using the total variation of a graph signal for regularization. While efficient and scalable implementations of the network Lasso are available, only l...
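The network Lasso balances fidelity at the sampled nodes against the total variation of the signal over the graph's edges. A minimal cvxpy sketch on a tiny hand-built chain graph; the graph, the sampled nodes, and the penalty lam are illustrative, and this is not one of the scalable implementations the abstract refers to.

```python
import numpy as np
import cvxpy as cp

# A small chain graph 0-1-2-3-4 with one scalar signal value per node.
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
n_nodes = 5

# Noisy samples observed only at a few nodes.
sampled = {0: 1.0, 1: 1.1, 4: -2.0}

lam = 0.5                          # total-variation penalty (illustrative)
x = cp.Variable(n_nodes)

fidelity = cp.sum_squares(cp.hstack([x[i] - yi for i, yi in sampled.items()]))
tv = cp.sum(cp.hstack([cp.abs(x[i] - x[j]) for i, j in edges]))

cp.Problem(cp.Minimize(fidelity + lam * tv)).solve()
print(np.round(x.value, 2))        # unsampled nodes inherit values from neighbors
```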

2017
Daniel F. Schmidt, Enes Makalic

The lasso, introduced by Robert Tibshirani in 1996, has become one of the most popular techniques for estimating Gaussian linear regression models. An important reason for this popularity is that the lasso can simultaneously estimate all regression parameters as well as select important variables, yielding accurate regression models that are highly interpretable. This paper derives an efficient...
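The shrink-and-select behavior described here is easy to reproduce with any off-the-shelf Lasso solver. A minimal scikit-learn sketch (toy data and penalty level are illustrative; this is unrelated to the method the paper itself derives):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n, p = 200, 10
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + 0.5 * rng.standard_normal(n)

model = Lasso(alpha=0.1).fit(X, y)
print(np.round(model.coef_, 2))   # most coefficients are estimated as exactly zero
```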

2010
Jerome Friedman, Trevor Hastie, Robert Tibshirani

We propose several methods for estimating edge-sparse and node-sparse graphical models based on lasso and grouped lasso penalties. We develop efficient algorithms for fitting these models when the numbers of nodes and potential edges are large. We compare them to competing methods, including the graphical lasso and SPACE (Peng, Wang, Zhou & Zhu 2008). Surprisingly, we find that for edge selection...
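One of the baselines mentioned, the graphical lasso, estimates a sparse precision (inverse covariance) matrix whose zero off-diagonal entries correspond to absent edges. A minimal sketch with scikit-learn's GraphicalLasso; the regularization level and data are illustrative.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(5)
n, d = 300, 6
# Data with a simple dependence: feature 1 is driven by feature 0.
X = rng.standard_normal((n, d))
X[:, 1] += 0.8 * X[:, 0]

model = GraphicalLasso(alpha=0.2).fit(X)
precision = model.precision_
edges = np.abs(precision) > 1e-6       # nonzero off-diagonal entries = estimated edges
print(edges.astype(int))
```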

Journal: Payavard Salamat 2016
Hossein Dargahi, Jamil Sadeghifar, Shiva Tolouei Rakhshan

Background & Aim: One of the most important and useful models for assessing hospital performance is the Pabon Lasso Model, a graphical model that determines the relative performance of hospitals using three indicators: 1. Bed Occupancy Rate (BOR); 2. Bed Turnover (BTO); 3. Average Length of Stay (ALS). The aim of this research is to investigate the performance of the hospitals affiliated with Te...
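The three Pabon Lasso indicators are simple ratios of routine hospital statistics, and each hospital's zone in the diagram follows from comparing its BOR and BTO to the group averages. A minimal sketch with made-up figures (the numbers and zone labels below are illustrative, not data from the study):

```python
import numpy as np

# Toy annual statistics for three hospitals (illustrative numbers only).
beds         = np.array([100, 150, 80])
patient_days = np.array([29200, 38000, 14600])   # occupied bed-days in the year
discharges   = np.array([5800, 6000, 2100])
days = 365

bor = 100.0 * patient_days / (beds * days)   # Bed Occupancy Rate (%)
bto = discharges / beds                      # Bed Turnover (discharges per bed)
als = patient_days / discharges              # Average Length of Stay (days)

# Pabon Lasso zone: compare each hospital to the group means.
for i in range(len(beds)):
    high_bor = bor[i] >= bor.mean()
    high_bto = bto[i] >= bto.mean()
    if high_bor and high_bto:
        zone = 3        # efficient: high occupancy and high turnover
    elif high_bor:
        zone = 4        # high occupancy, low turnover (long stays)
    elif high_bto:
        zone = 2        # low occupancy, high turnover
    else:
        zone = 1        # low occupancy, low turnover (excess capacity)
    print(f"hospital {i}: BOR={bor[i]:.1f}% BTO={bto[i]:.1f} ALS={als[i]:.1f} -> zone {zone}")
```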

Journal: CoRR 2013
Weiguang Wang, Yingbin Liang, Eric P. Xing

The support union of K p-dimensional regression vectors for K linear regressions (collected as the columns of a matrix B∗) is recovered using ℓ1/ℓ2-regularized Lasso. Sufficient and necessary conditions on sample complexity are characterized as a sharp threshold to guarantee successful recovery of the support union. This model has previously been studied via ℓ1/ℓ∞-regularized Lasso by Negahban & Wain...
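The ℓ1/ℓ2 penalty ties each row of B∗ together across the K regressions, so a variable joins or leaves the support union for all tasks at once. scikit-learn's MultiTaskLasso implements this penalty under the simplifying assumption that the K regressions share one design matrix; the sketch below uses that assumption with illustrative data and penalty level.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(6)
n, p, K = 120, 15, 3
X = rng.standard_normal((n, p))

B_true = np.zeros((p, K))
B_true[[0, 2, 5], :] = rng.standard_normal((3, K))    # shared support across the K tasks
Y = X @ B_true + 0.3 * rng.standard_normal((n, K))

model = MultiTaskLasso(alpha=0.1).fit(X, Y)           # coef_ has shape (K, p)
support_union = np.flatnonzero(np.linalg.norm(model.coef_, axis=0) > 1e-8)
print("recovered support union:", support_union)
```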

2014
Nikhil Rao, Robert Nowak, Christopher Cox, Timothy Rogers

Binary logistic regression with a sparsity constraint on the solution plays a vital role in many high-dimensional machine learning applications. In some cases, the features can be grouped together so that entire subsets of features can be selected or zeroed out. In many applications, however, this can be very restrictive. In this paper, we are interested in a less restrictive form of structure...
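The restrictive baseline the abstract starts from is group-sparse logistic regression: penalize the ℓ2 norm of each coefficient block so whole feature groups are selected or zeroed out together. A minimal cvxpy sketch with illustrative groups, data, and penalty (the paper's contribution is a less restrictive structure than this):

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(7)
n, p = 150, 9
groups = [slice(0, 3), slice(3, 6), slice(6, 9)]       # non-overlapping feature groups

X = rng.standard_normal((n, p))
beta_true = np.array([1.5, -1.0, 0.8, 0, 0, 0, 0, 0, 0])   # only the first group is active
y = np.where(X @ beta_true + 0.5 * rng.standard_normal(n) > 0, 1.0, -1.0)  # labels in {-1, +1}

lam = 2.0
beta = cp.Variable(p)
loss = cp.sum(cp.logistic(-cp.multiply(y, X @ beta)))       # logistic loss for y in {-1, +1}
penalty = cp.sum(cp.hstack([cp.norm(beta[g], 2) for g in groups]))
cp.Problem(cp.Minimize(loss + lam * penalty)).solve()
print(np.round(beta.value, 2))    # inactive groups shrink toward zero together
```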
