Search results for: lasso shrinkage method

Number of results: 374,444

Journal: CoRR 2016
Eugène Ndiaye, Olivier Fercoq, Alexandre Gramfort, Vincent Leclère, Joseph Salmon

In high-dimensional settings, sparse structures are crucial for efficiency, in terms of memory, computation, and performance. It is customary to use an ℓ1 penalty to enforce sparsity in such scenarios. Sparsity-enforcing methods, the Lasso being a canonical example, are popular candidates for addressing high dimension. For efficiency, they rely on tuning a parameter trading data fitting versus...
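
For orientation, a minimal sketch (not taken from the paper above) of how the ℓ1 penalty and its tuning parameter behave: the lasso minimizes (1/2n)*||y - Xw||^2 + alpha*||w||_1, and larger alpha zeroes out more coefficients. Data, dimensions, and alpha values below are illustrative.

```python
# Minimal, illustrative sketch of the lasso tuning parameter trading
# data fit against sparsity (data and alpha values are made up).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 50, 200                       # high-dimensional: p > n
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[:5] = 3.0                     # only 5 truly active features
y = X @ w_true + 0.1 * rng.standard_normal(n)

for alpha in (0.01, 0.1, 1.0):
    model = Lasso(alpha=alpha, max_iter=10_000).fit(X, y)
    print(alpha, np.count_nonzero(model.coef_))   # sparsity grows with alpha
```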

2009
Junzhou Huang, Tong Zhang

This paper develops a theory for group Lasso using a concept called strong group sparsity. Our result shows that group Lasso is superior to standard Lasso for strongly group-sparse signals. This provides a convincing theoretical justification for using group sparse regularization when the underlying group structure is consistent with the data. Moreover, the theory predicts some limitations of t...
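
As a rough illustration of group sparse regularization (not the paper's analysis), the group-Lasso penalty places one ℓ2 norm on each predefined group of coefficients, and its proximal operator is block soft-thresholding, which zeroes out entire weak groups at once. Groups and values below are made up.

```python
# Hypothetical sketch of the group-Lasso penalty and its proximal step
# (block soft-thresholding); groups and values here are made up.
import numpy as np

def group_lasso_penalty(w, groups, lam):
    # sum over groups g of lam * sqrt(|g|) * ||w_g||_2
    return sum(lam * np.sqrt(len(g)) * np.linalg.norm(w[g]) for g in groups)

def prox_group_lasso(w, groups, lam):
    # Shrink each group's ell_2 norm by lam * sqrt(|g|); a whole group is
    # zeroed when its norm falls below that threshold.
    out = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        scale = max(0.0, 1.0 - lam * np.sqrt(len(g)) / norm) if norm > 0 else 0.0
        out[g] = scale * w[g]
    return out

w = np.array([0.2, -0.1, 3.0, 2.5, 0.05])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4])]
print(group_lasso_penalty(w, groups, lam=0.5))
print(prox_group_lasso(w, groups, lam=0.5))    # weak groups vanish entirely
```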

Journal: Feyz 2017
Masoumeh Dehghan Sheibani, Behrang Alani, Mahdi Noureddini

Background and aim: The mango plant belongs to the Anacardiaceae (pistachio) family, and effects of extracts obtained from the stem, leaf, fruit, and seed of mango on the contractile function of smooth muscle have been reported. In this study, the effect of blocking muscarinic cholinergic receptors on the action of aqueous mango (Mangifera indica) seed extract on the basal activity of uterine smooth muscle of virgin rats was investigated. Materials and methods: In this experimental study, 24 mid-uterine segments from healthy virgin rats were placed in a tissue bath...

Journal: Journal of Machine Learning Research 2016
Stéphane Ivanoff, Franck Picard, Vincent Rivoirard

High-dimensional Poisson regression has become a standard framework for the analysis of massive count datasets. In this work we estimate the intensity function of the Poisson regression model by using a dictionary approach, which generalizes the classical basis approach, combined with a Lasso or a group-Lasso procedure. Selection depends on penalty weights that need to be calibrated. Standard ...
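
As a hedged sketch of the kind of estimator involved, leaving out the paper's dictionary construction and weight calibration: ℓ1-penalized Poisson regression with a log link, fitted by plain proximal gradient (ISTA). Step size, penalty level, and data below are illustrative.

```python
# Illustrative sketch: ell_1-penalized Poisson regression fitted with ISTA.
# The dictionary / weight-calibration machinery of the paper is not included.
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def poisson_lasso(X, y, lam, step=0.1, n_iter=2000):
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        mu = np.exp(X @ w)                     # Poisson intensities (log link)
        grad = X.T @ (mu - y) / n              # gradient of the Poisson NLL
        w = soft_threshold(w - step * grad, step * lam)
    return w

rng = np.random.default_rng(1)
X = 0.3 * rng.standard_normal((200, 20))
w_true = np.zeros(20)
w_true[:3] = 1.0
y = rng.poisson(np.exp(X @ w_true))
print(np.round(poisson_lasso(X, y, lam=0.05), 2))
```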

2016
Brian R. Gaines, Hua Zhou

We compare alternative computing strategies for solving the constrained lasso problem. As its name suggests, the constrained lasso extends the widely-used lasso to handle linear constraints, which allow the user to incorporate prior information into the model. In addition to quadratic programming, we employ the alternating direction method of multipliers (ADMM) and also derive an efficient solu...
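
To make the formulation concrete, a small sketch (assuming the cvxpy modeling package; this is not the paper's ADMM or QP code): the constrained lasso is the usual lasso objective plus user-supplied linear constraints, here a placeholder simplex-type constraint.

```python
# Illustrative constrained-lasso formulation via cvxpy; the constraints below
# (coefficients sum to one and are non-negative) are placeholders.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(2)
n, p = 40, 10
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)
lam = 0.5

w = cp.Variable(p)
objective = cp.Minimize(0.5 * cp.sum_squares(X @ w - y) + lam * cp.norm1(w))
constraints = [cp.sum(w) == 1, w >= 0]        # prior information as linear constraints
problem = cp.Problem(objective, constraints)
problem.solve()
print(np.round(w.value, 3))
```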

2013
Woncheol Jang, Johan Lim, Ji Meng Loh, Nicole Lazar

Identifying homogeneous subgroups of variables can be challenging in high dimensional data analysis with highly correlated predictors. The generalized fused lasso has been proposed to simultaneously select correlated variables and identify them as predictive clusters. In this article, we study several properties of generalized fused lasso. First, we present a geometric interpretation of the gen...
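
For reference, a sketch of the generalized fused lasso penalty itself (the edge set below is hypothetical): an ℓ1 term for sparsity plus absolute differences across edges, which pushes predictors linked by an edge toward a common, fused value.

```python
# Sketch of the generalized fused lasso penalty over a hypothetical edge set:
#   lam1 * ||w||_1  +  lam2 * sum over edges (j, k) of |w_j - w_k|
import numpy as np

def generalized_fused_lasso_penalty(w, edges, lam1, lam2):
    sparsity = lam1 * np.sum(np.abs(w))
    fusion = lam2 * sum(abs(w[j] - w[k]) for j, k in edges)
    return sparsity + fusion

w = np.array([1.0, 1.0, 1.02, 0.0, -2.0])      # first three coefficients nearly fused
edges = [(0, 1), (1, 2), (3, 4)]               # e.g. edges between correlated predictors
print(generalized_fused_lasso_penalty(w, edges, lam1=0.1, lam2=0.5))
```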

2010
Jian Kang, Jian Guo

In this paper, we propose a self-adaptive lasso method for variable selection in regression problems. Unlike the popular lasso method, the proposed method introduces a specific tuning parameter for each regression coefficient. We model the self-adaptive lasso in a Bayesian framework and develop an efficient Gibbs sampling algorithm to automatically select these tuning parameters and estimate t...
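
As background, a sketch of a standard Bayesian-lasso Gibbs sampler in the style of Park and Casella (2008); the self-adaptive variant described above would additionally place and update a separate tuning parameter for each coefficient, which this sketch does not do.

```python
# Hedged sketch: standard Bayesian-lasso Gibbs sampler (single global tuning
# parameter lam), not the paper's self-adaptive, per-coefficient version.
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta, sigma2, tau2 = np.zeros(p), 1.0, np.ones(p)
    XtX, Xty = X.T @ X, X.T @ y
    draws = []
    for _ in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),  A = X'X + diag(1/tau2)
        A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # 1/tau2_j | rest ~ InverseGaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        tau2 = 1.0 / rng.wald(mu, lam**2)
        # sigma2 | rest ~ InverseGamma((n - 1 + p)/2, rate)
        resid = y - X @ beta
        rate = 0.5 * (resid @ resid + np.sum(beta**2 / tau2))
        sigma2 = 1.0 / rng.gamma((n - 1 + p) / 2.0, 1.0 / rate)
        draws.append(beta.copy())
    return np.array(draws)

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 5))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
y = X @ beta_true + rng.standard_normal(100)
print(bayesian_lasso_gibbs(X, y).mean(axis=0).round(2))   # crude posterior means
```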

2011
Sara van de Geer, Peter Bühlmann, Shuheng Zhou

We revisit the adaptive Lasso as well as the thresholded Lasso with refitting, in a high-dimensional linear model, and study prediction error, ℓq-error (q ∈ {1, 2}), and the number of false positive selections. Our theoretical results for the two methods are, at a rather fine scale, comparable. The differences only show up in terms of the (minimal) restricted and sparse eigenvalues, favor...
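
A minimal sketch of a two-stage adaptive lasso (an initial lasso fit supplies the weights, and the reweighted ℓ1 penalty is implemented by column rescaling); the initial estimator, gamma, and penalty levels below are illustrative rather than the configuration analyzed in the paper, and the thresholded Lasso with refitting is not shown.

```python
# Illustrative two-stage adaptive lasso; all tuning choices here are made up.
import numpy as np
from sklearn.linear_model import Lasso

def adaptive_lasso(X, y, alpha_init=0.1, alpha=0.1, gamma=1.0, eps=1e-6):
    init = Lasso(alpha=alpha_init, max_iter=10_000).fit(X, y).coef_
    weights = 1.0 / (np.abs(init) + eps) ** gamma   # heavier penalty where init ~ 0
    X_scaled = X / weights                          # weighted ell_1 via rescaling
    coef_scaled = Lasso(alpha=alpha, max_iter=10_000).fit(X_scaled, y).coef_
    return coef_scaled / weights

rng = np.random.default_rng(4)
X = rng.standard_normal((80, 30))
beta = np.zeros(30)
beta[:4] = 2.0
y = X @ beta + 0.5 * rng.standard_normal(80)
print(np.count_nonzero(adaptive_lasso(X, y)))       # number of selected variables
```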

2018
Alexander Jung, Nguyen Tran, Alexandru Mara

The “least absolute shrinkage and selection operator” (Lasso) method has recently been adapted for network-structured datasets. In particular, this network Lasso method makes it possible to learn graph signals from a small number of noisy signal samples by using the total variation of a graph signal for regularization. While efficient and scalable implementations of the network Lasso are available, only l...
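
For intuition, a sketch of the graph total variation penalty that the network Lasso uses for regularization (the edge list and weights below are hypothetical): it is small for signals that are nearly constant across connected nodes.

```python
# Sketch of graph total variation over a hypothetical weighted edge list.
import numpy as np

def graph_total_variation(x, edges, weights):
    # TV(x) = sum over edges (i, j) of w_ij * |x_i - x_j|
    return sum(w * abs(x[i] - x[j]) for (i, j), w in zip(edges, weights))

x = np.array([1.0, 1.1, 0.9, 5.0, 5.2])          # piecewise-constant graph signal
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
weights = [1.0, 1.0, 1.0, 1.0]
print(graph_total_variation(x, edges, weights))  # dominated by the single jump
```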

2017
Daniel F. Schmidt, Enes Makalic

The lasso, introduced by Robert Tibshirani in 1996, has become one of the most popular techniques for estimating Gaussian linear regression models. An important reason for this popularity is that the lasso can simultaneously estimate all regression parameters as well as select important variables, yielding accurate regression models that are highly interpretable. This paper derives an efficient...
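
As a hedged illustration of why the lasso can be fitted efficiently, a sketch of the standard cyclic coordinate descent with soft-thresholding; this is not the specific algorithm derived in the paper.

```python
# Standard cyclic coordinate descent for the lasso (illustrative data):
#   minimize (1/2n) * ||y - Xw||^2 + lam * ||w||_1
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    resid = y.copy()                               # current full residual y - Xw
    for _ in range(n_iter):
        for j in range(p):
            rho = X[:, j] @ (resid + X[:, j] * w[j]) / n   # fit of coordinate j alone
            w_new = soft_threshold(rho, lam) / col_sq[j]
            resid += X[:, j] * (w[j] - w_new)
            w[j] = w_new
    return w

rng = np.random.default_rng(5)
X = rng.standard_normal((100, 20))
beta = np.zeros(20)
beta[:3] = 1.5
y = X @ beta + 0.1 * rng.standard_normal(100)
print(np.round(lasso_cd(X, y, lam=0.1), 2))
```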

[Chart: number of search results per year]
