Search results for: lasso

Number of results: 4548

2013
Fabian L. Wauthier Nebojsa Jojic Michael I. Jordan

The Lasso is a cornerstone of modern multivariate data analysis, yet its performance suffers in the common situation in which covariates are correlated. This limitation has led to a growing number of Preconditioned Lasso algorithms that pre-multiply X and y by matrices P_X and P_y prior to running the standard Lasso. A direct comparison of these and similar Lasso-style algorithms to the original La...
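As an illustration of the "precondition, then run the standard Lasso" recipe described above, the sketch below pre-multiplies X and y by a simple SVD-based whitening matrix. This particular choice of P_X = P_y, the synthetic correlated design, and the penalty level are assumptions for illustration, not the specific algorithms compared in the paper.

```python
# Sketch: precondition X and y, then run the ordinary Lasso on the transformed data.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 20
Sigma = 0.7 * np.ones((p, p)) + 0.3 * np.eye(p)          # equicorrelated covariates
X = rng.normal(size=(n, p)) @ np.linalg.cholesky(Sigma).T
beta_true = np.zeros(p); beta_true[:3] = 2.0
y = X @ beta_true + rng.normal(size=n)

# Whitening-style preconditioner P = U diag(1/d) U^T from the thin SVD of X,
# so that P @ X = U V^T has orthonormal columns (one possible choice of P_X = P_y).
U, d, Vt = np.linalg.svd(X, full_matrices=False)
P = U @ np.diag(1.0 / d) @ U.T
X_tilde, y_tilde = P @ X, P @ y

fit = Lasso(alpha=0.1).fit(X_tilde, y_tilde)
print(np.flatnonzero(fit.coef_))                         # covariates selected after preconditioning
```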

2011
Kotaro Kitagawa Kumiko Tanaka-Ishii

Relational lasso is a method that incorporates feature relations within machine learning. By using automatically obtained noisy relations among features, relational lasso learns an additional penalty parameter per feature, which is then incorporated as a regularizer into the target optimization function. Relational lasso has been tested on three different tasks: text categorization, ...
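A rough sketch of the shape of such an objective: a standard loss plus an ℓ1-style term with a learned per-feature penalty λ_j. How λ_j is derived from the noisy feature relations is the paper's contribution and is not reproduced here.

```latex
% Per-feature-penalty regularizer (sketch only; the rule mapping feature relations
% to \lambda_j is not reproduced here).
\[
  \min_{w}\; L(w;\,\text{data}) \;+\; \sum_{j=1}^{p} \lambda_j \,\lvert w_j \rvert ,
  \qquad \lambda_j \ \text{learned from the (noisy) relations of feature } j .
\]
```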

2007
Peng Zhao Bin Yu Saharon Rosset

Many statistical machine learning algorithms (in regression or classification) minimize either an empirical loss function as in AdaBoost, or a penalized empirical loss as in SVM. A single regularization tuning parameter controls the trade-off between fidelity to the data and generalizability, or equivalently between bias and variance. When this tuning parameter changes, a regularization “path” of...
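A quick way to see such a regularization path in practice is sklearn's lasso_path, which traces the coefficient profile over a grid of penalty strengths; the synthetic data and grid size below are illustrative assumptions, not the paper's setup.

```python
# Tracing a lasso regularization path: one coefficient profile per feature,
# moving from heavy regularization (high bias) to light regularization (high variance).
import numpy as np
from sklearn.linear_model import lasso_path

rng = np.random.default_rng(0)
n, p = 100, 8
X = rng.normal(size=(n, p))
y = X @ np.array([3.0, -2.0, 0, 0, 1.0, 0, 0, 0]) + rng.normal(size=n)

alphas, coefs, _ = lasso_path(X, y, n_alphas=50)   # coefs has shape (p, n_alphas)
for j in range(p):
    print(j, np.round(coefs[j, ::10], 2))          # a few points along each profile
```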

2017
Nils Ternès Federico Rotolo Georg Heinze Stefan Michiels

Stratified medicine seeks to identify biomarkers or parsimonious gene signatures distinguishing patients who will benefit most from a targeted treatment. We evaluated 12 approaches in high-dimensional Cox models in randomized clinical trials: penalization of the biomarker main effects and biomarker-by-treatment interactions (full-lasso, three kinds of adaptive lasso, ridge+lasso and group-lass...
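A hedged sketch of the kind of model being compared: biomarker main effects plus biomarker-by-treatment interaction columns entering a lasso-penalized Cox regression. The column names, the simulated data, and the use of lifelines' CoxPHFitter (with penalizer and l1_ratio) are assumptions for illustration, not the paper's implementation, which also covers adaptive, ridge+lasso, and group-lasso variants.

```python
# Sketch of a "full lasso" for treatment-effect modification: one L1 penalty on
# biomarker main effects and biomarker-by-treatment interactions alike.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n, p = 200, 10
df = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"bm_{j}" for j in range(p)])
df["treat"] = rng.integers(0, 2, size=n)
for j in range(p):                                   # biomarker-by-treatment interactions
    df[f"bm_{j}_x_treat"] = df[f"bm_{j}"] * df["treat"]
df["time"] = rng.exponential(scale=np.exp(-0.5 * df["bm_0"] * df["treat"]))
df["event"] = 1                                      # all events observed, for simplicity

cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)       # lasso-type penalty on all effects
cph.fit(df, duration_col="time", event_col="event")
print(cph.params_.abs().sort_values(ascending=False).head(5))  # least-shrunk effects
```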

2006
Hui Zou

The lasso is a popular technique for simultaneous estimation and variable selection. Lasso variable selection has been shown to be consistent under certain conditions. In this work we derive a necessary condition for the lasso variable selection to be consistent. Consequently, there exist certain scenarios where the lasso is inconsistent for variable selection. We then propose a new version of ...
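The proposed remedy (the adaptive lasso) reweights the ℓ1 penalty with data-dependent weights w_j = 1/|β̃_j|^γ from a pilot estimate. Below is a minimal sketch via the usual column-rescaling trick; the ridge pilot, γ = 1, and the fixed penalty level are assumptions for illustration.

```python
# Adaptive lasso sketch: rescale columns by |pilot coefficient|^gamma, run an
# ordinary lasso, then undo the rescaling on the fitted coefficients.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = np.array([3.0, 1.5, 0, 0, 2.0, 0, 0, 0, 0, 0])
y = X @ beta + rng.normal(size=n)

gamma = 1.0
beta_init = Ridge(alpha=1.0).fit(X, y).coef_          # pilot estimate
scale = np.abs(beta_init) ** gamma                    # equals 1 / w_j
lasso = Lasso(alpha=0.1).fit(X * scale, y)            # lasso on rescaled columns
beta_adaptive = lasso.coef_ * scale                   # map back to the original scale
print(np.flatnonzero(beta_adaptive))                  # selected variables
```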

2012
Ryan J. Tibshirani Jonathan Taylor

We derive the degrees of freedom of the lasso fit, placing no assumptions on the predictor matrix X. Like the well-known result of Zou et al. (2007), which gives the degrees of freedom of the lasso fit when X has full column rank, we express our result in terms of the active set of a lasso solution. We extend this result to cover the degrees of freedom of the generalized lasso fit for an arbitr...
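For reference, a compact rendering of the result the abstract describes, in assumed notation (A is the active set of a lasso solution at penalty level λ):

```latex
% Degrees of freedom of the lasso fit \hat{\mu}_\lambda = X\hat{\beta}_\lambda, with the
% full-column-rank special case of Zou et al. (2007).
\[
  \operatorname{df}(\hat{\mu}_\lambda) \;=\; \mathbb{E}\!\left[\operatorname{rank}\!\big(X_{\mathcal{A}}\big)\right],
  \qquad \text{which reduces to } \mathbb{E}\,\lvert\mathcal{A}\rvert \text{ when } X \text{ has full column rank.}
\]
```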

2012
Marius Kwemou

We consider the problem of estimating a function f0 in the logistic regression model. We propose to estimate this function f0 by a sparse approximation built as a linear combination of elements of a given dictionary of p functions. This sparse approximation is selected by the Lasso or Group Lasso procedure. In this context, we state non-asymptotic oracle inequalities for the Lasso and Group Lasso under...
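A minimal sketch of the Lasso form of the estimator studied here, using sklearn's L1-penalized LogisticRegression with synthetic columns standing in for the dictionary; the penalty level C and the data are illustrative assumptions, and the Group Lasso variant needs a dedicated solver and is omitted.

```python
# Sparse (L1-penalized) logistic regression over a dictionary of p features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 300, 50
X = rng.normal(size=(n, p))
logits = X[:, :5] @ np.array([2.0, -1.5, 1.0, 1.0, -2.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

# C is the inverse penalty strength: smaller C gives a sparser linear combination.
clf = LogisticRegression(penalty="l1", solver="saga", C=0.2, max_iter=5000).fit(X, y)
print(np.flatnonzero(clf.coef_))                      # dictionary elements kept
```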

2016
Nickolai V. Vysokov John-Paul Silva Vera G. Lelianova Claudia Ho Mustafa B. Djamgoz Alexander G. Tonevitsky Yuri A. Ushkaryov

Teneurins are large cell-surface receptors involved in axon guidance. Teneurin-2 (also known as latrophilin-1-associated synaptic surface organizer (Lasso)) interacts across the synaptic cleft with presynaptic latrophilin-1, an adhesion G-protein-coupled receptor that participates in regulating neurotransmitter release. Lasso-latrophilin-1 interaction mediates synapse formation and calcium sign...

Journal: CoRR, 2018
José Bento Surjyendu Ray

The solution path of the 1D fused lasso for an n-dimensional input is piecewise linear with O(n) segments [1], [2]. However, existing proofs of this bound do not hold for the weighted fused lasso. At the same time, results for the generalized lasso, of which the weighted fused lasso is a special case, allow Ω(3^n) segments [3]. In this paper, we prove that the number of segments in the solution pa...
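For concreteness, a small cvxpy sketch of the weighted 1D fused lasso objective, 0.5‖y − β‖² + λ Σ_i w_i |β_{i+1} − β_i|, solved at a few penalty levels; the weights, data, and λ grid are illustrative, and this solves individual problems rather than tracing the exact solution path.

```python
# Weighted 1D fused lasso at a few penalty levels, via a generic convex solver.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 50
y = np.concatenate([np.zeros(25), 2 * np.ones(25)]) + 0.3 * rng.normal(size=n)
w = rng.uniform(0.5, 1.5, size=n - 1)                 # per-edge weights

for lam in [0.1, 1.0, 10.0]:
    beta = cp.Variable(n)
    obj = 0.5 * cp.sum_squares(y - beta) + lam * cp.sum(cp.multiply(w, cp.abs(cp.diff(beta))))
    cp.Problem(cp.Minimize(obj)).solve()
    jumps = int(np.sum(np.abs(np.diff(beta.value)) > 1e-6))
    print(lam, jumps)                                 # remaining change points shrink as lam grows
```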

Journal: Bernoulli (Official Journal of the Bernoulli Society for Mathematical Statistics and Probability), 2010
Fengrong Wei Jian Huang

In regression problems where covariates can be naturally grouped, the group Lasso is an attractive method for variable selection since it respects the grouping structure in the data. We study the selection and estimation properties of the group Lasso in high-dimensional settings when the number of groups exceeds the sample size. We provide sufficient conditions under which the group Lasso selec...
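For reference, the group Lasso estimator the abstract refers to, written with groups G_1, …, G_J and a common (assumed) square-root-of-group-size scaling convention:

```latex
% Group Lasso estimator with groups G_1, ..., G_J of sizes p_j; the sqrt(p_j) scaling
% is one standard convention.
\[
  \hat{\beta} \;=\; \arg\min_{\beta}\;
  \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2
  \;+\; \lambda \sum_{j=1}^{J} \sqrt{p_j}\,\lVert \beta_{G_j} \rVert_2 .
\]
```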

[Chart: number of search results per year]