Search results for: lasso method
Number of results: 374083
The Lasso is a cornerstone of modern multivariate data analysis, yet its performance suffers in the common situation in which covariates are correlated. This limitation has led to a growing number of Preconditioned Lasso algorithms that pre-multiply X and y by matrices P_X, P_y prior to running the standard Lasso. A direct comparison of these and similar Lasso-style algorithms to the original La...
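The generic pattern the abstract describes can be sketched in a few lines: transform the design and response with preconditioning matrices, then hand the transformed problem to an ordinary lasso solver. The coordinate-descent solver and the identity preconditioners below are placeholders for illustration, not the specific algorithms the abstract compares; published methods choose P_X, P_y to decorrelate the design.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate-descent solver for 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]      # residual excluding feature j
            rho = X[:, j] @ r
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

def preconditioned_lasso(X, y, P_X, P_y, lam):
    """Generic preconditioned-lasso pattern: transform, then run the plain lasso."""
    return lasso_cd(P_X @ X, P_y @ y, lam)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
b_true = np.array([1.0, 0.0, 2.0, 0.0, 0.0])
y = X @ b_true
# Identity preconditioners recover the plain lasso; real algorithms
# substitute matrices designed to reduce correlation among columns of X.
b_hat = preconditioned_lasso(X, y, np.eye(50), np.eye(50), lam=0.1)
```

With identity preconditioners and a small penalty, the solver recovers the sparse coefficients on this well-conditioned toy design.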
Relational lasso is a method that incorporates feature relations within machine learning. By using automatically obtained noisy relations among features, relational lasso learns an additional penalty parameter per feature, which is then incorporated as a regularizer in the target optimization function. Relational lasso has been tested on three different tasks: text categorization, ...
Many statistical machine learning algorithms (in regression or classification) minimize either an empirical loss function as in AdaBoost, or a penalized empirical loss as in SVM. A single regularization tuning parameter controls the trade-off between fidelity to the data and generalizability, or equivalently between bias and variance. When this tuning parameter changes, a regularization “path” of...
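A common way to trace such a regularization path numerically is to solve the penalized problem over a decreasing grid of penalty values, warm-starting each solve from the previous solution. A minimal sketch for the lasso, using a plain coordinate-descent solver (an illustrative stand-in, not the path algorithms the abstract discusses):

```python
import numpy as np

def lasso_cd(X, y, lam, b0=None, n_iter=300):
    """Coordinate descent for 0.5*||y - X b||^2 + lam*||b||_1, with warm start b0."""
    n, p = X.shape
    b = np.zeros(p) if b0 is None else b0.copy()
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 6))
y = X @ np.array([3.0, 0.0, -2.0, 0.0, 0.0, 1.0]) + 0.1 * rng.standard_normal(60)

lam_max = np.max(np.abs(X.T @ y))       # at or above this, the solution is exactly 0
lams = lam_max * np.logspace(0, -3, 20)
b = None
active_sizes = []
for lam in lams:
    b = lasso_cd(X, y, lam, b0=b)       # warm start from the previous solution
    active_sizes.append(int(np.sum(np.abs(b) > 1e-8)))
```

As the penalty shrinks, variables enter the active set one by one, which is exactly the bias–variance dial the paragraph describes: heavy penalty, all-zero fit; light penalty, near-least-squares fit.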
Stratified medicine seeks to identify biomarkers or parsimonious gene signatures distinguishing patients that will benefit most from a targeted treatment. We evaluated 12 approaches in high-dimensional Cox models in randomized clinical trials: penalization of the biomarker main effects and biomarker-by-treatment interactions (full-lasso, three kinds of adaptive lasso, ridge+lasso and group-lass...
The lasso is a popular technique for simultaneous estimation and variable selection. Lasso variable selection has been shown to be consistent under certain conditions. In this work we derive a necessary condition for the lasso variable selection to be consistent. Consequently, there exist certain scenarios where the lasso is inconsistent for variable selection. We then propose a new version of ...
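One well-known remedy in this literature is to reweight the penalty using preliminary coefficient estimates, as in the adaptive lasso; whether that is the specific proposal of the truncated abstract above is not stated, so the sketch below is a generic NumPy illustration of the reweighting idea, solved by column rescaling around a plain lasso solver:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate descent for 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

def adaptive_lasso(X, y, lam, gamma=1.0):
    """Weighted lasso: penalty lam * sum_j w_j |b_j| with w_j = 1/|b_ols_j|^gamma.
    Scaling column j by 1/w_j turns it into a plain lasso in c = w * b."""
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    w = 1.0 / (np.abs(b_ols) ** gamma + 1e-12)
    c = lasso_cd(X / w, y, lam)    # columns rescaled, then standard lasso
    return c / w                   # map back to the original parameterization

rng = np.random.default_rng(2)
X = rng.standard_normal((80, 4))
y = X @ np.array([2.0, 0.0, -1.5, 0.0]) + 0.1 * rng.standard_normal(80)
b_hat = adaptive_lasso(X, y, lam=1.0)
```

Because small preliminary estimates get large weights, noise variables are penalized much harder than true signals, which is the mechanism behind the improved selection consistency.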
We derive the degrees of freedom of the lasso fit, placing no assumptions on the predictor matrix X. Like the well-known result of Zou et al. (2007), which gives the degrees of freedom of the lasso fit when X has full column rank, we express our result in terms of the active set of a lasso solution. We extend this result to cover the degrees of freedom of the generalized lasso fit for an arbitr...
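The Zou et al. (2007) result the abstract builds on can be stated in one line: for the lasso fit with full-column-rank X, the degrees of freedom equal the expected size of the active set,

$$ \mathrm{df}(\hat{y}_\lambda) = \mathbb{E}\,\lvert A_\lambda \rvert, $$

where $A_\lambda$ denotes the set of nonzero coefficients at penalty level $\lambda$; the abstract's contribution is establishing a result of this form without the rank assumption on X.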
We consider the problem of estimating a function f0 in a logistic regression model. We propose to estimate this function f0 by a sparse approximation built as a linear combination of elements of a given dictionary of p functions. This sparse approximation is selected by the Lasso or Group Lasso procedure. In this context, we state non-asymptotic oracle inequalities for Lasso and Group Lasso under...
Teneurins are large cell-surface receptors involved in axon guidance. Teneurin-2 (also known as latrophilin-1-associated synaptic surface organizer (Lasso)) interacts across the synaptic cleft with presynaptic latrophilin-1, an adhesion G-protein-coupled receptor that participates in regulating neurotransmitter release. Lasso-latrophilin-1 interaction mediates synapse formation and calcium sign...
The solution path of the 1D fused lasso for an n-dimensional input is piecewise linear with O(n) segments [1], [2]. However, existing proofs of this bound do not hold for the weighted fused lasso. At the same time, results for the generalized lasso, of which the weighted fused lasso is a special case, allow Ω(3^n) segments [3]. In this paper, we prove that the number of segments in the solution pa...
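The piecewise-linear path can be seen exactly in the smallest nontrivial case. For n = 2 the 1D fused lasso has a closed form (a short derivation from the optimality conditions, not taken from the cited papers): below the fusion threshold the two values are pulled together linearly in the penalty, and past it they merge into their mean, giving a path with exactly two linear segments and one kink.

```python
import numpy as np

def fused_lasso_n2(y1, y2, lam):
    """Closed-form 1D fused lasso for n = 2:
    minimize 0.5*((b1-y1)^2 + (b2-y2)^2) + lam*|b2 - b1|."""
    if lam < abs(y2 - y1) / 2:
        s = np.sign(y2 - y1)
        return (y1 + lam * s, y2 - lam * s)   # values move linearly in lam
    m = (y1 + y2) / 2
    return (m, m)                             # fully fused past lam = |y2-y1|/2

# Two linear segments: kink at lam = |4 - 0| / 2 = 2, constant afterwards.
path = [fused_lasso_n2(0.0, 4.0, lam) for lam in (0.0, 1.0, 2.0, 3.0)]
```

Counting such segments for general n, weights, and penalty graphs is exactly the question the abstract addresses.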