Search results for: lasso shrinkage method

Number of results: 374444

2015
Leena Pasanen Lasse Holmström Mikko J. Sillanpää

BACKGROUND: LASSO is a penalized regression method that facilitates model fitting in situations where there are as many, or even more, explanatory variables than observations, and only a few variables are relevant in explaining the data. We focus on the Bayesian version of LASSO and consider four problems that need special attention: (i) controlling false positives, (ii) multiple comparisons, (ii...
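
For orientation, the lasso estimate referred to throughout these results is the minimizer of a penalized least-squares criterion; the notation below (y, X, β, λ) is generic and not taken from this particular paper:

\[
\hat{\beta}_{\text{lasso}} \;=\; \arg\min_{\beta \in \mathbb{R}^{p}} \left\{ \tfrac{1}{2}\,\lVert y - X\beta \rVert_{2}^{2} \;+\; \lambda \lVert \beta \rVert_{1} \right\}, \qquad \lambda \ge 0,
\]

where the ℓ1 penalty shrinks the coefficients and sets many of them exactly to zero, which is what makes the method usable when the number of explanatory variables matches or exceeds the number of observations.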

Thesis: Tarbiat Moallem University of Tabriz - Faculty of Basic Sciences, 1391

In this thesis we study Suzuki's method in fixed point theory. This method is a recent generalization of the Banach contraction principle, introduced by Suzuki in 2008 and developed further by other mathematicians since then. In the first chapter we present the definitions, theorems, and lemmas that will be needed. The second chapter covers some of Suzuki's earlier work, such as a generalization of the Meir-Keeler theorem and some theorems on nonexpansive mappings. Suzuki's new method and...
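
For orientation, the Banach contraction principle states that on a complete metric space (X, d), any map T : X → X satisfying d(Tx, Ty) ≤ r d(x, y) for some fixed r ∈ [0, 1) has a unique fixed point. A frequently cited Suzuki-type weakening (condition (C) for generalized nonexpansive mappings, recalled here as an illustration rather than as the exact condition treated in the thesis) only requires the distance inequality when a point is not already close to its image:

\[
\tfrac{1}{2}\, d(x, Tx) \le d(x, y) \;\Longrightarrow\; d(Tx, Ty) \le d(x, y) \qquad \text{for all } x, y \in X .
\]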

Mahdieh Sadat Ahmadzadeh, Peivand Bastani, Farhad Lotfi, Marjan Moradi

Background and Objective: Evaluating hospital performance in order to improve the quality of the services provided is of great importance. This study aimed to evaluate the performance of teaching hospitals affiliated with Shiraz University of Medical Sciences (SUMS) using the Pabon Lasso graph before and after the implementation of the health system transformation plan. Materials and Metho...

Journal: :Molecular biology and evolution 2015
George Kettleborough Jo Dicks Ian N Roberts Katharina T Huber

The wealth of phylogenetic information accumulated over many decades of biological research, coupled with recent technological advances in molecular sequence generation, presents significant opportunities for researchers to investigate relationships across and within the kingdoms of life. However, to make the best use of this wealth of data, several problems must first be overcome. One key problem is ...

2017
Karim Lounici Alexandre B. Tsybakov Massimiliano Pontil Sara van de Geer

We consider the problem of estimating a sparse linear regression vector β∗ under a Gaussian noise model, for the purpose of both prediction and model selection. We assume that prior knowledge is available on the sparsity pattern, namely the set of variables is partitioned into prescribed groups, only a few of which are relevant in the estimation process. This group sparsity assumption suggests us...
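
The group sparsity assumption described here is usually encoded through a group lasso penalty; in a generic form (with groups G_1, ..., G_M partitioning the variables, not the paper's own notation):

\[
\hat{\beta} \;=\; \arg\min_{\beta} \left\{ \tfrac{1}{2}\,\lVert y - X\beta \rVert_{2}^{2} \;+\; \lambda \sum_{m=1}^{M} \sqrt{|G_m|}\; \lVert \beta_{G_m} \rVert_{2} \right\},
\]

so that whole groups of coefficients are zeroed out together rather than individual entries.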

Journal: :Annals of statistics 2014
Richard Lockhart Jonathan Taylor Ryan J Tibshirani Robert Tibshirani

In the sparse linear regression setting, we consider testing the significance of the predictor variable that enters the current lasso model, in the sequence of models visited along the lasso solution path. We propose a simple test statistic based on lasso fitted values, called the covariance test statistic, and show that when the true model is linear, this statistic has an Exp(1) asymptotic dis...
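
As context, the covariance test statistic compares the lasso fit at the next knot λ_{k+1} of the solution path with and without the variable that has just entered; in one commonly quoted form (reproduced from memory, so the normalization should be checked against the paper):

\[
T_k \;=\; \frac{\langle y,\, X\hat{\beta}(\lambda_{k+1})\rangle \;-\; \langle y,\, X_{A}\,\tilde{\beta}_{A}(\lambda_{k+1})\rangle}{\sigma^{2}} \;\xrightarrow{\;d\;}\; \operatorname{Exp}(1),
\]

where A is the active set just before the new variable enters and \tilde{\beta}_{A} is the lasso solution fitted using only the variables in A.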

Journal: :CoRR 2015
Xin Jiang Patricia Reynaud-Bouret Vincent Rivoirard Laure Sansonnet Rebecca Willett

Sparse linear inverse problems appear in a variety of settings, but often the noise contaminating observations cannot accurately be described as bounded by or arising from a Gaussian distribution. Poisson observations in particular are a characteristic feature of several real-world applications. Previous work on sparse Poisson inverse problems encountered several limiting technical hurdles. Thi...
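
A generic formulation of the sparse Poisson inverse problem (a sketch under standard assumptions, not necessarily the exact estimator analyzed in this paper) replaces the Gaussian least-squares loss with the Poisson negative log-likelihood plus an ℓ1 penalty:

\[
\hat{\beta} \;=\; \arg\min_{\beta \ge 0} \left\{ \sum_{i=1}^{n} \bigl( (X\beta)_i - y_i \log (X\beta)_i \bigr) \;+\; \lambda \lVert \beta \rVert_{1} \right\}, \qquad y_i \sim \operatorname{Poisson}\bigl((X\beta^{*})_i\bigr),
\]

and the signal-dependent variance of the Poisson noise is what breaks the standard Gaussian analysis.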

2007
Jerome Friedman Trevor Hastie Holger Höfling Robert Tibshirani

We consider “one-at-a-time” coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the L1-penalized regression (lasso) in the literature, but it seems to have been largely ignored. Indeed, it seems that coordinate-wise algorithms are not often used in convex optimization. We show that this algorithm is very competitive with...
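
A minimal sketch of the coordinate-wise descent idea for the lasso (a toy implementation with illustrative names, not the authors' code): each coefficient is updated in turn by soft-thresholding its univariate least-squares solution while all other coefficients are held fixed.

import numpy as np

def soft_threshold(z, gamma):
    # One-dimensional lasso solution: shrink z toward zero by gamma.
    return np.sign(z) * max(abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=100):
    # Cyclic coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1.
    n, p = X.shape
    beta = np.zeros(p)
    residual = y - X @ beta              # kept in sync with beta
    col_sq = (X ** 2).sum(axis=0) / n    # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # Univariate least-squares coefficient, with feature j's current
            # contribution added back into the residual.
            rho = X[:, j] @ residual / n + col_sq[j] * beta[j]
            new_bj = soft_threshold(rho, lam) / col_sq[j]
            residual += X[:, j] * (beta[j] - new_bj)
            beta[j] = new_bj
    return beta

With standardized columns (col_sq[j] = 1) this reduces to the familiar update beta[j] <- S(X_j' r_{-j} / n, lam), where S is the soft-thresholding operator.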

Journal: :CoRR 2014
Seunghak Lee Eric P. Xing

Recently, to solve large-scale lasso and group lasso problems, screening rules have been developed; their goal is to reduce the problem size by efficiently discarding zero coefficients using simple rules, each applied independently of the others. However, screening for the overlapping group lasso remains an open challenge because the overlaps between groups make it infeasible to test each group independen...
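
As an illustration of what a screening rule looks like in the non-overlapping case (this is the sequential strong rule of Tibshirani et al., recalled as an example rather than the rule developed in this paper): when moving from a solved penalty value λ_prev down to λ, group g may be tentatively discarded if

\[
\lVert X_{g}^{\top} r(\lambda_{\text{prev}}) \rVert_{2} \;<\; 2\lambda - \lambda_{\text{prev}},
\]

where r(λ_prev) is the residual at the previous solution. Overlapping groups break the per-group independence that such checks rely on, which is the difficulty the paper addresses.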

2005
Trevor Park George Casella

The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the priors on the regression parameters are independent double-exponential (Laplace) distributions. This posterior can also be accessed through a Gibbs sampler using conjugate normal priors for the regression parameters, with independent exponential hyperpriors on their variances. T...
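
In the standard formulation of this hierarchy (a sketch of the usual construction rather than a transcription of the paper), the double-exponential prior arises as a scale mixture of normals with exponential mixing on the variances:

\[
y \mid \beta, \sigma^{2} \sim N(X\beta, \sigma^{2} I), \qquad
\beta_{j} \mid \tau_{j}^{2} \sim N(0, \tau_{j}^{2}), \qquad
\tau_{j}^{2} \sim \operatorname{Exp}(\lambda^{2}/2) \ \text{independently},
\]

and integrating out the τ_j² recovers independent Laplace priors on the β_j, so the posterior mode coincides with the lasso estimate; the Park-Casella version additionally conditions the prior variances on σ² to keep the posterior unimodal.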

Chart of the number of search results per year