Search results for: lasso method

Number of results: 374,083

2006
Wenbin Lu, Hao Helen Zhang

We study the problem of variable selection for linear transformation models, a class of general semiparametric models for censored survival data. Penalized marginal likelihood methods with shrinkage-type penalties are proposed to automate variable selection in linear transformation models; we consider the LASSO penalty and propose a new penalty called the adaptive-LASSO (ALASSO). Unlike the...
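
For reference, the adaptive-LASSO penalty in its commonly used form (an assumption here; the paper's ALASSO may differ in its details) down-weights the shrinkage of coefficients whose initial estimates are large:

```latex
% Illustrative forms only: lambda > 0 is a tuning parameter, gamma > 0 an
% exponent, and \tilde{\beta} an initial consistent estimate of \beta.
p_{\mathrm{LASSO}}(\beta)  = \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert ,
\qquad
p_{\mathrm{ALASSO}}(\beta) = \lambda \sum_{j=1}^{p}
  \frac{\lvert \beta_j \rvert}{\lvert \tilde{\beta}_j \rvert^{\gamma}} .
```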

Journal: CoRR, 2013
Nadine Hussami, Robert Tibshirani

We propose a new sparse regression method called the component lasso, based on a simple idea. The method uses the connected-components structure of the sample covariance matrix to split the problem into smaller ones. It then applies the lasso to each subproblem separately, obtaining a coefficient vector for each one. Finally, it uses non-negative least squares to recombine the different vectors...
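
A minimal sketch of that pipeline, assuming a standard linear-regression setup and off-the-shelf scikit-learn/SciPy routines; the threshold `tol`, the per-block `alpha`, and the exact recombination step are illustrative guesses, not the authors' choices:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from scipy.optimize import nnls
from sklearn.linear_model import Lasso

def component_lasso(X, y, alpha=0.1, tol=1e-3):
    """Rough sketch: split predictors by the connected components of the
    sample covariance, fit a lasso per block, then recombine the block
    fits with non-negative least squares."""
    n, p = X.shape
    S = np.cov(X, rowvar=False)               # sample covariance of predictors
    adj = (np.abs(S) > tol).astype(int)       # edge if covariance entry is non-negligible
    np.fill_diagonal(adj, 0)
    n_comp, labels = connected_components(csr_matrix(adj), directed=False)

    beta = np.zeros(p)
    fits = np.zeros((n, n_comp))              # per-component fitted values
    blocks = []
    for k in range(n_comp):
        idx = np.where(labels == k)[0]
        blocks.append(idx)
        lasso = Lasso(alpha=alpha, fit_intercept=False).fit(X[:, idx], y)
        beta[idx] = lasso.coef_
        fits[:, k] = X[:, idx] @ lasso.coef_

    w, _ = nnls(fits, y)                      # non-negative weights to recombine the blocks
    for k, idx in enumerate(blocks):
        beta[idx] *= w[k]
    return beta
```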

2012
Enrique Pinzón

This paper proposes a new two-stage least squares (2SLS) estimator which is consistent and asymptotically normal in the presence of many weak and irrelevant instruments and heteroskedasticity. In the first stage, the estimator uses an adaptive least absolute shrinkage and selection operator (LASSO) that selects the relevant instruments with high probability. However, the adaptive LASSO estimates have ...
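
A rough sketch of that two-stage idea, assuming a single endogenous regressor `d` and instrument matrix `Z` (exogenous controls omitted); the ridge-based initial weights and the plain least-squares second stage are illustrative simplifications, not the paper's exact estimator:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

def adaptive_lasso_2sls(y, d, Z, alpha=0.1, gamma=1.0):
    """Stage 1: adaptive LASSO of the endogenous regressor d on the
    instruments Z (columns rescaled by initial ridge weights).
    Stage 2: 2SLS using only the selected instruments."""
    # Initial estimate for the adaptive weights (illustrative choice).
    init = Ridge(alpha=1.0, fit_intercept=False).fit(Z, d).coef_
    w = np.abs(init) ** gamma + 1e-8
    Zw = Z * w                      # column rescaling implements the adaptive penalty

    sel = Lasso(alpha=alpha, fit_intercept=False).fit(Zw, d).coef_ != 0
    Zs = Z[:, sel]                  # instruments kept by the adaptive LASSO

    # Standard 2SLS with the selected instruments.
    d_hat = Zs @ np.linalg.lstsq(Zs, d, rcond=None)[0]   # first-stage fitted values
    beta = np.linalg.lstsq(d_hat.reshape(-1, 1), y, rcond=None)[0]
    return beta, sel
```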

2017
Jonathan I. Tietz, Christopher J. Schwalen, Parth S. Patel, Tucker Maxson, Patricia M. Blair, Hua-Chia Tai, Uzma I. Zakai, Douglas A. Mitchell

Ribosomally synthesized and post-translationally modified peptide (RiPP) natural products are attractive for genome-driven discovery and re-engineering, but limitations in bioinformatic methods and exponentially increasing genomic data make large-scale mining of RiPP data difficult. We report RODEO (Rapid ORF Description and Evaluation Online), which combines hidden-Markov-model-based analysis,...

2012
Luca Baldassarre, Jean Morales, Andreas Argyriou, Massimiliano Pontil

We study a generalized framework for structured sparsity. It extends the well-known methods of Lasso and Group Lasso by incorporating additional constraints on the variables as part of a convex optimization problem. This framework provides a straightforward way of favouring prescribed sparsity patterns, such as orderings, contiguous regions and overlapping groups, among others. Available optimi...
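
As a concrete reference point for the Group Lasso building block mentioned above (not the paper's generalized framework), here is a minimal proximal-gradient sketch; the group partition `groups` and the step-size choice are illustrative:

```python
import numpy as np

def group_soft_threshold(beta, groups, t):
    """Proximal operator of t * sum_g ||beta_g||_2: shrink each group's
    sub-vector toward zero, dropping whole groups when their norm <= t."""
    out = beta.copy()
    for idx in groups:                        # groups: list of index arrays
        norm = np.linalg.norm(beta[idx])
        out[idx] = 0.0 if norm <= t else (1 - t / norm) * beta[idx]
    return out

def group_lasso(X, y, groups, lam=0.1, n_iter=500):
    """Plain proximal gradient (ISTA) for
    0.5 * ||y - X beta||^2 + lam * sum_g ||beta_g||_2."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2    # 1 / Lipschitz constant of the smooth part
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = group_soft_threshold(beta - step * grad, groups, step * lam)
    return beta
```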

2016
Pierre C. Bellec, Alexandre B. Tsybakov

This paper considers the penalized least squares estimator with arbitrary convex penalty. When the observation noise is Gaussian, we show that the prediction error is a subgaussian random variable concentrated around its median. We apply this concentration property to derive sharp oracle inequalities for the prediction error of the LASSO, the group LASSO and the SLOPE estimators, both in probab...
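
For orientation, the object being analyzed is the penalized least-squares estimator with a convex penalty h; the normalization below and the exact form of the three penalties are illustrative, not necessarily the paper's:

```latex
% Penalized least squares with an arbitrary convex penalty h.
\hat\beta \in \arg\min_{\beta \in \mathbb{R}^p}
  \Big\{ \tfrac{1}{n} \lVert y - X\beta \rVert_2^2 + 2\lambda\, h(\beta) \Big\},
  \qquad h \ \text{convex}.

% The LASSO, group LASSO and SLOPE correspond to the choices
h_{\mathrm{lasso}}(\beta) = \lVert \beta \rVert_1, \qquad
h_{\mathrm{group}}(\beta) = \sum_{g \in \mathcal{G}} \lVert \beta_g \rVert_2, \qquad
h_{\mathrm{slope}}(\beta) = \sum_{j=1}^{p} \lambda_j \lvert \beta \rvert_{(j)},
% where |beta|_(1) >= ... >= |beta|_(p) are the ordered absolute entries and,
% for SLOPE, the decreasing weights lambda_1 >= ... >= lambda_p play the role of lambda.
```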

2009
Pierre Alquier, Mohamed Hebiri

Transductive methods are useful in prediction problems when the training dataset is composed of a large number of unlabeled observations and a smaller number of labeled observations. In this paper, we propose an approach for developing transductive prediction procedures that are able to take advantage of sparsity in high-dimensional linear regression. More precisely, we define transduct...

2009
Wook Yeon Hwang, Hao Helen Zhang, Subhashis Ghosal

We propose a new class of variable selection techniques for regression in high-dimensional linear models based on forward-selection versions of the LASSO, adaptive LASSO and elastic net, called the forward iterative regression and shrinkage technique (FIRST), adaptive FIRST and elastic FIRST, respectively. These methods seem to work effectively for extremely sparse high dimensional linear m...
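
The snippet does not spell the algorithm out, so the following is only a generic greedy "forward selection plus shrinkage" loop in that spirit, not the authors' FIRST procedure:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def greedy_forward_lasso(X, y, lam, max_steps=100):
    """Generic sketch (NOT the authors' FIRST algorithm): at each step,
    compute the soft-thresholded univariate update for every coordinate,
    move only the coordinate whose update most reduces the residual sum of
    squares, and stop when no move helps.  Assumes columns of X are scaled
    to unit squared norm."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.astype(float).copy()                 # current residual y - X @ beta
    for _ in range(max_steps):
        z = X.T @ r + beta                     # univariate least-squares solutions
        cand = soft_threshold(z, lam)          # shrunken candidate values
        # RSS reduction from moving coordinate j from beta[j] to cand[j]
        gains = cand * (2 * z - cand) - beta * (2 * z - beta)
        j = int(np.argmax(gains))
        if gains[j] <= 1e-12:
            break
        r += X[:, j] * (beta[j] - cand[j])     # keep residual consistent with beta
        beta[j] = cand[j]
    return beta
```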

2012
Jinzhu Jia, Karl Rohe, Bin Yu

The performance of the Lasso is well understood under the assumptions of the standard sparse linear model with homoscedastic noise. However, in several applications, the standard model does not describe the important features of the data. This paper examines how the Lasso performs on a non-standard model that is motivated by medical imaging applications. In these applications, the variance of t...

2011
Daniela M. Witten, Jerome H. Friedman, Noah Simon

We consider the graphical lasso formulation for estimating a Gaussian graphical model in the high-dimensional setting. This approach entails estimating the inverse covariance matrix under a multivariate normal model by maximizing the ℓ1-penalized log-likelihood. We present a very simple necessary and sufficient condition that can be used to identify the connected components in the graphical lass...
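
A small sketch of how that screening condition is typically used in practice: threshold the off-diagonal of the sample covariance at the regularization level, then solve the graphical lasso block by block. The scikit-learn call below is an illustrative stand-in, not the authors' implementation:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from sklearn.covariance import graphical_lasso

def blockwise_graphical_lasso(S, alpha):
    """Screen the sample covariance S: variables i and j are connected iff
    |S_ij| > alpha (off-diagonal).  The connected components of that graph
    match those of the graphical-lasso solution, so the precision matrix
    can be estimated one block at a time."""
    p = S.shape[0]
    adj = (np.abs(S) > alpha).astype(int)
    np.fill_diagonal(adj, 0)
    n_comp, labels = connected_components(csr_matrix(adj), directed=False)

    theta = np.zeros((p, p))                     # estimated precision matrix
    for k in range(n_comp):
        idx = np.where(labels == k)[0]
        if len(idx) == 1:                        # isolated variable: diagonal entry only
            theta[idx[0], idx[0]] = 1.0 / S[idx[0], idx[0]]
        else:
            _, prec = graphical_lasso(S[np.ix_(idx, idx)], alpha=alpha)
            theta[np.ix_(idx, idx)] = prec
    return theta
```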

[Chart: number of search results per year]