Inexact Accelerated Proximal Gradient Algorithms For Matrix l2,1-Norm Minimization Problem in Multi-Task Feature Learning
Authors
Abstract
In this paper, we extend the implementable accelerated proximal gradient (APG) method to solve the matrix l2,1-norm minimization problem arising in multi-task feature learning. We show that the resulting inner subproblem has a closed-form solution that can be easily obtained by exploiting the problem's favorable structure. Under suitable conditions, we establish a comprehensive convergence result for the proposed method. Furthermore, we present three different inexact APG algorithms that use, respectively, the Lipschitz constant, the eigenvalues of the Hessian matrix, and the Barzilai-Borwein parameter in the inexact model. Numerical experiments on simulated data and a real data set are reported to show the efficiency of the proposed method.
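The closed-form inner subproblem mentioned above can be illustrated with the well-known proximal operator of the matrix l2,1-norm, which reduces to row-wise group soft-thresholding. The following is a minimal sketch (not the authors' code; the function name `prox_l21` and the threshold `lam` are illustrative):

```python
import numpy as np

def prox_l21(Y, lam):
    """Proximal operator of lam * ||.||_{2,1}:
    argmin_W 0.5 * ||W - Y||_F^2 + lam * ||W||_{2,1}.
    Each row of Y is shrunk toward zero by its l2 norm
    (group soft-thresholding), giving the closed-form solution."""
    norms = np.linalg.norm(Y, axis=1, keepdims=True)      # per-row l2 norms
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return scale * Y
```

Rows whose l2 norm is below the threshold `lam` are set exactly to zero, which is what induces row-sparsity (shared feature selection) across tasks.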
Similar works
An implementable proximal point algorithmic framework for nuclear norm minimization
The nuclear norm minimization problem is to find a matrix with the minimum nuclear norm subject to linear and second-order cone constraints. Such a problem often arises from the convex relaxation of a rank minimization problem with noisy data, and appears in many fields of engineering and science. In this paper, we study inexact proximal point algorithms in the primal, dual and primal-dual forms...
Feature Learning for Multitask Learning Using l2,1 Norm Minimization
We look at solving the task of Multitask Feature Learning by way of feature selection. We find that formulating it as an l2,1 norm minimization problem helps with both sparsity and uniformity, which are required for Multitask Feature Learning. Two approaches are explored, one which formulates an equivalent convex optimization problem and iteratively solves it, and another which formulates two e...
l2,1 Regularized correntropy for robust feature selection
In this paper, we study the problem of robust feature extraction based on l2,1-regularized correntropy from both theoretical and algorithmic perspectives. In the theoretical part, we point out that an l2,1-norm minimization can be justified from the viewpoint of half-quadratic (HQ) optimization, which facilitates convergence study and algorithmic development. In particular, a general formulation is accordi...
Feature Selection at the Discrete Limit
Feature selection plays an important role in many machine learning and data mining applications. In this paper, we propose to use the L2,p norm for feature selection with emphasis on small p. As p → 0, feature selection becomes a discrete feature selection problem. We provide two algorithms, a proximal gradient algorithm and a rank-one update algorithm; the latter is more efficient at large regularization λ. W...
Probabilistic Multi-Label Classification with Sparse Feature Learning
Multi-label classification is a critical problem in many areas of data analysis, such as image labeling and text categorization. In this paper we propose a probabilistic multi-label classification model based on novel sparse feature learning. By employing an individual sparsity-inducing l1-norm and a group sparsity-inducing l2,1-norm, the proposed model has the capacity of capturing both label i...