Learning Scale Free Networks by Reweighted ℓ1 Regularization
Abstract
Methods of ℓ1-type regularization have been widely used in Gaussian graphical model selection tasks to encourage sparse structures. Often, however, we would like to include more structural information than mere sparsity. In this work, we focus on learning so-called "scale-free" models, a common feature of many real-world networks. We replace the ℓ1 regularization with a power law regularization and optimize the objective function via a sequence of iteratively reweighted ℓ1 regularization problems, in which the regularization coefficients of high-degree nodes are reduced, encouraging the emergence of hubs. Our method can easily be adapted to improve any existing ℓ1-based method, such as graphical lasso, neighborhood selection, and JSRM, when the underlying networks are believed to be scale free or to have dominating hubs. We demonstrate in simulation that our method significantly outperforms a baseline ℓ1 method at learning scale-free networks and hub networks, and also illustrate its behavior on gene expression data.
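The reweighting scheme described above can be illustrated with a small sketch. This is not the authors' code: it uses neighborhood selection (per-node lasso regressions) rather than the full graphical lasso, and the specific weight rule `1 / (degree + eps)` and all variable names are assumptions chosen for illustration. The key idea it demonstrates is the same: after each fit, the ℓ1 penalty on edges touching high-degree nodes is lowered, so hubs become cheaper to grow.

```python
# Illustrative sketch (assumed details, not the authors' implementation):
# reweighted l1 neighborhood selection for Gaussian graphical models.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))  # placeholder data; replace with real samples

def neighborhood_select(X, node_weights, lam=0.1, tol=1e-3):
    """Regress each variable on the rest with weighted l1 penalties.

    The penalty on edge (i, j) uses node_weights[j]; a weighted lasso is
    emulated by rescaling columns before an ordinary Lasso fit."""
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for i in range(p):
        others = [j for j in range(p) if j != i]
        w = node_weights[others]
        Xs = X[:, others] / w          # column rescaling = per-coef weights
        fit = Lasso(alpha=lam, fit_intercept=False).fit(Xs, X[:, i])
        beta = fit.coef_ / w           # undo the rescaling
        adj[i, others] = np.abs(beta) > tol
    return adj | adj.T                 # symmetrize by OR

# Reweighting loop: weights ~ 1 / (degree + eps), so edges incident to
# current hubs receive a smaller penalty in the next round.
eps = 1.0
weights = np.ones(p)
for _ in range(5):
    adj = neighborhood_select(X, weights)
    degree = adj.sum(axis=0)
    weights = 1.0 / (degree + eps)
```

The column-rescaling trick works because penalizing w_j·|β_j| with a unit-weight lasso on X_j / w_j is equivalent to a per-coefficient weighted penalty; the same reweighting loop could wrap a graphical-lasso solver instead.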