Search results for: l1 norm

Number of results: 74,840

2011
Sander Wozniak, Tobias Gerlach, Günter Schäfer

Localizing nodes without GPS, based on a small fraction of anchor nodes that are aware of their own positions, is considered an important service for applications in wireless ad hoc networks. With an adversary trying to mislead nodes about their estimated locations, several approaches aiming to defeat attackers by means of robustness instead of cryptographic measures have been p...

Journal: Neurocomputing, 2012
András Lörincz, Zsolt Palotai, Gábor Szirtes

Sparse coding algorithms aim to find a linear basis in which signals can be represented by a small number of active (non-zero) coefficients. Such coding has many applications in science and engineering and is believed to play an important role in neural information processing. However, due to the computational complexity of the task, only approximate solutions provide the required efficie...
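A minimal sketch of one standard approximate approach, assuming the lasso formulation of sparse coding with a fixed, known dictionary D; the dictionary, step size, and penalty lam below are illustrative and not taken from the paper:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code_ista(D, y, lam=0.1, n_iter=200):
    """ISTA for  min_c 0.5 * ||y - D c||_2^2 + lam * ||c||_1
    with a fixed dictionary D (columns = basis atoms)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth part
    c = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ c - y)           # gradient of the quadratic term
        c = soft_threshold(c - grad / L, lam / L)
    return c
```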

Journal: Journal of the Earth and Space Physics, 2018

In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...
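A minimal model-space IRLS sketch, assuming the generic L1-stabilized least-squares form min_m ||G m - d||^2 + mu * ||m||_1; the matrices, mu, and eps are placeholders, and the Golub-Kahan bidiagonalization projection described in the abstract is omitted:

```python
import numpy as np

def irls_l1(G, d, mu=1.0, eps=1e-8, n_iter=20):
    """Iteratively reweighted least squares for an L1-norm model stabilizer:
    each non-smooth |m_j| term is replaced by m_j^2 / (|m_j^(k)| + eps)
    at iteration k, giving a sequence of linear solves."""
    m = np.zeros(G.shape[1])
    for _ in range(n_iter):
        w = 1.0 / (np.abs(m) + eps)              # reweighting of the L1 term
        A = G.T @ G + mu * np.diag(w)
        m = np.linalg.solve(A, G.T @ d)
    return m
```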

Journal: CoRR, 2017
Nicholas Tsagkarakis, Panos P. Markopoulos, Dimitris A. Pados

L1-norm Principal-Component Analysis (L1-PCA) of real-valued data has attracted significant research interest over the past decade. However, L1-PCA of complex-valued data remains to date unexplored despite the many possible applications (e.g., in communication systems). In this work, we establish theoretical and algorithmic foundations of L1-PCA of complex-valued data matrices. Specifically, we...
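For context, a minimal sketch of exact single-component L1-PCA for real-valued data; the complex-valued case the paper addresses is not covered here. It relies on the known equivalence between maximizing ||X^T w||_1 over unit vectors w and an exhaustive search over sign vectors, so it is only feasible for small sample counts n:

```python
import itertools
import numpy as np

def l1_pca_1d(X):
    """X: real d x n data matrix (columns = samples).
    Returns the unit vector w maximizing ||X^T w||_1, using
    w* proportional to X b* with b* = argmax over b in {-1,+1}^n of ||X b||_2."""
    n = X.shape[1]
    best_val, best_b = -np.inf, None
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        b = np.asarray(signs)
        val = np.linalg.norm(X @ b)
        if val > best_val:
            best_val, best_b = val, b
    w = X @ best_b
    return w / np.linalg.norm(w)
```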

Journal: Operations Research, 2001
Ravindra K. Ahuja, James B. Orlin

In this paper, we study inverse optimization problems defined as follows: Let S denote the set of feasible solutions of an optimization problem P, let c be a specified cost vector, and x0 be a given feasible solution. The solution x0 may or may not be an optimal solution of P with respect to the cost vector c. The inverse optimization problem is to perturb the cost vector c to d so that x0 is a...
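As a hedged illustration only (the paper treats inverse versions of combinatorial and network problems, not this generic setup), the L1-norm inverse problem for a standard-form LP, min d^T x s.t. A x >= b, x >= 0, can itself be written as a linear program by fixing x0 and imposing LP optimality conditions; the cvxpy formulation and tolerance tol below are assumptions:

```python
import numpy as np
import cvxpy as cp

def inverse_lp_l1(A, b, c, x0, tol=1e-9):
    """Find d minimizing ||d - c||_1 such that the given feasible x0 is
    optimal for  min d^T x  s.t.  A x >= b, x >= 0."""
    m, n = A.shape
    d = cp.Variable(n)
    y = cp.Variable(m, nonneg=True)                  # dual multipliers
    slack = A @ x0 - b
    cons = [A.T @ y <= d]                            # dual feasibility
    # complementary slackness, linear in (d, y) because x0 is fixed
    cons += [y[i] == 0 for i in range(m) if slack[i] > tol]
    cons += [(A.T @ y)[j] == d[j] for j in range(n) if x0[j] > tol]
    prob = cp.Problem(cp.Minimize(cp.norm(d - c, 1)), cons)
    prob.solve()
    return d.value
```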

Journal: NeuroImage, 2012
Srikanth Ryali, Tianwen Chen, Kaustubh Supekar, Vinod Menon

Characterizing interactions between multiple brain regions is important for understanding brain function. Functional connectivity measures based on partial correlation provide an estimate of the linear conditional dependence between brain regions after removing the linear influence of other regions. Estimation of partial correlations is, however, difficult when the number of regions is large, a...
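A hedged sketch of one common way to obtain sparse partial-correlation estimates when the number of regions is large, via an L1 (graphical-lasso) penalty on the precision matrix; scikit-learn's GraphicalLasso is used for illustration, and the region count, sample count, and alpha are made-up values rather than the paper's method or data:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))     # e.g. 200 time points x 30 brain regions

model = GraphicalLasso(alpha=0.05).fit(X)
P = model.precision_                   # sparse estimate of the inverse covariance

# partial correlation between regions i and j:  -P_ij / sqrt(P_ii * P_jj)
d = np.sqrt(np.diag(P))
partial_corr = -P / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)
```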

2013
Jicheng Meng, Xiaolong Zheng

In this paper we extensively investigate robust sparse two-dimensional principal component analysis (RS2DPCA), which makes the best of semantic and structural information and suppresses outliers. RS2DPCA combines the advantages of sparsity, the 2D data format, and the L1-norm for data analysis. We also prove that RS2DPCA can offer a good solution for seeking sparse 2D principal components. To verify the pe...
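For reference, a minimal sketch of plain (L2, non-sparse) 2DPCA, the baseline that RS2DPCA builds on; the robust and sparse ingredients of RS2DPCA itself are not reproduced here, and the shapes and k are illustrative:

```python
import numpy as np

def two_d_pca(images, k):
    """Plain 2DPCA.  images: array of shape (N, h, w); returns the top-k
    right projection directions W (w x k) and the projected features."""
    mean = images.mean(axis=0)
    centered = images - mean
    # image scatter matrix G = (1/N) * sum_i (A_i - mean)^T (A_i - mean)
    G = np.einsum('nhw,nhv->wv', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)      # eigenvalues in ascending order
    W = eigvecs[:, ::-1][:, :k]               # keep the top-k eigenvectors
    features = centered @ W                   # shape (N, h, k)
    return W, features
```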

Journal: Comp. Opt. and Appl., 2015
Roland Herzog, Johannes Obermeier, Gerd Wachsmuth

In this paper we consider optimal control problems in which a certain L1-type norm of the control appears in the objective. Problems of this type are of interest for at least two reasons. Firstly, the L1 norm of the control is often a natural measure of the control cost. Secondly, this term promotes sparsely supported optimal controls, i.e., controls which are zero on substantial parts of their d...
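A minimal discretized illustration of the sparsity-promoting effect of an L1 control cost, assuming toy scalar linear dynamics, horizon, and weights that are not taken from the paper (cvxpy solves the resulting convex program):

```python
import numpy as np
import cvxpy as cp

T, beta = 50, 0.5                     # horizon and L1 cost weight (illustrative)
x = cp.Variable(T + 1)                # state trajectory
u = cp.Variable(T)                    # control trajectory

constraints = [x[0] == 0]
constraints += [x[t + 1] == 0.95 * x[t] + 0.1 * u[t] for t in range(T)]

tracking = cp.sum_squares(x - 1.0)    # track the target state 1
objective = cp.Minimize(tracking + beta * cp.norm(u, 1))
cp.Problem(objective, constraints).solve()

# the L1 term drives many controls exactly to zero (sparse support)
print("nonzero controls:", int(np.sum(np.abs(u.value) > 1e-6)), "of", T)
```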

2005
Simon Blanchard, Gilles Caporossi, Pierre Hansen

L1-norm discrimination consists in finding the hyperplane that minimizes the sum of L1-norm distances between the hyperplane and the points that lie on its wrong side. This problem becomes difficult for datasets containing more than 100,000 points. Since only a few points are needed to obtain the optimal hyperplane, we propose a point selection algorithm which iteratively adds the necessa...
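A hedged sketch of the underlying optimization, not the paper's point-selection scheme: with the normalization ||w||_inf = 1 handled by fixing one coefficient of w to +/-1, minimizing the summed violations of misclassified points becomes a small family of linear programs. The cvxpy formulation and the brute-force loop over coordinates below are assumptions:

```python
import numpy as np
import cvxpy as cp

def l1_discriminate(X, y):
    """X: (n, d) points, y: labels in {-1, +1}.  Returns (w, b) for the
    hyperplane w.x + b = 0 minimizing the summed violations."""
    n, d = X.shape
    best = (np.inf, None, None)
    for j in range(d):
        for s in (1.0, -1.0):
            w, b = cp.Variable(d), cp.Variable()
            e = cp.Variable(n, nonneg=True)                # per-point violations
            cons = [e >= -cp.multiply(y, X @ w + b),       # e_i >= -y_i (w.x_i + b)
                    cp.norm(w, "inf") <= 1, w[j] == s]     # ||w||_inf = 1
            prob = cp.Problem(cp.Minimize(cp.sum(e)), cons)
            prob.solve()
            if prob.value is not None and prob.value < best[0]:
                best = (prob.value, w.value, b.value)
    return best[1], best[2]
```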

2012
Panqu Wang

In this project, we implement a robust face recognition system via sparse representation and convex optimization. We treat each test sample as a sparse linear combination of training samples and obtain the sparse solution via L1-minimization. We also explore group sparseness (L2-norm) as well as normal L1-norm regularization. We discuss the role of feature extraction and classification robustnes...
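A minimal sketch of the sparse-representation classification (SRC) pipeline described above, assuming a column-stacked training matrix, a noise tolerance eps, and cvxpy for the L1-minimization; these are illustrative choices, not the project's exact code:

```python
import numpy as np
import cvxpy as cp

def src_classify(A, labels, y, eps=0.05):
    """A: (m, n) matrix whose columns are training samples, labels: (n,) class
    ids, y: (m,) test sample.  Returns the class with the smallest residual."""
    c = cp.Variable(A.shape[1])
    cp.Problem(cp.Minimize(cp.norm(c, 1)),             # sparsest representation
               [cp.norm(A @ c - y, 2) <= eps]).solve()
    coeffs = c.value
    residuals = {}
    for cls in np.unique(labels):
        mask = labels == cls
        residuals[cls] = np.linalg.norm(y - A[:, mask] @ coeffs[mask])
    return min(residuals, key=residuals.get)
```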
