Search results for: L1-norm
Number of results: 28
We propose a sparse signal reconstruction algorithm from interlaced samples with unknown offset parameters based on the l1-norm minimization principle. A typical application of this problem is super-resolution from multiple low-resolution images. The algorithm first minimizes the l1-norm of a vector that satisfies the data constraint with the offset parameters fixed. Second, the minimum value is furthe...
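As a minimal sketch of the l1-norm minimization principle this abstract refers to (not the paper's interlaced-sampling algorithm, and without the offset-estimation step), the following recovers a sparse vector from underdetermined measurements by iterative soft-thresholding; the matrix `A`, measurements `y`, and parameters are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam=0.05, n_iter=1000):
    """Minimize 0.5*||A x - y||_2^2 + lam*||x||_1 by iterative soft-thresholding (ISTA)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # step size <= 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - step * A.T @ (A @ x - y), step * lam)
    return x

# Toy example: recover a sparse vector from 40 measurements of a length-100 signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[[5, 17, 60]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat = ista(A, y)
print(np.nonzero(np.abs(x_hat) > 0.1)[0])   # indices of the largest recovered entries
```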
Compared with the standard L2-norm support vector machine (SVM), the L1-norm SVM enjoys the nice property of simultaneously performing classification and feature selection. In this paper, we investigate the statistical performance of the L1-norm SVM in ultra-high dimension, where the number of features p grows at an exponential rate of the sample size n. Different from existing theory for SVM whic...
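A small illustration of the simultaneous classification and feature selection property, using scikit-learn's `LinearSVC` with an l1 penalty; the synthetic data and parameters below are assumptions for demonstration, not the setting analyzed in the paper.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Synthetic high-dimensional data: only the first 5 of 500 features are informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 500))
y = np.sign(X[:, :5].sum(axis=1))

# L1-penalized linear SVM: the l1 penalty drives most coefficients exactly to zero,
# so the fitted model selects features while it classifies.
clf = LinearSVC(penalty="l1", loss="squared_hinge", dual=False, C=0.1, max_iter=10000)
clf.fit(X, y)
selected = np.nonzero(clf.coef_.ravel())[0]
print(f"{selected.size} features kept out of {X.shape[1]}")
```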
A new image coding technique based on an L1-norm criterion and exploiting statistical properties of the reconstruction error is investigated. The original image is preprocessed, quantized, encoded, and reconstructed within a given confidence interval. Two important classes of preprocessing, namely linear prediction and iterated filterbanks, are used. The approach is also shown to be compatible w...
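To make the "reconstructed within a given confidence interval" idea concrete, here is a hedged toy version of bounded-error predictive coding (not the paper's codec; the entropy-coding stage and the filterbank variant are omitted): a first-order predictor with a uniform residual quantizer of step 2*delta+1 keeps every reconstructed integer sample within ±delta of the original.

```python
import numpy as np

def dpcm_encode_decode(x, delta=2):
    """First-order predictive coding with a uniform quantizer of step 2*delta + 1.
    For integer-valued input, guarantees |x[i] - recon[i]| <= delta for every sample."""
    step = 2 * delta + 1
    recon = np.empty_like(x, dtype=float)
    symbols = []
    prev = 0.0                               # predictor state: previous *reconstructed* sample
    for i, xi in enumerate(x):
        residual = xi - prev                 # prediction error
        q = int(np.round(residual / step))   # quantizer index, passed to the entropy coder
        symbols.append(q)
        recon[i] = prev + q * step           # the decoder mirrors exactly these steps
        prev = recon[i]
    return symbols, recon

x = np.array([100, 102, 104, 103, 90, 91, 95], dtype=float)
symbols, recon = dpcm_encode_decode(x, delta=2)
print(np.max(np.abs(x - recon)))             # never exceeds delta = 2
```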
Introduction Compressed sensing (CS) has been shown to provide accurate reconstructions from highly undersampled data for certain types of MR acquisitions [1, 2]. This offers the promise of faster MR acquisitions, and further speed gains are possible when CS is used in conjunction with parallel acquisition schemes such as SENSE [3]. Several approaches have been recently proposed to reconstruct ...
This paper proposes distributed algorithms for multi-agent networks to achieve, in finite time, a solution to a linear equation Ax = b where A has full row rank, with the minimum l1-norm in the underdetermined case (where A has more columns than rows). The underlying network is assumed to be undirected and fixed, and an analytical proof is provided for the proposed algorithm to drive all agen...
In this paper we define, for each aspherical orientable 3-manifold M endowed with a torus splitting T, a 2-dimensional fundamental l1-class [M] whose l1-norm has properties similar to the Gromov simplicial volume of M (additivity under torus splittings and isometry under finite covering maps). Next, we use the Gromov simplicial volume of M and the l1-norm of [M] to give a complete characteriz...
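For context, the l1-(semi)norm underlying the Gromov simplicial volume is the standard one on singular homology (this is the textbook definition, not the paper's 2-dimensional class construction):

```latex
\[
  \|\alpha\|_1 \;=\; \inf\Bigl\{\, \sum_i |a_i| \;:\; c = \sum_i a_i \sigma_i
  \ \text{is a real singular cycle with}\ [c] = \alpha \,\Bigr\},
  \qquad
  \|M\| \;=\; \bigl\|[M]_{\mathbb{R}}\bigr\|_1 ,
\]
```

where the infimum runs over all real chains representing the class and [M] denotes the fundamental class of M.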
This paper addresses the problem of semantic parsing, by which natural language sentences are translated into a form which conveys their underlying meaning. Semantic parsing involves a parameter estimation process, which is a convex optimization problem. The optimization formulation of previous approaches often requires a huge amount of time to converge due to the high-dimensional feature space. ...
We study the recovery of sparse signals from underdetermined linear measurements when a potentially erroneous support estimate is available. Our results are twofold. First, we derive necessary and sufficient conditions for signal recovery from compressively sampled measurements using weighted l1-norm minimization. These conditions, which depend on the choice of weights as well as the size and ac...
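A minimal sketch of weighted l1-norm minimization with a (possibly inaccurate) support estimate: entries believed to lie in the support get a smaller weight. It uses cvxpy for brevity; the matrix, weights, and support estimate are illustrative assumptions, not the recovery conditions derived in the paper.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, m, k = 100, 40, 5
A = rng.standard_normal((m, n))
x_true = np.zeros(n); x_true[:k] = rng.standard_normal(k)
y = A @ x_true

# Support estimate: partly correct (three true indices), partly wrong (two spurious ones).
T = [0, 1, 2, 50, 51]
w = np.ones(n)
w[T] = 0.3                # smaller weight where the signal is believed to be supported

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(cp.multiply(w, x))), [A @ x == y])
prob.solve()
print(np.linalg.norm(x.value - x_true))   # recovery error of the weighted l1 solution
```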
The problem of computing sparse (mostly zero) solutions to underdetermined linear systems of equations has received much attention recently, due to its applications to compressed sensing. Under mild assumptions, the sparsest solution has minimum L1-norm and can be computed using linear programming. In some applications (valid deconvolution, singular linear transformations), the linear system is...
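The linear-programming reformulation mentioned above can be sketched as follows (an illustration under assumed data, not the paper's setting): writing x = u - v with u, v >= 0 turns min ||x||_1 subject to Ax = b into an LP.

```python
import numpy as np
from scipy.optimize import linprog

def min_l1_solution(A, b):
    """Solve min ||x||_1 subject to A x = b via the LP split x = u - v, u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                        # objective: sum(u) + sum(v) = ||x||_1 at optimum
    A_eq = np.hstack([A, -A])                 # A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 80))
x_true = np.zeros(80); x_true[[3, 10, 42]] = [1.0, -1.5, 2.0]
b = A @ x_true
x_hat = min_l1_solution(A, b)
print(np.round(x_hat[[3, 10, 42]], 3))        # nonzero entries of the recovered solution
```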
We propose a novel model reduction approach for the approximation of nonlinear hyperbolic equations in the scalar and the system cases. The approach relies on an offline computation of a dictionary of solutions together with an online L1-norm minimization of the residual. It is shown why this is a natural framework for hyperbolic problems and tested on nonlinear problems such as Burgers’ equati...
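A hedged toy version of the online step only: given a dictionary of precomputed solution snapshots (columns of `D`), find coefficients that minimize the l1 norm of the residual ||D c - b||_1, here via iteratively reweighted least squares rather than whatever solver the paper uses; the snapshots and right-hand side are made up.

```python
import numpy as np

def l1_residual_fit(D, b, n_iter=50, eps=1e-8):
    """Minimize ||D c - b||_1 over coefficients c by iteratively reweighted least squares:
    each iteration solves a weighted least-squares problem with weights 1 / |residual|."""
    c = np.linalg.lstsq(D, b, rcond=None)[0]
    for _ in range(n_iter):
        r = D @ c - b
        w = 1.0 / np.sqrt(np.abs(r) + eps)    # square root of the IRLS weights, applied per row
        c = np.linalg.lstsq(w[:, None] * D, w * b, rcond=None)[0]
    return c

# Toy "dictionary" of three snapshots; the new data is close to a combination of the
# first two but carries a few large outliers, which the l1 fit largely ignores.
rng = np.random.default_rng(3)
D = rng.standard_normal((200, 3))
b = 2.0 * D[:, 0] - 0.5 * D[:, 1]
b[:5] += 10.0
print(np.round(l1_residual_fit(D, b), 3))     # close to [2.0, -0.5, 0.0]
```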