Search results for: sparsity constraints
Number of results: 194849
Consider the n-dimensional vector y = Xβ + ε, where β has only k nonzero entries and ε ∈ R^n is Gaussian noise. This can be viewed as a linear system with sparsity constraints, corrupted by noise. We find a non-asymptotic upper bound on the probability that the optimal decoder for β declares a wrong sparsity pattern, given any generic perturbation matrix X. In the case when X is randomly dr...
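As a purely illustrative reading of this setup, the sketch below simulates y = Xβ + ε with a k-sparse β and runs an exhaustive least-squares search over all size-k supports, which plays the role of the optimal decoder for small problems; the dimensions, noise level, and Gaussian X are assumptions, not values from the paper.

```python
# Illustrative simulation of the noisy sparse linear model y = X beta + eps.
# The exhaustive least-squares search over size-k supports acts as the
# maximum-likelihood ("optimal") decoder; sizes and noise level are assumptions.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, p, k, sigma = 20, 8, 2, 0.1

X = rng.standard_normal((n, p))            # generic measurement matrix
beta = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta[support] = rng.standard_normal(k)     # k nonzero entries
y = X @ beta + sigma * rng.standard_normal(n)

def ml_decode(y, X, k):
    """Return the size-k support with the smallest least-squares residual."""
    best, best_res = None, np.inf
    for S in itertools.combinations(range(X.shape[1]), k):
        cols = list(S)
        coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        res = np.linalg.norm(y - X[:, cols] @ coef)
        if res < best_res:
            best, best_res = cols, res
    return best

print("true support:   ", sorted(support.tolist()))
print("decoded support:", sorted(ml_decode(y, X, k)))
```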
In this paper, we propose a novel sparse learning based feature selection method that directly optimizes the sparsity of a large margin linear classification model with a norm constraint, subject to data-fitting constraints, rather than using sparsity as a regularization term. To solve the direct sparsity optimization problem, which is non-smooth and non-convex, we provide an efficient iter...
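A generic way to make the "sparsity as a constraint rather than a regularizer" idea concrete is projected gradient descent with a keep-the-k-largest-weights projection (iterative hard thresholding). The sketch below is such a stand-in, not the paper's algorithm; the logistic loss, k, and the learning rate are assumptions.

```python
# Hypothetical sketch of sparsity-constrained feature selection: projected
# gradient descent on a logistic loss, keeping only the k largest weights at
# each step (iterative hard thresholding). Not the paper's algorithm.
import numpy as np

def sparse_linear_classifier(X, y, k, lr=0.1, iters=500):
    """Fit w for labels y in {-1, +1} under the hard constraint ||w||_0 <= k."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margins = y * (X @ w)
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n   # logistic-loss gradient
        w -= lr * grad
        keep = np.argsort(np.abs(w))[-k:]                   # k largest weights survive
        mask = np.zeros(d, dtype=bool)
        mask[keep] = True
        w[~mask] = 0.0                                      # hard-threshold the rest
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 30))
true_w = np.zeros(30)
true_w[:3] = [2.0, -1.5, 1.0]
y = np.sign(X @ true_w + 0.1 * rng.standard_normal(200))
w = sparse_linear_classifier(X, y, k=3)
print("selected features:", np.flatnonzero(w))
```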
Learning latent representations plays a pivotal role in machine learning and many application areas. Previous work on the relational topic model (RTM) has shown promise in learning latent topical representations for describing relational document networks and predicting pairwise links. However, under a probabilistic formulation with normalization constraints, RTM could be ineffective in con...
The adaptive BDDC method is extended to the selection of face constraints in three dimensions. A new implementation of the BDDC method is presented based on a global formulation without an explicit coarse problem, with massive parallelism provided by a multifrontal solver. Constraints are implemented by a projection, and the sparsity of the projected operator is preserved by a generalized change of ...
In this paper we consider the problem of joint enhancement of multichannel Synthetic Aperture Radar (SAR) data. Previous work by Cetin and Karl introduced nonquadratic regularization methods for image enhancement using sparsity-enforcing penalty terms. For multichannel data, independent enhancement of each channel is shown to degrade the relative phase information across channels that is useful...
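For orientation, a minimal single-channel, real-valued form of a sparsity-enforcing penalty is the l1-regularized objective minimized with ISTA in the sketch below; this is a generic stand-in, not the multichannel SAR method, and the operator A, regularization weight, and step size are assumptions.

```python
# Generic sparsity-enforcing reconstruction: minimize 0.5*||y - A x||^2 + lam*||x||_1
# with ISTA (iterative soft thresholding). Stand-in for the penalty idea only;
# A, lam, and the step size are assumptions.
import numpy as np

def ista(A, y, lam=0.1, iters=300):
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * (A.T @ (A @ x - y))          # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((60, 128))
x_true = np.zeros(128)
x_true[rng.choice(128, 5, replace=False)] = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(60)
x_hat = ista(A, y)
print("largest recovered entries:", np.sort(np.argsort(np.abs(x_hat))[-5:]))
```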
The hierarchical non-negative matrix factorization (HNMF) is a multilayer generative network for decomposing strictly positive data into strictly positive activations and base vectors in a hierarchical manner. However, the standard hierarchical NMF is not suited for overcomplete representations and does not code efficiently for transformations in the input data. Therefore we extend the standard...
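A rough two-layer sketch of hierarchical non-negative factorization is shown below: standard NMF is applied to the data and then again to the first layer's activations. This layering scheme, the layer sizes, and the use of scikit-learn's NMF are illustrative assumptions, not the HNMF model of the paper.

```python
# Two-layer non-negative factorization: factorize the data, then factorize the
# first layer's activations again. Layer sizes and the use of scikit-learn's
# NMF are illustrative assumptions.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
X = np.abs(rng.standard_normal((100, 40)))   # strictly positive data

layer1 = NMF(n_components=10, init="random", random_state=0, max_iter=500)
H1 = layer1.fit_transform(X)                 # first-layer activations  (100 x 10)
W1 = layer1.components_                      # first-layer base vectors (10 x 40)

layer2 = NMF(n_components=4, init="random", random_state=0, max_iter=500)
H2 = layer2.fit_transform(H1)                # second-layer activations (100 x 4)
W2 = layer2.components_                      # second-layer bases       (4 x 10)

# Two-layer approximation: X ~ H2 @ W2 @ W1
err = np.linalg.norm(X - H2 @ W2 @ W1) / np.linalg.norm(X)
print("relative reconstruction error:", round(err, 3))
```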
We study the problem of learning a sparse linear regression vector under additional conditions on the structure of its sparsity pattern. We present a family of convex penalty functions, which encode this prior knowledge by means of a set of constraints on the absolute values of the regression coefficients. This family subsumes the l1 norm and is flexible enough to include different models of sp...
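The most familiar structured member of such a penalty family is the group lasso, where the l1 norm corresponds to singleton groups. The sketch below fits it by proximal gradient descent; the grouping, regularization weight, and solver are assumptions, not the construction used in the paper.

```python
# Group lasso by proximal gradient: 0.5*||y - X b||^2 + lam * sum_g ||b_g||_2.
# Singleton groups recover the plain l1 penalty. Grouping, lam, and the solver
# are illustrative assumptions.
import numpy as np

def group_lasso(X, y, groups, lam=2.0, iters=500):
    step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        z = b - step * (X.T @ (X @ b - y))           # gradient step on the data term
        for g in groups:                             # block soft-thresholding (prox)
            nrm = np.linalg.norm(z[g])
            z[g] = 0.0 if nrm <= step * lam else (1 - step * lam / nrm) * z[g]
        b = z
    return b

rng = np.random.default_rng(4)
X = rng.standard_normal((80, 12))
b_true = np.zeros(12)
b_true[3:6] = [1.5, -2.0, 1.0]                       # one active group
y = X @ b_true + 0.05 * rng.standard_normal(80)
groups = [list(range(i, i + 3)) for i in range(0, 12, 3)]
b_hat = group_lasso(X, y, groups)
print("nonzero pattern:", np.flatnonzero(np.abs(b_hat) > 1e-8))
```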
Image reconstruction based on sparse constraints is an important research topic in compressed sensing. Sparsity adaptive matching pursuit (SAMP) is a greedy algorithm that reconstructs signals without prior knowledge of the sparsity level and potentially offers better performance than other algorithms. However, SAMP still suffers from being sensitive to step size selection at high sub-sampli...
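A compact, simplified sketch of the SAMP idea is given below: a stage-wise greedy pursuit that grows a trial support in increments of a fixed step size instead of requiring the sparsity level up front. The halting rule and parameters are simplified assumptions, and the step-size sensitivity mentioned above corresponds to the `step` argument.

```python
# Simplified SAMP-style recovery: stage-wise greedy pursuit whose trial support
# grows in increments of `step`, so the sparsity level is never given explicitly.
# Halting rule and parameters are simplified assumptions.
import numpy as np

def samp(A, y, step=2, tol=1e-6, max_iter=100):
    n = A.shape[1]
    support = np.array([], dtype=int)
    L = step                                          # current stage size
    residual = y.copy()
    for _ in range(max_iter):
        # Preliminary test: merge in the L columns most correlated with the residual.
        candidates = np.union1d(support, np.argsort(np.abs(A.T @ residual))[-L:])
        coef, *_ = np.linalg.lstsq(A[:, candidates], y, rcond=None)
        # Final test: keep the L candidates with the largest least-squares magnitude.
        trial = candidates[np.argsort(np.abs(coef))[-L:]]
        coef_t, *_ = np.linalg.lstsq(A[:, trial], y, rcond=None)
        new_residual = y - A[:, trial] @ coef_t
        if np.linalg.norm(new_residual) >= np.linalg.norm(residual):
            L += step                                 # no improvement: move to the next stage
        else:
            support, residual = trial, new_residual
        if np.linalg.norm(residual) < tol:
            break
    x = np.zeros(n)
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x[support] = coef
    return x

rng = np.random.default_rng(5)
A = rng.standard_normal((40, 100))
A /= np.linalg.norm(A, axis=0)
x_true = np.zeros(100)
x_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
x_hat = samp(A, A @ x_true)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-6))
```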
We study optimal control problems in which controls with certain sparsity patterns are preferred. For time-dependent problems the approach can be used to find locations for control devices that allow controlling the system in an optimal way over the entire time interval. The approach uses a nondifferentiable cost functional to implement the sparsity requirements; additionally, bound constraints...
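On a toy discrete-time system, the sparsity-plus-bounds idea can be sketched as a tracking cost plus an l1 term on the controls with a box constraint, solved by proximal gradient as below. The dynamics, weights, horizon, and solver are illustrative assumptions, and the sparsity here is over time steps rather than over control device locations as in the paper.

```python
# Toy sparse optimal control: tracking cost + alpha*||u||_1 with |u_k| <= u_max,
# solved by proximal gradient on the lifted (single-shooting) formulation.
# Dynamics, weights, and horizon are illustrative assumptions.
import numpy as np

N, alpha, u_max = 30, 0.05, 1.0                    # horizon, l1 weight, control bound
A = np.array([[1.0, 0.1], [0.0, 1.0]])             # double-integrator-like dynamics
B = np.array([[0.0], [0.1]])
x0, x_target = np.array([0.0, 0.0]), np.array([1.0, 0.0])

# Lift the dynamics: x_N = A^N x0 + G u, one column of G per control time step.
G = np.hstack([np.linalg.matrix_power(A, N - 1 - k) @ B for k in range(N)])
c = np.linalg.matrix_power(A, N) @ x0

u = np.zeros(N)
step = 1.0 / np.linalg.norm(G, 2) ** 2
for _ in range(2000):
    grad = G.T @ (c + G @ u - x_target)            # gradient of the tracking cost
    z = u - step * grad
    z = np.sign(z) * np.maximum(np.abs(z) - step * alpha, 0.0)   # prox of the l1 term
    u = np.clip(z, -u_max, u_max)                  # enforce the bound constraints

print("active control steps:", np.flatnonzero(np.abs(u) > 1e-8))
print("final state:", np.round(c + G @ u, 3))
```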
Most existing approaches address the multi-view subspace clustering problem by constructing the affinity matrix on each view separately and afterwards propose how to extend the spectral clustering algorithm to handle multi-view data. This paper presents an approach to multi-view subspace clustering that learns a joint subspace representation by constructing an affinity matrix shared among all views. Relyi...
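For contrast, the sketch below shows a naive multi-view baseline: per-view RBF affinities are simply averaged into one shared affinity and fed to spectral clustering. Averaging is a crude stand-in for the learned joint subspace representation; the synthetic views, RBF bandwidth, and cluster count are assumptions.

```python
# Naive multi-view baseline: average per-view RBF affinities into one shared
# affinity, then run spectral clustering on it. A crude stand-in for a learned
# joint representation; synthetic views and parameters are assumptions.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(7)
labels = np.repeat([0, 1, 2], 50)                   # 3 clusters, 50 points each
views = [rng.standard_normal((3, 5))[labels] * 4.0 + rng.standard_normal((150, 5))
         for _ in range(2)]                         # two noisy views of the same grouping

affinity = np.mean([rbf_kernel(V, gamma=0.1) for V in views], axis=0)
pred = SpectralClustering(n_clusters=3, affinity="precomputed",
                          random_state=0).fit_predict(affinity)
print("cluster sizes:", np.bincount(pred))
```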