Convex Generalizations of Total Variation Based on the Structure Tensor with Applications to Inverse Problems
Authors
Abstract
We introduce a generic convex energy functional that is suitable for both grayscale and vector-valued images. Our functional is based on the eigenvalues of the structure tensor; it therefore penalizes image variation at every point by taking into account information from its neighborhood. It generalizes several existing variational penalties, such as Total Variation and vectorial extensions of it. By introducing the concept of a patch-based Jacobian operator, we derive an equivalent formulation of the proposed regularizer based on the Schatten norm of this operator. Using this new formulation, we prove convexity and develop a dual definition of the proposed energy, which gives rise to an efficient and parallelizable minimization algorithm. Moreover, we establish a connection between the minimization of the proposed convex regularizer and a generic type of nonlinear anisotropic diffusion driven by a spatially regularized and adaptive diffusion tensor. Finally, we perform extensive experiments on image denoising and deblurring for grayscale and color images. The results show the effectiveness of the proposed approach as well as its improved performance compared to Total Variation and existing vectorial extensions of it.
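To make the construction concrete, the following sketch spells out, in notation assumed here rather than quoted from the paper, the quantities the abstract refers to: the Jacobian of a possibly vector-valued image $u:\Omega\rightarrow\mathbb{R}^{M}$, the structure tensor obtained by smoothing $(Ju)^{T}Ju$ with a nonnegative kernel $K$, and a penalty on the eigenvalues $\lambda_{1}(\mathbf{x})\ge\lambda_{2}(\mathbf{x})\ge 0$ of that tensor:
\[
  S_{K}u(\mathbf{x}) = K * \big[(Ju)^{T}Ju\big](\mathbf{x}),
  \qquad
  E_{p}(u) = \int_{\Omega}\big\|\big(\sqrt{\lambda_{1}(\mathbf{x})},\,\sqrt{\lambda_{2}(\mathbf{x})}\big)\big\|_{p}\,d\mathbf{x}.
\]
For a grayscale image ($M=1$) with $K$ a Dirac delta and $p=2$, the eigenvalues reduce to $|\nabla u|^{2}$ and $0$, so the penalty recovers the classical Total Variation; a general smoothing kernel $K$ makes the penalty at each point depend on its neighborhood, as described above.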
Similar Resources
Structure Tensor Total Variation
We introduce a novel generic energy functional that we employ to solve inverse imaging problems within a variational framework. The proposed regularization family, termed structure tensor total variation (STV), penalizes the eigenvalues of the structure tensor and is suitable for both grayscale and vector-valued images. It generalizes several existing variational penalties, including the tot...
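Purely as an illustrative companion to this description, the short Python sketch below evaluates a structure-tensor eigenvalue penalty of this kind on a grayscale or color image. The function name stv_penalty and the parameters sigma and p are placeholders assumed here; this is not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def stv_penalty(img, sigma=1.5, p=1):
    """Sketch of a structure-tensor eigenvalue penalty (assumed discretization).

    img   : H x W (grayscale) or H x W x M (vector-valued) float array.
    sigma : standard deviation of the Gaussian kernel K smoothing the tensor.
    p     : exponent of the l_p norm applied to the eigenvalue square roots.
    """
    img = np.atleast_3d(img).astype(float)
    # Jacobian via forward differences: gy[..., m] = d u_m / dy, gx[..., m] = d u_m / dx.
    gy = np.diff(img, axis=0, append=img[-1:, :, :])
    gx = np.diff(img, axis=1, append=img[:, -1:, :])

    # Entries of (Ju)^T Ju summed over channels, then smoothed by the kernel K.
    Jxx = gaussian_filter((gx * gx).sum(axis=2), sigma)
    Jyy = gaussian_filter((gy * gy).sum(axis=2), sigma)
    Jxy = gaussian_filter((gx * gy).sum(axis=2), sigma)

    # Closed-form eigenvalues of the 2x2 symmetric structure tensor at each pixel.
    trace = Jxx + Jyy
    delta = np.sqrt(np.maximum((Jxx - Jyy) ** 2 + 4.0 * Jxy ** 2, 0.0))
    lam1 = 0.5 * (trace + delta)
    lam2 = np.maximum(0.5 * (trace - delta), 0.0)

    # Pointwise l_p norm of (sqrt(lam1), sqrt(lam2)), summed over the image domain.
    roots = np.stack([np.sqrt(lam1), np.sqrt(lam2)], axis=-1)
    return np.sum(np.linalg.norm(roots, ord=p, axis=-1))
```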
Supplementary material for the SSVM-2013 paper entitled "Convex Generalizations of Total Variation based on the Structure Tensor with Applications to Inverse Problems"
1 Proof of Proposition 1. Let $T(\mathbf{x}) = R_\theta\,\mathbf{x}$ denote the rotation of the image coordinates $\mathbf{x}$, with $R_\theta$ being the rotation matrix. Applying the chain rule to the Jacobian matrix, we have that
\[
  J\{u \circ T\}(\mathbf{x}) = Ju(T(\mathbf{x}))\,R_\theta, \tag{1}
\]
where $\circ$ denotes the composition of functions. Now, we write the structure tensor as
\[
  S_K\{u \circ T\}(\mathbf{x})
  = R_\theta^{T}\,\underbrace{K * \big[\big(Ju(T(\mathbf{x}))\big)^{T} Ju(T(\mathbf{x}))\big]}_{(h \,\circ\, T)(\mathbf{x})}\,R_\theta .
\]
Since the conv...
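The excerpt breaks off mid-sentence; purely as a hedged sketch of how such arguments typically conclude (and not a quotation from the supplementary material), if the smoothing kernel $K$ is rotationally symmetric, then convolution with $K$ commutes with the rotation of coordinates, which gives
\[
  S_{K}\{u \circ T\}(\mathbf{x}) = R_{\theta}^{T}\,\big(S_{K}u\big)(T(\mathbf{x}))\,R_{\theta}.
\]
Since similar matrices share their eigenvalues, any penalty defined solely through the eigenvalues of $S_{K}$ is then invariant to rotations of the image domain.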
An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems
Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...
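To give a flavor of the dynamics that such one-layer recurrent models discretize, here is a generic Python sketch of a projected-subgradient flow for a small nonsmooth convex problem over a box constraint. The dynamics, the function names, and the example problem are assumptions made for illustration only and are not the specific network proposed in the cited paper.

```python
import numpy as np

def projected_subgradient_flow(subgrad, project, x0, step=1e-2, n_steps=5000):
    """Euler discretization of the projection-type dynamics x' = P(x - g(x)) - x.

    Generic illustration of a one-layer recurrent dynamic for constrained
    nonsmooth minimization; not the specific network of the cited paper.
    subgrad : returns one subgradient of the objective at x.
    project : Euclidean projection onto the convex feasible set.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x + step * (project(x - subgrad(x)) - x)
    return x

# Example: minimize the nonsmooth f(x) = ||x - c||_1 subject to 0 <= x <= 1.
c = np.array([1.4, -0.3, 0.6])
subgrad = lambda x: np.sign(x - c)        # a subgradient of the l1 objective
project = lambda x: np.clip(x, 0.0, 1.0)  # projection onto the box [0, 1]^3
x_star = projected_subgradient_flow(subgrad, project, x0=np.zeros(3))
print(x_star)  # expected to settle near (1.0, 0.0, 0.6), up to the step size
```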
Orthogonal metric space and convex contractions
In this paper, generalized convex contractions on orthogonal metric spaces are established in what might be called their definitive versions. Also, we give examples which show that our main theorems are genuine generalizations of Theorems 3.1 and 3.2 of [M.A. Miandaragh, M. Postolache and S. Rezapour, Approximate fixed points of generalized convex contractions, Fixed Poi...
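For readers unfamiliar with the terminology, the following recalls the standard notions as they are commonly stated in this literature; the precise hypotheses used in the cited paper may differ. An orthogonal set is a nonempty set $X$ with a binary relation $\perp$ for which there exists $x_{0}\in X$ such that $y\perp x_{0}$ for all $y\in X$, or $x_{0}\perp y$ for all $y\in X$; an orthogonal metric space is such a set equipped with a metric $d$. A convex contraction (of order two) is a self-map $T$ of $(X,d)$ for which there exist $a,b\ge 0$ with $a+b<1$ such that
\[
  d\big(T^{2}x,\,T^{2}y\big) \le a\,d\big(Tx,\,Ty\big) + b\,d(x,y) \qquad \text{for all } x,y\in X.
\]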
SIZE AND GEOMETRY OPTIMIZATION OF TRUSS STRUCTURES USING THE COMBINATION OF DNA COMPUTING ALGORITHM AND GENERALIZED CONVEX APPROXIMATION METHOD
In recent years, the optimization of truss structures has received attention due to their wide range of applications, their simple structure, and their rapid analysis. The DNA computing algorithm is a non-gradient-based method derived from numerical modeling of the performance of DNA-based computing on new computers with DNA memory, known as molecular computers. The DNA computing algorithm works based on collective intelli...