Seminorm-Induced Oblique Projections for Sparse Nonlinear Convex Feasibility Problems
Author
Abstract
Simultaneous subgradient projection algorithms for the convex feasibility problem use subgradient calculations and sometimes converge even in the inconsistent case. We devise an algorithm that uses seminorm-induced oblique projections onto super half-spaces of the convex sets, which is advantageous when the subgradient-Jacobian is a sparse matrix at many iteration points of the algorithm. Using generalized seminorm-induced oblique projections onto hyperplanes defined by subgradients at each iterative step allows component-wise diagonal weighting, which has been shown to be useful for early acceleration in the sparse linear case. Convergence for the consistent case with underrelaxation is established.
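As a rough illustration of such a scheme, the following minimal Python sketch implements a generic simultaneous subgradient projection iteration for the convex feasibility problem (find x with f_i(x) <= 0 for all i). The function name, the weights, the relaxation parameter and the Euclidean metric used here are illustrative assumptions; the paper's algorithm replaces the Euclidean projection by a seminorm-induced oblique one.

import numpy as np

def simultaneous_subgradient_projection(x0, f_list, grad_list, weights=None,
                                        lam=0.5, n_iter=200, eps=1e-12):
    # Sketch of a simultaneous subgradient projection method for the convex
    # feasibility problem: find x with f_i(x) <= 0 for all i.
    # Each violated constraint contributes the (Euclidean) projection of x
    # onto the half-space {z : f_i(x) + <g_i, z - x> <= 0}, g_i a subgradient;
    # the contributions are averaged with weights w_i and underrelaxed by lam.
    m = len(f_list)
    if weights is None:
        weights = np.full(m, 1.0 / m)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        step = np.zeros_like(x)
        for w, f, grad in zip(weights, f_list, grad_list):
            viol = max(f(x), 0.0)              # positive part of the violation
            if viol > 0.0:
                g = np.asarray(grad(x), dtype=float)
                step += w * (viol / max(g @ g, eps)) * g
        if not step.any():                     # all constraints satisfied
            break
        x = x - lam * step                     # underrelaxed simultaneous step
    return x

# Example: find a point satisfying 1 - x[0] <= 0 and 2 - x[1] <= 0.
f_list = [lambda x: 1.0 - x[0], lambda x: 2.0 - x[1]]
grad_list = [lambda x: np.array([-1.0, 0.0]), lambda x: np.array([0.0, -1.0])]
print(simultaneous_subgradient_projection(np.zeros(2), f_list, grad_list))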
Similar resources
Iterative Algorithms with Seminorm-induced Oblique Projections
A definition of oblique projections onto closed convex sets that use seminorms induced by diagonal matrices which may have zeros on the diagonal is introduced. Existence and uniqueness of such projections are secured via directional affinity of the sets with respect to the diagonal matrices involved. A block-iterative algorithmic scheme for solving the convex feasibility problem, employing semin...
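For orientation (in generic notation, not necessarily that of the cited work), a diagonal matrix D with nonnegative entries induces the seminorm and oblique projection

\[ \|y\|_D = \sqrt{\langle D y, y \rangle}, \qquad P_Q^D(x) \in \operatorname*{argmin}_{z \in Q} \|z - x\|_D, \]

where Q is a closed convex set; when some diagonal entries of D vanish, \(\|\cdot\|_D\) is only a seminorm, and existence and uniqueness of the minimizer are then guaranteed by the directional affinity condition mentioned above.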
On the string averaging method for sparse common fixed-point problems
We study the common fixed point problem for the class of directed operators. This class is important because many commonly used nonlinear operators in convex optimization belong to it. We propose a definition of sparseness of a family of operators and investigate a string-averaging algorithmic scheme that favorably handles the common fixed point problem when the family of operators is sparse. ...
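A minimal sketch of one generic string-averaging step is given below (the operators, strings and weights are hypothetical names for illustration): along each string the operators are applied sequentially, and the string endpoints are then combined with convex weights.

import numpy as np

def string_averaging_step(x, operators, strings, weights):
    # One step of a generic string-averaging scheme: sweep sequentially
    # through the operators listed in each string, then average the
    # resulting string endpoints with convex weights.
    endpoints = []
    for string in strings:
        y = x
        for i in string:                # sequential sweep along the string
            y = operators[i](y)
        endpoints.append(y)
    return sum(w * e for w, e in zip(weights, endpoints))

# Example with two projections onto the half-spaces x[0] >= 1 and x[1] >= 2.
ops = [lambda v: np.array([max(v[0], 1.0), v[1]]),
       lambda v: np.array([v[0], max(v[1], 2.0)])]
print(string_averaging_step(np.zeros(2), ops, strings=[[0, 1], [1]],
                            weights=[0.5, 0.5]))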
Sparse-View CT Reconstruction Based on Nonconvex L1 − L2 Regularizations
The reconstruction from sparse-view projections is one of the important problems in computed tomography (CT), limited by the availability or feasibility of obtaining a large number of projections. Traditionally, convex regularizers have been exploited to improve the reconstruction quality in sparse-view CT, and the convex constraint in those problems leads to an easy optimization process. However...
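A commonly used form of such a nonconvex L1 − L2 regularized reconstruction model (a sketch in generic notation; the cited paper's exact formulation may differ) is

\[ \min_{x} \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \left( \|x\|_1 - \|x\|_2 \right), \]

where A is the system (projection) matrix, b the measured sparse-view data, and \(\lambda > 0\) controls the regularization strength; the difference \(\|x\|_1 - \|x\|_2\) is nonconvex but promotes sparsity more strongly than \(\|x\|_1\) alone.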
An efficient modified neural network for solving nonlinear programming problems with hybrid constraints
This paper presents optimization techniques for solving convex programming problems with hybrid constraints. According to the saddle point theorem, optimization theory, convex analysis theory, Lyapunov stability theory and the LaSalle invariance principle, a neural network model is constructed. The equilibrium point of the proposed model is proved to be equivalent to the optima...
Linear convergence of the Randomized Sparse Kaczmarz Method
The randomized version of the Kaczmarz method for the solution of linear systems is known to converge linearly in expectation. In this work we extend this result and show that the recently proposed Randomized Sparse Kaczmarz method for recovery of sparse solutions, as well as many variants, also converges linearly in expectation. The result is achieved in the framework of split feasibility prob...
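For reference, a minimal Python sketch of the randomized sparse Kaczmarz iteration in its commonly stated form (the row-sampling probabilities and the shrinkage parameter lam are illustrative assumptions): a Kaczmarz step on an auxiliary variable z is followed by soft thresholding to produce the sparse iterate x.

import numpy as np

def randomized_sparse_kaczmarz(A, b, lam=0.1, n_iter=5000, seed=0):
    # Sketch of the randomized sparse Kaczmarz method: rows are sampled with
    # probability proportional to their squared Euclidean norm; each step
    # updates the auxiliary variable z and shrinks it to obtain x.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum('ij,ij->i', A, A)
    probs = row_norms_sq / row_norms_sq.sum()
    soft = lambda v: np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
    z = np.zeros(n)
    x = np.zeros(n)
    for _ in range(n_iter):
        i = rng.choice(m, p=probs)
        z = z - ((A[i] @ x - b[i]) / row_norms_sq[i]) * A[i]   # Kaczmarz step
        x = soft(z)                                            # shrinkage step
    return x

# Example: recover a sparse solution of a small consistent system.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[2, 7, 11]] = [1.0, -2.0, 0.5]
print(randomized_sparse_kaczmarz(A, A @ x_true, lam=0.1))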