Convergent Subgradient Methods for Nonsmooth Convex Minimization
Abstract
In this paper, we develop new subgradient methods for solving nonsmooth convex optimization problems. These methods are the first for which the whole sequence of test points is endowed with worst-case performance guarantees. The new methods are derived from a relaxed estimating-sequences condition, which allows reconstruction of approximate primal-dual optimal solutions. Our methods are applicable as efficient real-time stabilization tools for potential systems with infinite horizon. As an example, we consider a model of privacy-respecting taxation, where the center has no information on the utility functions of the agents. Nevertheless, we show that, by a proper taxation policy, the agents can be forced to apply, on average, the socially optimal strategies. Preliminary numerical experiments confirm the high efficiency of the new methods.
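To make the basic machinery concrete, here is a minimal sketch of a generic subgradient scheme with diminishing steps, tracking the best iterate found so far. This is an illustration only, not the paper's method; the objective `f`, the step rule `1/sqrt(k+1)`, and the point `c` are illustrative choices.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=3000):
    """Minimize a convex f via x_{k+1} = x_k - t_k g_k with t_k = 1/sqrt(k+1)."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        x = x - subgrad(x) / np.sqrt(k + 1)   # diminishing step size
        fx = f(x)
        if fx < best_f:                       # record the best iterate seen:
            best_x, best_f = x.copy(), fx     # f(x_k) itself need not decrease
    return best_x, best_f

# Example: f(x) = ||x - c||_1, whose minimizer is c = (1, -2).
c = np.array([1.0, -2.0])
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)            # a subgradient of the l1 term
x_star, f_star = subgradient_method(f, subgrad, x0=[5.0, 5.0])
```

With these diminishing steps the method is known to approach the minimum, although, as the abstract emphasizes, classical guarantees of this kind apply only to the best (or averaged) point, not to every test point.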
Similar Articles
A strongly convergent method for nonsmooth convex minimization in Hilbert spaces
In this paper we propose a strongly convergent variant of the projected subgradient method for constrained convex minimization problems in Hilbert spaces. The advantage of the proposed method is that it converges strongly whenever the problem has solutions, without additional assumptions. The method also has the following desirable property: the sequence converges to the solution of the problem whi...
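For intuition, a plain (not strongly convergent) projected subgradient step in the finite-dimensional case looks as follows; the Hilbert-space variant in the abstract adds machinery not reproduced here, and the unit-ball constraint and objective are illustrative assumptions.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius."""
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def projected_subgradient(f, subgrad, x0, steps=3000):
    """Subgradient step with diminishing step size, followed by projection."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        x = project_ball(x - subgrad(x) / np.sqrt(k + 1))
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Example: minimize f(x) = ||x - c||_1 over the unit ball, with c = (2, 0)
# outside the ball; the constrained minimizer is (1, 0) with value 1.
c = np.array([2.0, 0.0])
f = lambda x: np.abs(x - c).sum()
x_star, f_star = projected_subgradient(f, lambda x: np.sign(x - c), [0.0, 0.0])
```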
Ergodic Results in Subgradient Optimization
Subgradient methods are popular tools for nonsmooth convex minimization, especially in the context of Lagrangean relaxation; their simplicity has been a main contribution to their success. As a consequence of the nonsmoothness, it is not straightforward to monitor the progress of a subgradient method in terms of the approximate fulfillment of optimality conditions, since the subgradients used i...
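Ergodic results of this kind typically concern a step-weighted average of the iterates rather than the iterates themselves. A minimal sketch (an illustration, not the report's procedure) for the one-dimensional objective f(x) = |x|:

```python
import numpy as np

# Plain subgradient iteration on f(x) = |x| with steps t_k = 1/sqrt(k+1),
# accumulating the ergodic (step-weighted) average of the iterates.
x, wsum, xbar = 3.0, 0.0, 0.0
for k in range(2000):
    t = 1.0 / np.sqrt(k + 1)
    x -= t * np.sign(x)      # subgradient of |x| is sign(x)
    wsum += t
    xbar += t * x            # accumulate weighted sum of iterates
xbar /= wsum                 # ergodic average; near the minimizer x* = 0
```

The individual iterates oscillate around the minimizer, while the averaged sequence is the object for which ergodic convergence statements are usually proved.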
Selection Strategies in Projection Methods for Convex Minimization Problems
We propose a new projection method for nonsmooth convex minimization problems. We present a method of subgradient selection based on the so-called residual selection model, which is a generalization of the so-called obtuse cone model. We also present numerical results for some test problems and compare these results with those of some other convex nonsmooth minimization methods. The numerical re...
Minimization of Nonsmooth Convex Functionals in Banach Spaces
We develop a unified framework for convergence analysis of subgradient and subgradient projection methods for minimization of nonsmooth convex functionals in Banach spaces. The important novel features of our analysis are that we neither assume that the functional is uniformly or strongly convex, nor use regularization techniques. Moreover, no boundedness assumptions are made on the level sets o...
Empirical and Theoretical Comparisons of Several Nonsmooth Minimization Methods and Software
Most nonsmooth optimization methods may be divided into two main groups: subgradient methods and bundle methods. Usually, when new algorithms are developed and tested, the comparison is made between similar kinds of methods. In this report we test and compare both different bundle methods and different subgradient methods, as well as some methods which may be considered as hybrids of the...