Search results for: modified subgradient method
Number of results: 1,831,354
We present an extension to the subgradient algorithm to produce primal as well as dual solutions. It can be seen as a fast way to carry out an approximation of Dantzig-Wolfe decomposition. This gives a fast method for producing approximate solutions to large-scale linear programs. It is based on a new theorem in linear programming duality. We present successful experience with linear programs coming ...
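For orientation, here is a minimal Python sketch of projected subgradient ascent on the Lagrangian dual of a box-constrained LP, with ergodic averaging of the inner minimizers as an approximate primal solution. The averaging only stands in for the primal-recovery idea in spirit; the abstract's method rests on a specific duality theorem not reproduced here, and the function name and problem data are illustrative.

```python
import numpy as np

def dual_subgradient_with_primal_averaging(c, A, b, lo, hi, steps=500):
    """Projected subgradient ascent on the Lagrangian dual of
        min c.x  s.t.  A x = b,  lo <= x <= hi,
    with ergodic averaging of the inner minimizers as an approximate primal
    solution.  A generic textbook scheme, not the paper's theorem-based method."""
    m, n = A.shape
    y = np.zeros(m)                  # dual multipliers for A x = b
    x_avg = np.zeros(n)
    for k in range(1, steps + 1):
        # Inner problem: minimize (c - A^T y).x over the box, solved coordinatewise.
        r = c - A.T @ y
        x = np.where(r >= 0, lo, hi)
        # The dual function g(y) = min_x c.x + y.(b - A x) has subgradient b - A x at y.
        subgrad = b - A @ x
        y = y + (1.0 / k) * subgrad  # diminishing step size (ascent on the concave dual)
        x_avg += (x - x_avg) / k     # running (ergodic) average of inner minimizers
    return x_avg, y
```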
An algorithm for solving the convex feasibility problem for a finite family of convex sets is considered. The acceleration scheme of De Pierro (Methodos de projeção para a resolução de sistemas gerais de equações algébricas lineares. Thesis (tese de Doutoramento), Instituto de Matemática da UFRJ, Cidade Universitária, Rio de Janeiro, Brasil, 1981), which is designed for simultaneous algorithms, ...
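As background, the following sketch shows the basic simultaneous (Cimmino-type) projection scheme for a convex feasibility problem. The `relaxation` factor only gestures at the role of an acceleration parameter and is not De Pierro's specific scheme; the projectors and the example data are illustrative.

```python
import numpy as np

def simultaneous_projections(projectors, x0, weights=None, relaxation=1.0, iters=200):
    """Cimmino-type simultaneous projection method for the convex feasibility
    problem: find x in the intersection of sets C_1, ..., C_m.
    `projectors` is a list of functions, each returning the projection onto one C_i.
    `relaxation` > 1 acts as a generic (over-)relaxation factor."""
    m = len(projectors)
    w = np.full(m, 1.0 / m) if weights is None else np.asarray(weights)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Move along the weighted average of all projection displacements at once.
        step = sum(wi * (P(x) - x) for wi, P in zip(w, projectors))
        x = x + relaxation * step
    return x

# Example: intersection of the halfspace {x : a.x <= b} with a ball of radius 2.
a, b = np.array([1.0, 1.0]), 1.0
proj_halfspace = lambda x: x - max(0.0, a @ x - b) / (a @ a) * a
proj_ball = lambda x: x if np.linalg.norm(x) <= 2 else 2 * x / np.linalg.norm(x)
x_feas = simultaneous_projections([proj_halfspace, proj_ball], np.array([5.0, 5.0]))
```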
In this paper, we develop a version of the bundle method to locally solve unconstrained difference of convex (DC) programming problems. It is assumed that a DC representation of the objective function is available. Our main idea is to use subgradients of both the first and second components in the DC representation. This subgradient information is gathered from some neighborhood of the current ...
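As a point of reference for using subgradients of both DC components, here is a sketch of the classical DCA iteration for f = f1 - f2, which linearizes f2 with one of its subgradients at each step and minimizes the resulting convex majorant. It is not the bundle variant described in the abstract, and the toy function below is illustrative.

```python
import numpy as np

def dca(argmin_linearized_f1, subgrad_f2, x0, iters=50):
    """Plain DCA iteration for f = f1 - f2 with both components convex:
    linearize f2 at the current point with one of its subgradients g2 and
    minimize the convex majorant f1(x) - <g2, x>."""
    x = x0
    for _ in range(iters):
        g2 = subgrad_f2(x)               # subgradient of the second DC component
        x = argmin_linearized_f1(g2)     # argmin_x f1(x) - <g2, x>
    return x

# Toy DC function f(x) = x**2 - |x| with f1(x) = x**2, f2(x) = |x|.
# Here argmin_x x**2 - g2*x = g2 / 2, and sign(x) is a subgradient of |x|.
x_star = dca(lambda g2: g2 / 2.0, np.sign, x0=0.9)
print(x_star)   # converges to the critical point 0.5
```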
In this paper, we investigate how to minimize the distortion in the reconstruction of correlated sources. We consider a communication scenario where a sensor node is capable of harvesting energy from the environment and where the Fusion Center (FC), in order to exploit correlation, uses past observations as side information for decoding. We provide a convex formulation of the problem and derive...
The notion of a secant for locally Lipschitz continuous functions is introduced and a new algorithm to locally minimize nonsmooth, nonconvex functions based on secants is developed. We demonstrate that the secants can be used to design an algorithm to find descent directions of locally Lipschitz continuous functions. This algorithm is applied to design a minimization method, called a secant met...
Many practical optimization problems involve nonsmooth (that is, not necessarily differentiable) functions of hundreds or thousands of variables with various constraints. In this paper, we describe a new efficient adaptive limited memory interior point bundle method for large, possibly nonconvex, nonsmooth inequality-constrained optimization. The method is a hybrid of the nonsmooth variable met...
We consider a class of large scale robust optimization problems. While the robust optimization literature often relies on structural assumptions to reformulate the problem in a tractable form using duality, this method is not always applicable and can result in problems which are very large. We propose an alternative way of solving such problems by applying a constrained bundle method. The orig...
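The oracle-based handling of robust constraints can be sketched with a plain Kelley cutting-plane loop on a worst-case function F(x) = max_u c(x, u): the oracle solves the inner maximization and returns the value together with a subgradient taken from a maximizing scenario. A bundle method would add a stabilizing proximal term, omitted here; the scenarios, box, and names below are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def cutting_plane_robust(worst_case_oracle, x0, box, iters=30):
    """Kelley-style cutting-plane minimization of a convex worst-case
    function F(x) = max_u c(x, u) over a box.  The oracle returns F(x)
    and a subgradient at x obtained from a maximizing scenario."""
    n = len(x0)
    lo, hi = box
    cuts = []                                   # list of (F(x_k), g_k, x_k)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        val, g = worst_case_oracle(x)
        cuts.append((val, g, x.copy()))
        # LP over (x, t): minimize t  s.t.  t >= val_k + g_k.(x - x_k) for all cuts.
        c = np.concatenate([np.zeros(n), [1.0]])
        A_ub = np.array([np.concatenate([g_k, [-1.0]]) for _, g_k, _ in cuts])
        b_ub = np.array([g_k @ x_k - val_k for val_k, g_k, x_k in cuts])
        bounds = [(lo, hi)] * n + [(None, None)]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        x = res.x[:n]
    return x

# Worst case over two scenarios of a linear cost: F(x) = max(a1.x, a2.x).
a1, a2 = np.array([1.0, -2.0]), np.array([-1.5, 0.5])
def oracle(x):
    vals = [a1 @ x, a2 @ x]
    k = int(np.argmax(vals))
    return vals[k], (a1 if k == 0 else a2)

x_rob = cutting_plane_robust(oracle, x0=np.array([1.0, 1.0]), box=(-1.0, 1.0))
```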
Semidefinite relaxations of quadratic 0-1 programming or graph partitioning problems are well known to be of high quality. However, solving them by primal-dual interior point methods can take much time even for problems of moderate size. The recent spectral bundle method of Helmberg and Rendl can solve large structured equality-constrained semidefinite programs quite efficiently if the trace of the ...
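For context, the eigenvalue form exploited by the spectral bundle method can also be attacked with a plain subgradient step, using the fact that v v^T (with v a leading unit eigenvector) is a subgradient of the maximum eigenvalue. The sketch below is that basic scheme only, not Helmberg and Rendl's bundle model, and the operators A_i and data are illustrative.

```python
import numpy as np

def eigenvalue_subgradient_descent(C, A_ops, b, iters=200):
    """Plain subgradient descent on the eigenvalue form of a constant-trace SDP dual,
        f(y) = lambda_max(C - sum_i y_i A_i) + b.y,
    with C and all A_i symmetric.  A subgradient of lambda_max at S is v v^T for a
    leading unit eigenvector v, so a subgradient of f at y has components
    b_i - v^T A_i v."""
    y = np.zeros(len(A_ops))
    for k in range(1, iters + 1):
        S = C - sum(yi * Ai for yi, Ai in zip(y, A_ops))
        eigvals, eigvecs = np.linalg.eigh(S)     # ascending eigenvalues
        v = eigvecs[:, -1]                       # eigenvector of the largest eigenvalue
        g = np.array([bi - v @ Ai @ v for bi, Ai in zip(b, A_ops)])
        y = y - (1.0 / k) * g                    # diminishing step size
    return y
```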
Many challenging problems in automatic control may be cast as optimization programs subject to matrix inequality constraints. Here we investigate an approach which converts such problems into non-convex eigenvalue optimization programs and makes them amenable to non-smooth analysis techniques like bundle or cutting plane methods. We prove global convergence of a first-order bundle method for pr...