A Simplex Algorithm - Gradient Projection Method for Nonlinear Programming

Author

  • L. Duane Pyle
Abstract

Witzgall [7], commenting on the gradient projection methods of R. Frisch and J. B. Rosen, states: "More or less all algorithms for solving the linear programming problem are known to be modifications of an algorithm for matrix inversion. Thus the simplex method corresponds to the Gauss-Jordan method. The methods of Frisch and Rosen are based on an interesting method for inverting symmetric matrices. However, this method is not a happy one, considered from the numerical point of view, and this seems to account for the relative instability of the projection methods."

This paper presents an implementation of the gradient projection method which uses a variation of the simplex algorithm. The underlying (well-known) geometric idea is that the simplex algorithm for linear programming [1] provides a method for obtaining vectors along the "edges" [4] of the feasible region $A = \{x \mid Ax = b,\, x \ge 0\}$ which lie in certain null spaces. This property is discussed in detail in §1., Geometric Analysis of the Simplex Method of Linear Programming.

In §2., Projection on Faces of A of Higher Dimension, the geometric analysis of §1. is extended to obtain the orthogonal projection matrix $P$ such that

$$\mathcal{R}(P) = N(A) \cap \bigcap_{i=s+1}^{n} M(u^{(i)})^{\perp},$$

where $\mathcal{R}(P)$ is the range of $P$; $N(A)$ is the null space of $A$; and $M(u^{(i)})^{\perp} = \{x \mid x_i = 0\}$. The gradient projection method [6], [2] requires computations involving (1) an orthogonal projection matrix whose range is a certain null space; and (2) a related generalized inverse [3].

In §3., Simplex Algorithm Implementation of the Gradient Projection Method, the developments given in §2. are combined with the simplex algorithm to provide the computational results required by the gradient projection method. Motivation for this approach may be found in [5]. In the approach given here, a representation of the projection matrix $P = I - N_r N_r^{+}$ is generated using the simplex algorithm, whereas Rosen gives a method for obtaining $N_r N_r^{+}$ based on an algorithm involving $(N_r^{T} N_r)^{-1}$. ($N_r$ is a matrix whose columns are normals to the "active" constraints.) If the dimension of $\mathcal{R}(I - N_r N_r^{+})$ is small compared to the dimension of $\mathcal{R}(N_r N_r^{+})$, as is the case when the current iterate lies on a face of $A$ of low dimension, one would expect significant computational improvements. This expectation is further enhanced by the use of a variation of the product form of the inverse in computing the vectors which constitute the representation of the matrix $P$, and by the use of "simplex multipliers" and "relative cost factors" in the standard fashion of simplex algorithm technology.
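The paper's contribution is to generate a representation of $P = I - N_r N_r^{+}$ via the simplex algorithm and a product-form-of-the-inverse variation, which pays off when the iterate lies on a low-dimensional face. As a minimal numerical sketch of what this projector computes (not of the paper's simplex-based procedure), one can form $P$ directly with a dense pseudoinverse; the matrices A and N_r and the gradient below are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Feasible region as in the abstract: {x | Ax = b, x >= 0}.
A = np.array([[1.0, 1.0, 1.0]])       # one equality constraint
grad = np.array([3.0, -1.0, 2.0])     # gradient of the objective at the iterate

# Columns of N_r are normals to the active constraints: the row of A,
# plus the unit vector u^(3) for the active bound x_3 = 0
# (the abstract's M(u^(i))-perp = {x | x_i = 0}).
N_r = np.column_stack([A[0], [0.0, 0.0, 1.0]])

# P = I - N_r N_r^+ is the orthogonal projector onto N(A) ∩ {x | x_3 = 0}.
P = np.eye(3) - N_r @ np.linalg.pinv(N_r)

d = -P @ grad    # projected steepest-descent direction
print(d)         # [-2.  2.  0.]
assert np.allclose(A @ d, 0.0) and abs(d[2]) < 1e-12
```

Forming the pseudoinverse densely, as above, is precisely what the abstract argues against when $\mathcal{R}(N_r N_r^{+})$ is large; the simplex-based representation works with the smaller complementary range $\mathcal{R}(I - N_r N_r^{+})$ instead.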


Similar articles

A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations

The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems due to its low storage requirement and simplicity of implementation. Research activities on its application to higher-dimensional systems of nonlinear equations are just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...


On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions

Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...


A New Method to Find All Alternative Extreme Optimal Points for Linear Programming Problem

The problem of linear programming (LP) is one of the earliest formulated problems in mathematical programming, where a linear function has to be maximized (minimized) over a convex constraint polyhedron X. The simplex algorithm was suggested early on for solving this problem by moving toward a solution on the exterior of the constraint polyhedron X. In 1984, the area of linear programming underwent...


A conjugate Rosen’s gradient projection method with global line search for piecewise linear optimization∗

The Kelley cutting plane method is one of the methods commonly used to optimize the dual function in the Lagrangian relaxation scheme. Usually the Kelley cutting plane method uses the simplex method as its optimization engine. It is well known that the simplex method leaves the current vertex, follows an ascending edge, and stops at the nearest vertex. What would happen if one were to continue the...


Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms

This paper treats the problem of minimizing a general continuously differentiable function subject to sparsity constraints. We present and analyze several different optimality criteria which are based on the notions of stationarity and coordinate-wise optimality. These conditions are then used to derive three numerical algorithms aimed at finding points satisfying the resulting optimality crite...




Publication date: 2013