Quadratic Optimization with Orthogonality Constraints: Explicit Łojasiewicz Exponent and Linear Convergence of Line-Search Methods
Authors
Abstract
A fundamental class of matrix optimization problems that arise in many areas of science and engineering is that of quadratic optimization with orthogonality constraints. Such problems can be solved using line-search methods on the Stiefel manifold, which are known to converge globally under mild conditions. To determine the convergence rates of these methods, we give an explicit estimate of the exponent in a Łojasiewicz inequality for the (non-convex) set of critical points of the aforementioned class of problems. This not only allows us to establish the linear convergence of a large class of line-search methods but also answers an important and intriguing problem in mathematical analysis and numerical optimization. A key step in our proof is to establish a local error bound for the set of critical points, which may be of independent interest.
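As a concrete illustration of the setting (not the paper's own algorithm), the following minimal numpy sketch runs one such line-search method, Riemannian gradient descent with a QR retraction and Armijo backtracking, on the model problem min f(X) = -tr(XᵀAX) over the Stiefel manifold; the function name and parameter choices are ours.

```python
import numpy as np

def stiefel_gradient_descent(A, p, steps=1000, t0=1.0, seed=0):
    """Riemannian gradient descent with Armijo backtracking for
    f(X) = -trace(X^T A X) over the Stiefel manifold St(n, p),
    i.e. n x p matrices with orthonormal columns (A symmetric).
    This is the classic quadratic objective with orthogonality
    constraints; minimizers span a dominant p-dimensional
    eigenspace of A."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))   # random feasible start
    f = lambda Y: -np.trace(Y.T @ A @ Y)
    retract = lambda Y: np.linalg.qr(Y)[0]             # QR retraction onto St(n, p)
    for _ in range(steps):
        G = -2.0 * A @ X                               # Euclidean gradient of f
        grad = G - X @ ((X.T @ G + G.T @ X) / 2.0)     # project onto tangent space at X
        g2 = np.sum(grad ** 2)
        if g2 < 1e-20:
            break                                      # (near-)critical point reached
        t = t0
        while t > 1e-12 and f(retract(X - t * grad)) > f(X) - 1e-4 * t * g2:
            t *= 0.5                                   # Armijo backtracking
        X = retract(X - t * grad)                      # step and return to the manifold
    return X
```

The Łojasiewicz-exponent estimate in the paper is what upgrades the global convergence of such iterations to a linear rate near the critical set.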
Similar Resources
An affine scaling method for optimization problems with polyhedral constraints
Recently, an affine-scaling interior-point algorithm, ASL, was developed for box-constrained optimization problems with a single linear constraint (Gonzalez-Lima et al., SIAM J. Optim. 21:361–390, 2011). This note extends the algorithm to handle more general polyhedral constraints. With a line search, the resulting algorithm ASP maintains the global and R-linear convergence properties of ASL. In a...
Linear Convergence of Proximal-Gradient Methods under the Polyak-Łojasiewicz Condition
In 1963, Polyak proposed a simple condition that is sufficient to show that gradient descent has a global linear convergence rate. This condition is a special case of the Łojasiewicz inequality proposed in the same year, and it does not require strong-convexity (or even convexity). In this work, we show that this much-older Polyak-Łojasiewicz (PL) inequality is actually weaker than the four mai...
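To see the PL phenomenon numerically, here is a small self-contained sketch (our own construction, not from the paper): gradient descent on a least-squares objective with a deliberately rank-deficient matrix, so the function is convex but not strongly convex, yet every least-squares objective satisfies the PL inequality and the optimality gap still shrinks geometrically.

```python
import numpy as np

def pl_linear_rate_demo(steps=200):
    """Gradient descent on f(x) = ||Ax - b||^2 with A rank deficient:
    f is convex but NOT strongly convex, yet it satisfies the
    Polyak-Lojasiewicz inequality  (1/2)||grad f(x)||^2 >= mu (f(x) - f*),
    which alone gives f(x_k) - f* <= (1 - mu/L)^k (f(x_0) - f*)."""
    # Small fixed matrix with a duplicated column -> rank 3, not 4.
    A = np.zeros((6, 4))
    A[0, 0] = 3.0
    A[1, 1] = 2.0
    A[2, 2] = 1.0
    A[:, 3] = A[:, 0]                    # duplicate column kills strong convexity
    b = np.ones(6)
    f = lambda x: np.sum((A @ x - b) ** 2)
    L = 2.0 * np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad f
    fstar = f(np.linalg.lstsq(A, b, rcond=None)[0])   # optimal value
    x = np.zeros(4)
    gaps = []
    for _ in range(steps):
        x = x - (1.0 / L) * 2.0 * A.T @ (A @ x - b)   # fixed step size 1/L
        gaps.append(f(x) - fstar)
    return gaps
```

Returned gaps decay geometrically even though the minimizer is non-unique; the PL constant mu here is twice the smallest nonzero squared singular value of A.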
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for computing an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
A note on the convergence of nonconvex line search
In this note, we consider the line search for a class of abstract nonconvex algorithms that have been deeply studied in the Kurdyka-Łojasiewicz theory. We provide a weak convergence result for the line search in general. When the objective function satisfies the Kurdyka-Łojasiewicz property and certain additional assumptions, a global convergence result can be derived. An application is presented for t...