A J-symmetric quasi-Newton method for minimax problems

Authors

Abstract

Minimax problems have recently gained tremendous attention across the optimization and machine learning communities. In this paper, we introduce a new quasi-Newton method for minimax problems, which we call the J-symmetric method. The method is obtained by exploiting the J-symmetric structure of the second-order derivative of the objective function in minimax problems. We show that the Hessian estimate (as well as its inverse) can be updated by a rank-2 operation, and it turns out that the update rule is a natural generalization of the classic Powell symmetric Broyden method from minimization problems to minimax problems. In theory, we show that our proposed algorithm enjoys local Q-superlinear convergence to a desirable solution under standard regularity conditions. Furthermore, we introduce a trust-region variant of the algorithm that enjoys global R-superlinear convergence. Finally, we present numerical experiments that verify our theory and show the effectiveness of the proposed algorithms compared to Broyden's method and the extragradient method on three classes of minimax problems.
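The abstract states that the paper's update rule generalizes the classic Powell symmetric Broyden (PSB) update. For context, here is a minimal NumPy sketch of the classic PSB rank-2 update for minimization (not the paper's J-symmetric rule itself); the function name `psb_update` and variable names are illustrative:

```python
import numpy as np

def psb_update(B, s, y):
    """Classic Powell symmetric Broyden (PSB) rank-2 update.

    B: current symmetric Hessian estimate
    s: step difference x_{k+1} - x_k
    y: gradient difference g_{k+1} - g_k
    Returns a symmetric estimate satisfying the secant condition
    B_new @ s == y via a rank-2 correction of B.
    """
    r = y - B @ s                      # residual of the secant equation
    ss = s @ s                         # squared norm of the step
    # Symmetric rank-2 correction: symmetrized outer product of r and s,
    # minus a rank-1 term that restores the secant condition exactly.
    B_new = (B
             + (np.outer(r, s) + np.outer(s, r)) / ss
             - ((s @ r) / ss**2) * np.outer(s, s))
    return B_new
```

One can check directly that `B_new @ s` equals `y` and that symmetry of `B` is preserved, which are the two defining properties of the PSB update.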


Similar articles

A truncated aggregate smoothing Newton method for minimax problems

The aggregate function is a useful smoothing approximation of the max-function of smooth functions and has been used to solve minimax problems, linear and nonlinear programming, generalized complementarity problems, etc. The aggregate function is a single smooth but complicated function, and its gradient and Hessian calculations are time-consuming. In order to gain more efficient performance of aggregat...


A Quasi-Newton Penalty Barrier Method for Convex Minimization Problems

We describe an infeasible interior point algorithm for convex minimization problems. The method uses quasi-Newton techniques for approximating the second derivatives and providing superlinear convergence. We propose a new feasibility control of the iterates by introducing shift variables and by penalizing them in the barrier problem. We prove global convergence under standard conditions on the ...


A Regularized Smoothing Newton Method for Symmetric Cone Complementarity Problems

This paper extends the regularized smoothing Newton method in vector optimization to symmetric cone optimization, which provides a unified framework for dealing with the nonlinear complementarity problem, the second-order cone complementarity problem, and the semidefinite complementarity problem (SCCP). In particular, we study strong semismoothness and Jacobian nonsingularity of the total natura...


A Nonmonotone Filter Method for Minimax Problems

In this paper, we propose a modified trust-region filter algorithm for minimax problems, which is based on the framework of the SQP-filter method combined with a nonmonotone technique. We use the SQP subproblem to acquire a trial step, and use the filter to weigh the effect of the trial step so as to avoid using a penalty function. The algorithm uses the Lagrange function a...


Smoothing Newton and Quasi-Newton Methods for Mixed Complementarity Problems

The mixed complementarity problem can be reformulated as a nonsmooth equation by using the median operator. In this paper, we first study some useful properties of this reformulation and then derive the Chen-Harker-Kanzow-Smale smoothing function for the mixed complementarity problem. On the basis of this smoothing function, we present a smoothing Newton method for solving the mixed complementari...



Journal

Journal title: Mathematical Programming

Year: 2023

ISSN: 0025-5610, 1436-4646

DOI: https://doi.org/10.1007/s10107-023-01957-1