Incremental Without Replacement Sampling in Nonconvex Optimization

Authors

Abstract

Minibatch decomposition methods for empirical risk minimization are commonly analyzed in a stochastic approximation setting, also known as sampling with replacement. On the other hand, modern implementations of such techniques are incremental: they rely on sampling without replacement, for which available analysis is much scarcer. We provide convergence guarantees for the latter variant by analyzing a versatile incremental gradient scheme. For this scheme, we consider constant, decreasing, or adaptive step sizes. In the smooth setting, we obtain explicit complexity estimates in terms of the epoch counter. In the nonsmooth setting, we prove that the sequence is attracted by solutions of optimality conditions of the problem.
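The without-replacement regime described above can be illustrated with a minimal sketch: each epoch shuffles the component indices and visits every component exactly once, in contrast to i.i.d. sampling with replacement. The quadratic components f_i(x) = ½(a_i·x − b_i)², the step size, and the data values below are illustrative assumptions, not taken from the paper.

```python
import random

def grad_fi(x, a, b):
    # Gradient of the quadratic component f_i(x) = 0.5 * (a*x - b)**2.
    return a * (a * x - b)

def incremental_gd(data, x0, step, epochs, seed=0):
    """Incremental gradient descent with random reshuffling:
    each epoch samples the components WITHOUT replacement by
    visiting a fresh random permutation of the indices."""
    rng = random.Random(seed)
    x = x0
    for _ in range(epochs):
        order = list(range(len(data)))
        rng.shuffle(order)          # one pass over all data, no repeats
        for i in order:
            a, b = data[i]
            x -= step * grad_fi(x, a, b)
    return x
```

With a constant step size, the iterates settle in an O(step)-neighborhood of the minimizer; for the two components (a, b) = (1, 1) and (2, 6), the sum has its minimum at x = 2.6, and the sketch above approaches it.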


Similar Articles

Fast Incremental Method for Nonconvex Optimization

We analyze a fast incremental aggregated gradient method for optimizing nonconvex problems of the form minx ∑ i fi(x). Specifically, we analyze the Saga algorithm within an Incremental First-order Oracle framework, and show that it converges to a stationary point provably faster than both gradient descent and stochastic gradient descent. We also discuss a Polyak’s special class of nonconvex pro...
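A rough sketch of the incremental aggregated gradient idea behind SAGA: store the most recent gradient seen for each component and use the stored table as a variance-reduction correction. This is an illustrative toy, not the analyzed algorithm; the quadratic components, step size, and iteration count are assumptions.

```python
import random

def saga(grads, n, x0, step, iters, seed=0):
    """SAGA-style iteration for min_x (1/n) * sum_i f_i(x):
    keep the last gradient computed for each component i and
    combine it with the current stochastic gradient."""
    rng = random.Random(seed)
    x = x0
    table = [grads[i](x0) for i in range(n)]   # stored component gradients
    avg = sum(table) / n                       # their running average
    for _ in range(iters):
        j = rng.randrange(n)
        g = grads[j](x)
        x -= step * (g - table[j] + avg)       # variance-reduced update
        avg += (g - table[j]) / n              # keep the average in sync
        table[j] = g
    return x
```

For the toy components f_1(x) = ½(x − 1)² and f_2(x) = ½(2x − 6)², the average objective is minimized at x = 2.6, which the iteration approaches.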


Manifold Sampling for ℓ1 Nonconvex Optimization

We present a new algorithm, called manifold sampling, for the unconstrained minimization of a nonsmooth composite function h ◦ F when h has known structure. In particular, by classifying points in the domain of the nonsmooth function h into manifolds, we adapt search directions within a trust-region framework based on knowledge of manifolds intersecting the current trust region. We motivate thi...


Manifold Sampling for L1 Nonconvex Optimization

We present a new algorithm, called manifold sampling, for the unconstrained minimization of a nonsmooth composite function h ◦ F . By classifying points in the domain of the nonsmooth function h into what we call manifolds, we adapt search directions within a trust-region framework based on knowledge of manifolds intersecting the current trust region. We motivate this idea through a study of l1...
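For h = ‖·‖₁, the "manifolds" in the sense above are determined by the sign pattern of F(x): on a fixed pattern, h is linear, so a local smooth model is available. A minimal illustrative classifier (the function name and tolerance are assumptions, not from the paper):

```python
def l1_manifold(Fx, tol=1e-8):
    """Classify a point F(x) for h = ||.||_1 by its sign pattern.
    On the set of points sharing one pattern, h(y) = sum(s_i * y_i)
    is linear with gradient equal to the pattern itself, which is
    what a manifold-sampling method exploits for search directions."""
    return tuple(0 if abs(v) <= tol else (1 if v > 0 else -1) for v in Fx)
```

Points near a kink of the ℓ1 norm (components at zero) straddle several manifolds, which is exactly where such methods sample multiple pieces.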


Manifold Sampling for Nonconvex Optimization of Piecewise Linear Compositions

We develop a manifold sampling algorithm for the unconstrained minimization of a nonsmooth composite function f ≜ ψ + h ◦ F when ψ is smooth with known derivatives, h is a nonsmooth, piecewise linear function, and F is smooth but expensive to evaluate. The trust-region algorithm classifies points in the domain of h as belonging to different manifolds and uses this knowledge when computi...


Without-Replacement Sampling for Stochastic Gradient Methods

Stochastic gradient methods for machine learning and optimization problems are usually analyzed assuming data points are sampled with replacement. In contrast, sampling without replacement is far less understood, yet in practice it is very common, often easier to implement, and usually performs better. In this paper, we provide competitive convergence guarantees for without-replacement sampling...



Journal

Journal title: Journal of Optimization Theory and Applications

Year: 2021

ISSN: 0022-3239, 1573-2878

DOI: https://doi.org/10.1007/s10957-021-01883-2