Search results for: variable stepsize

Number of results: 259826

2013
Tom Goldstein, Ernie Esser, Richard Baraniuk

The Primal-Dual hybrid gradient (PDHG) method is a powerful optimization scheme that breaks complex problems into simple sub-steps. Unfortunately, PDHG methods require the user to choose stepsize parameters, and the speed of convergence is highly sensitive to this choice. We introduce new adaptive PDHG schemes that automatically tune the stepsize parameters for fast convergence without user inp...
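One common way adaptive PDHG schemes tune the two stepsizes is residual balancing: if the primal residual dominates the dual residual, the primal stepsize is enlarged and the dual stepsize shrunk, and vice versa. A minimal sketch of such an update rule (parameter names and the exact factors are illustrative assumptions, not the paper's precise algorithm):

```python
def balance_stepsizes(tau, sigma, primal_res, dual_res,
                      alpha=0.5, delta=1.5):
    """Residual-balancing stepsize update (illustrative sketch).

    If the primal residual dominates, enlarge tau (primal stepsize)
    and shrink sigma (dual stepsize), and vice versa, keeping the
    product tau * sigma constant so the convergence condition on
    tau * sigma is preserved.
    """
    if primal_res > delta * dual_res:
        tau *= 1.0 / (1.0 - alpha)    # speed up primal progress
        sigma *= (1.0 - alpha)
    elif dual_res > delta * primal_res:
        tau *= (1.0 - alpha)
        sigma *= 1.0 / (1.0 - alpha)  # speed up dual progress
    return tau, sigma
```

Because the product of the two stepsizes is held fixed, the usual PDHG stability condition is unaffected by the adaptation.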

Journal: :CoRR 2018
Michal Rolinek Georg Martius

We propose a stepsize adaptation scheme for stochastic gradient descent. It operates directly with the loss function and rescales the gradient in order to make fixed predicted progress on the loss. We demonstrate its capabilities by strongly improving the performance of Adam and Momentum optimizers. The enhanced optimizers with default hyperparameters consistently outperform their constant step...
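The core idea of rescaling the gradient to make fixed predicted progress can be sketched as follows: the linearized decrease of a step -eta * grad is eta * ||grad||^2, so setting that equal to a fixed fraction alpha of the current loss fixes eta. This is a hedged toy sketch under that reading; the function name and the value of alpha are assumptions, not the paper's exact optimizer.

```python
import numpy as np

def proportional_progress_step(params, grad, loss, alpha=0.15, eps=1e-12):
    """Rescale a gradient step so the first-order predicted loss
    decrease equals alpha * loss (illustrative sketch).

    Predicted decrease of the step -eta * grad is eta * ||grad||^2;
    solving eta * ||grad||^2 = alpha * loss gives the stepsize below.
    """
    eta = alpha * loss / (float(np.dot(grad, grad)) + eps)
    return params - eta * grad
```

On a simple quadratic loss f(x) = 0.5 * x^T x this shrinks the loss by roughly the factor alpha per step, independent of the scale of x.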

2006
Farouq M. Al Taweel Mansour M. Aldajani

In this work, we present a new proposal for second-order Adaptive Sigma Delta Modulation (ASDM). The proposed adaptation scheme uses Operational Transconductance Amplifiers (OTAs) as an integrator and as an amplifier to adapt the quantizer stepsize, controlling the voltage gain by feeding the quantizer output back through the adaptation scheme. The stepsize is changed up or down by t...
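The abstract is cut off before describing the up/down rule, but the classic pattern in adaptive delta-type modulators is to grow the stepsize when consecutive quantizer outputs agree in sign (slope overload) and shrink it when they alternate (granular noise). A minimal sketch of such a rule, with assumed growth/decay factors and clamping limits:

```python
def adapt_quantizer_stepsize(step, same_sign, up=1.5, down=0.66,
                             step_min=0.01, step_max=1.0):
    """Classic up/down stepsize adaptation (illustrative sketch).

    Grow the step while consecutive quantizer outputs keep the same
    sign (the modulator is slope-overloaded), shrink it when the
    sign alternates (granular noise), and clamp to a fixed range.
    """
    step = step * up if same_sign else step * down
    return min(max(step, step_min), step_max)
```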

Journal: :IEEE Transactions on Automatic Control 2015

Journal: :Mathematical Problems in Engineering 2010

Journal: :Rairo-operations Research 2022

It is widely accepted that the stepsize is of great significance to gradient methods. An efficient gradient method with approximately optimal stepsizes, based mainly on regularization models, is proposed for unconstrained optimization. More specifically, if the objective function is not close to a quadratic on the line segment between the current and latest iterates, a regularization model is carefully exploited to generate the stepsize. Otherwise, approximat...
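A standard building block for "approximately optimal" stepsizes of this kind is the Barzilai-Borwein formula, which fits a scalar quadratic model to the last pair of iterates and gradients. The sketch below illustrates that idea only; it is not the specific method of the paper, and the fallback value is an assumption.

```python
import numpy as np

def bb_stepsize(x, x_prev, g, g_prev, fallback=1e-3):
    """Barzilai-Borwein (BB1) stepsize from a quadratic model
    (illustrative sketch of an approximately optimal stepsize).

    With s = x - x_prev and y = g - g_prev, the step s^T s / s^T y
    is exact for a quadratic whose Hessian acts as a scalar on s.
    """
    s = x - x_prev            # iterate difference
    y = g - g_prev            # gradient difference
    sy = float(np.dot(s, y))
    if sy <= 0:               # curvature estimate unreliable
        return fallback
    return float(np.dot(s, s)) / sy
```

On a convex quadratic the returned stepsize always lies between the reciprocals of the largest and smallest Hessian eigenvalues.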

Chart: number of search results per year
