Search results for: variable stepsize implementation
Number of results: 612,759
In this paper we study the efficiency of Strong Stability Preserving (SSP) Runge–Kutta methods that can be implemented with a low number of registers using their Shu–Osher representation. SSP methods have been studied in the literature, and stepsize restrictions that ensure numerical monotonicity have been found. However, for some problems the observed restrictions are larger than the theoretical ones. Aiming at obtaining additional properties ...
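The Shu–Osher representation mentioned above writes each stage as a convex combination of forward-Euler steps, which is what enables both the SSP property and a low register count. A minimal sketch of the classic SSPRK(3,3) scheme in this form (the coefficients are standard; the test problem is illustrative, not from the abstract):

```python
import numpy as np

def ssprk3_step(f, u, dt):
    """One step of the three-stage, third-order SSP Runge-Kutta method
    SSPRK(3,3) in its Shu-Osher representation: every stage is a convex
    combination of the current state and a forward-Euler step, so only a
    small number of storage registers is needed."""
    u1 = u + dt * f(u)                                   # forward Euler stage
    u2 = 0.75 * u + 0.25 * (u1 + dt * f(u1))             # convex combination
    return (1.0 / 3.0) * u + (2.0 / 3.0) * (u2 + dt * f(u2))

# usage: integrate u' = -u from u(0) = 1 to t = 1
u, dt = 1.0, 0.01
for _ in range(100):
    u = ssprk3_step(lambda v: -v, u, dt)
# u is now close to exp(-1)
```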
It is widely accepted that the stepsize is of great significance to gradient methods. An efficient gradient method with approximately optimal stepsizes, based mainly on regularization models, is proposed for unconstrained optimization. More specifically, if the objective function is not close to a quadratic on the line segment between the current and latest iterates, a regularization model is carefully exploited to generate the stepsize. Otherwise, approximat...
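The general idea of an "approximately optimal" stepsize can be illustrated with a generic one-dimensional quadratic interpolation model along the negative gradient; this is a standard stand-in for the abstract's regularization-model rule, not the authors' exact method, and all names here are illustrative:

```python
import numpy as np

def quad_interp_stepsize(f, x, g, t=1e-2):
    """Approximately optimal stepsize along -g from the one-dimensional
    quadratic model m(a) = f(x) - a*(g.g) + 0.5*a^2*c, with the curvature c
    estimated from a single trial evaluation. A generic illustration only."""
    gg = g @ g
    c = 2.0 * (f(x - t * g) - f(x) + t * gg) / (t * t)   # estimated curvature
    return gg / c if c > 1e-12 else 1.0                  # minimizer of the model

def gradient_descent(f, grad, x0, iters=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        x = x - quad_interp_stepsize(f, x, g) * g
    return x

# usage: minimize the convex quadratic f(x) = 0.5*(x0^2 + 5*x1^2)
f = lambda x: 0.5 * (x[0] ** 2 + 5.0 * x[1] ** 2)
grad = lambda x: np.array([x[0], 5.0 * x[1]])
x_star = gradient_descent(f, grad, [2.0, 1.0])
```

For a quadratic objective the curvature estimate is exact, so this stepsize reduces to the exact line-search (Cauchy) step.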
This paper extends the recently introduced variable step-size (VSS) approach to a family of adaptive filter algorithms. The method uses prior knowledge of the channel impulse response statistics. Accordingly, the optimal stepsize vector is obtained by minimizing the mean-square deviation (MSD). The presented algorithms are the VSS affine projection algorithm (VSS-APA), the VSS selective partial u...
We consider off-policy temporal-difference (TD) learning methods for policy evaluation in Markov decision processes with finite spaces and discounted reward criteria, and we present a collection of convergence results for several gradient-based TD algorithms with linear function approximation. The algorithms we analyze include: (i) two basic forms of two-time-scale gradient-based TD algorithms,...
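One of the two-time-scale gradient-based TD algorithms of the kind analyzed above is TDC, which pairs the value weights with an auxiliary weight vector updated on a second timescale. A minimal sketch with one-hot (tabular) features, where the TD fixed point equals the true values; stepsizes and the toy chain are illustrative:

```python
import numpy as np

def tdc(episodes, n_states, gamma=0.9, alpha=0.1, beta=0.05):
    """TDC: a two-time-scale gradient-based TD algorithm for policy
    evaluation with linear function approximation. theta carries the value
    estimate; w estimates the expected TD error per feature and supplies the
    gradient-correction term. Features here are one-hot vectors."""
    theta = np.zeros(n_states)
    w = np.zeros(n_states)
    for episode in episodes:
        for s, r, s_next in episode:
            phi = np.eye(n_states)[s]
            phi_next = (np.zeros(n_states) if s_next is None
                        else np.eye(n_states)[s_next])
            delta = r + gamma * phi_next @ theta - phi @ theta
            theta += alpha * (delta * phi - gamma * phi_next * (phi @ w))
            w += beta * (delta - phi @ w) * phi
    return theta

# usage: deterministic chain 0 -> 1 -> 2 -> terminal, reward 1 per step
episode = [(0, 1.0, 1), (1, 1.0, 2), (2, 1.0, None)]
theta = tdc([episode] * 2000, n_states=3)
# true discounted values: V = [2.71, 1.9, 1.0]
```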
The Barzilai–Borwein (BB) gradient method is efficient for solving large-scale unconstrained problems to modest accuracy due to its ingenious stepsize, which generally yields nonmonotone behavior. In this paper, we propose a new stepsize to accelerate the BB method by requiring finite termination for minimizing a two-dimensional strongly convex quadratic function. Based on this stepsize, we develop an optimization method that adaptively takes...
It is well known that DC offset degrades the performance of analog adaptive filters. The effects of DC offset on LMS derivatives such as sign-data LMS, sign-error LMS, and sign-sign LMS have been studied to a considerable extent, but those on the MLMS, VSSLMS, and NLMS algorithms have remained relatively ignored. The present paper reports the effects of DC offset on the LMS algorithm and its four variations: sign LMS...
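The degradation can be demonstrated with a plain LMS filter whose error signal is contaminated by a constant offset, a simplified model of the offset an analog multiplier or integrator injects; the offset model and all values here are illustrative assumptions:

```python
import numpy as np

def lms_with_dc_offset(x, d, taps=4, mu=0.05, dc=0.0):
    """Plain LMS where a constant DC offset `dc` corrupts the error signal.
    With zero-mean input the offset adds gradient noise to every weight
    update, inflating the steady-state misadjustment. Simplified model."""
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]      # regressor, newest sample first
        e = (d[n] - w @ u) + dc              # DC offset enters the error path
        w += mu * e * u
    return w

# usage: compare identification of a 4-tap channel with and without offset
rng = np.random.default_rng(1)
h = np.array([0.8, -0.4, 0.2, 0.1])
x = rng.standard_normal(4000)
d = np.convolve(x, h)[:len(x)]
w_clean = lms_with_dc_offset(x, d, dc=0.0)
w_biased = lms_with_dc_offset(x, d, dc=0.2)
# the weight-error norm grows with the offset magnitude
```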